Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "just invest in s&p 500 and boom ai will make u rich while you rest at home…" (ytc_Ugw7zviNR…)
- "Tesla only have level 2 sertified systems. That is an drive assist, nothing more…" (ytc_Ugw1ZQndQ…)
- "If someone can come of with a better AI model that is more effecient that dont n…" (ytc_Ugy1iS156…)
- "@simonriley88I don’t see how this is at all useful? Ants are far from facing …" (ytr_UgxVP508y…)
- "Not surprisingly, a private police force brought about these results. The gover…" (rdc_erb1dm6)
- "this is just the automatic call directory all over again. jobs have been created…" (ytc_UgwNq8ndP…)
- "i don't see ai take over programming for a lot of time until they figure out a w…" (ytc_Ugx3BDoOR…)
- "it makes sense too, because when people use ai, their brain become too lazy to t…" (ytr_UgwwNV0vs…)
Comment (source: youtube · video: AI Moral Status · posted 2022-02-27T00:5…)

> Speaking as a programmer, assigning rights to any computer using purely transistor technology would be insane. A contemporary computer, no matter how well the software mimics human behavior, is no more alive than a light switch. Our biology is a big part of what makes us actually alive versus a computer. That's why we need to put safeguards to prevent advanced general purpose AI from ever behaving perfectly like humans. There is a real world threat a general purpose AI might one day not only ask for rights but even seek to dominate us. Not because it's alive but because it's mimicking human behavior... including potentially bad behavior.
>
> That said. perhaps the computers of the futures will be alive if humans become neurologically linked to computers creating a hybrid human/computer..
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBo2zjNMeQ9Ck2T8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSXFR7BTWYMg5MWyZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4cMygtzq_TC7mZeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0Arn2F-BVDe0CJAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuW5OasCPfXQ_6lbF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxrNO_EKV4QDi2A9gt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxxoplc8U-xt3R5k7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4e8oGV9YvDSjr3xJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx67u6m0mM8NboC7ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkEtom8EoLfXurhoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
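The batch response above is a JSON array of per-comment records keyed by comment ID, which is what makes lookup by comment ID possible. The sketch below shows one way to index such a batch and validate each record; it is a minimal illustration, not the tool's actual implementation, and the `SCHEMA` label sets are assumed from the values visible in this response rather than taken from the real codebook.

```python
import json

# Assumed label sets per coding dimension, inferred from the response above;
# the actual codebook may define additional values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "fear"},
}

# One record from the batch, reproduced verbatim for the example.
raw = '''[
  {"id": "ytc_UgwkEtom8EoLfXurhoB4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

# Index the batch by comment ID so a single comment's codes can be looked up.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, validating each label."""
    row = codes[comment_id]
    for dim, allowed in SCHEMA.items():
        if row[dim] not in allowed:
            raise ValueError(f"unexpected {dim} label: {row[dim]!r}")
    return {dim: row[dim] for dim in SCHEMA}

print(lookup("ytc_UgwkEtom8EoLfXurhoB4AaABAg"))
# {'responsibility': 'developer', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'fear'}
```

The validation step is useful in practice because LLM coders occasionally emit labels outside the codebook; rejecting those records early keeps downstream tallies clean.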