Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “That one robot gave the camera the most bombasticyest side eyes in all of histor…” (ytc_Ugyn55qH7…)
- “13:25 I’m at that exact same spot as he is… I don’t even know what to tell my b…” (ytc_Ugw9bBKr7…)
- “As for Random Science recheck if you found in guilty . RS probably the best algo…” (ytc_UgzwEoBYN…)
- “The title should read “AI doomsday scenarios are overrated” as Dr Tyson is very …” (ytc_Ugzbu9D1h…)
- “The problem is,,, ONLY A.I CAN COUNTER A I .. Iran, China, U.S., Russia.......…” (ytc_UgyD5ZeCU…)
- “Mr. Hinton as many others uses undefined terms like smarter , more powerful and …” (ytc_UgwOADTur…)
- “i’m so thankful no one can “OWN” an ai artpiece cause it just gets rid of so muc…” (ytc_Ugx204Ajv…)
- “AI could take over the world in less than a second, and the people making it are…” (ytc_Ugx80adyy…)
Comment
About the final note... Actually, AI is not - and will not be - helping me to do my job better. It helps unqualified people to do my qualified job for less money, so all I get is to be paid less.
Then, it implies that "doing my job better" serves someone else, namely the employer, who gets more money by not paying me for doing a better job.
Either way, replacing the human intelligence and human decision it's literally replacing humans from society, thus the very concept of "society" has no meaning under that premise.
youtube
AI Moral Status
2026-02-28T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyLhoiOhgroV7u43s94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0EX4_OR31PUEk3Pt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyfFXX0LFQekUjYkOd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyl2QqbcZ2zdvoi-wZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw8DjDYdy1OdX5nWuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwK_A8wx7TFptnXRM94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZgDmRlFlaC0C3MF14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzDCYoIL3QMOlj6TlJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxbPzinpOXrdHmcYKZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqbTcKIcrWwqATJeR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
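The raw response above is a JSON array of per-comment codes. A minimal sketch of how such output could be parsed and sanity-checked before use is shown below; the dimension names and label values are taken from the response itself, but the complete sets of allowed labels per dimension are an assumption inferred from this one sample, not a documented code book.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# The full code book may contain more values; this is an assumption.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed rows.

    A row is kept when its id looks like a YouTube comment id
    (prefix ``ytc_``) and every dimension carries a known label.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id", "").startswith("ytc_"):
            continue  # skip rows with a malformed or missing comment id
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping rather than repairing malformed rows keeps the coded dataset conservative: a row the model garbled can be re-queued for recoding instead of silently entering the analysis.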