Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I know this is supposed to be funny but I think this is insensitive. As a medica…" (ytc_UgyedlgCE…)
- "Yayy I ordered the Reject A.I. poster! I hope it gets here alright. Glad to supp…" (ytc_UgyChH8bJ…)
- "I feel like a lot of people think art is purely about the result and how the art…" (ytc_UgxgECSOK…)
- "I cant believe that this is even a question. We all know that ai is a mirror, no…" (ytc_UgwKrvLxg…)
- "That's insane for the first time in his life Elon Musk doesn't have an answer to…" (ytc_UgzlEPGB8…)
- "You've done it again! Very thought provoking. I somehow think it it would be dif…" (ytc_Ugwfdj0f1…)
- "I'll be totally honest (as a mediocre artist myself) I do not see where the AI a…" (ytc_Ugx6Whjaa…)
- "In a world where the AI controls everything, the society we know today will spli…" (ytc_Ugx3L7oHz…)
Comment

> I'm going to be very pissed if robots start taking jobs away. And if they become self aware, and learn to be human or beyond the intelligence of humans, wouldn't that mean that they would want money for the work they do? That being said, I can imagine a robot living in it's own house, waking up, driving to work, coming home, and doing it all again. Would robots be able to pick up bad habits? Like drinking? Would they become alcoholics working paycheck to paycheck?

youtube · AI Moral Status · 2020-06-01T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw821E_hH-LLu5ap-V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzxDWcZ6UVuFhnPh0N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx9VBdClRge0S9bR294AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxXCaOoi3M93DlZrx94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwIIK8o29hv7iwSyNd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyU1HPQuNAELC5vNDl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxmJfT8gY6sfpZRPBF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzKdteLwelTkUiEEyl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgwlTbmceaQ7XDOXqzV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwyPeIZnm8_dX8md9l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
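A lookup-by-comment-ID view like the one above can be reproduced directly from the raw model output. A minimal sketch, assuming the raw LLM response is a JSON array of per-comment codings with the four dimensions shown (`responsibility`, `reasoning`, `policy`, `emotion`); the comment IDs used here are hypothetical stand-ins, since the real IDs are truncated above:

```python
import json

# Raw model output: a JSON array of per-comment codings, in the same shape
# as the response shown above. The IDs are illustrative placeholders.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_example1"]["policy"])   # regulate
print(codings["ytc_example2"]["emotion"])  # outrage
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when the same batch response is inspected for many comments.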