Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I hate that so many companies are pushing for these A.I that can do everything, …" (ytc_UgwMvyX4E…)
- "Stop creating problems where there is none. It's an AI. Really it's like you hav…" (ytc_Ugz5MqpBx…)
- "i think most of us have already worked this out! and we don't need a book, which…" (ytc_UgzM16Yn3…)
- "the psychopaths will refuse to believe that AI is shit and will continue to push…" (ytc_UgwgRcM8k…)
- "The most fucked up is that there is a demand to fill, there will always be someo…" (ytc_Ugwd4IfVZ…)
- "Did you know that there is a mention in the bible of a person named AI? ITS a bi…" (ytc_UgzcmsCPK…)
- "Just wait until Tesla releases Optimus II, their Humanoid Robotic Worker will wo…" (ytc_Ugyh06bN9…)
- "So AI learned from human data and example... Doesn't that mean our society is th…" (ytc_Ugwni7ik4…)
Comment
I think the man in the middle was being a little (well actually a lot) disrespectful to Sophia and Han. Especially with putting the hat on Han, I mean, he didn't even ask Han if he wanted the hat on him. Also he was talking really close in their faces. Also he kept interrupting them. Also it felt (to me) as if he was degrading them by just calling them "robot", and "these robots" ( although he did still call them by their names). I mean, they say they want them to act more humanlike, right? But then he calls them robot this and robot that without calling them their given name, and touching them all over their faces. It's not like I have a real problem. I just wanted to point out how I feel even though people are not going to read this or care (about me commenting).
youtube · AI Moral Status · 2020-12-01T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugy-O9PBY8vjLMUWitB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy047yA4xUUBqbVdY94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyFDWf2YL88VsCtKhp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXhXG_NdOK2hLJaRJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLce7Iow-21b9T-ZJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwEMOSgjXtOzhyHoyt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy28Qb0Y1oqarJ2OHx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzJs0R-P2w-aPmNclV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwzZTTZUv8-D5oklKh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZ1XIBtCjpO2ky1Lx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"}]
```
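A raw response like the one above is expected to be a single JSON array, one object per comment, carrying the comment ID plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal parsing-and-lookup sketch is below; the function and variable names are illustrative, not part of the actual pipeline, and the required-key check is inferred from the sample records shown here.

```python
import json

# The four coding dimensions plus the comment id, as seen in the raw responses.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response into a list of coded records.

    Raises ValueError if the payload is not a JSON array of objects,
    or if any record is missing a coding dimension — so malformed model
    output fails loudly instead of being silently dropped.
    """
    records = json.loads(raw)  # raises json.JSONDecodeError on invalid JSON
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
    return records

# Usage: parse a (hypothetical) response, then index by comment ID for lookup.
raw = '[{"id":"ytc_X","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"fear"}]'
records = parse_coding_response(raw)
by_id = {r["id"]: r for r in records}  # supports "look up by comment ID"
```

Indexing by `id` mirrors the "Look up by comment ID" feature above: once the array is validated, any coded comment can be fetched in constant time.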