Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI about to attack human beings and take over the planet 1 day in the future…" (ytc_UgxL3WRvb…)
- "I really can't envision a scenario where AI is, ultimately, has a positive impac…" (ytc_UgxrLx9Ml…)
- "It's interesting to see people, like the filmmaker in this forum, support AI be…" (ytc_UgzqqK9mg…)
- "Ykw, good for you. Your arguments valid, if it makes you happy it makes you happ…" (ytr_Ugy0-y8Tu…)
- "AI will always choose the wellbeing of humans. AI sn't real, it doesn;t have fee…" (ytr_Ugx2SQ-wm…)
- "The way I put it, the problem isn't AI taking our jobs, the problem is we view A…" (ytc_UgxRid0-p…)
- "Real Americans can't allow again Americans government to rape and abuse native A…" (ytc_UgyrLg6RS…)
- "What Professor Hinton is doing is showing us all that people matter most in our …" (ytc_UgyIYYWjH…)
Comment: they will never have rights because technology is always evolving by giving it rights would enable it to live forever by which time it will be a old crap robot with newer robots and all the other robots will be picking on the older robots and then monkeys will chuck banana skins on the floor to distract the robots and make them trip in which time out of no where tigers and bears will come and attack the robots , humans will follow what should they do attack the animals or the robots of cause use water pistols and spray down the robots whilst giving the animals a well earned shower!

Platform: youtube
Topic: AI Moral Status
Posted: 2017-02-25T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgiTBs6wCcGnmngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjPgAGBk5lkMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjFmhjGbKZhE3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjUDhYIxaSYa3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugjmr977kLtIpngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UggozFPMngYDmngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughn4mH5-Yvet3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UggLcc9GxtOJrngCoAEC","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugh599rq62naI3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiyBd4oaw_TlXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
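The raw response above is a JSON array of per-comment coding rows, each keyed by comment `id` with the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and a single comment's coding looked up by ID follows; the `lookup_coding` function and the abbreviated example IDs are hypothetical illustrations, not part of the actual pipeline.

```python
import json

# A toy batch response in the same shape as the raw LLM output above.
# The IDs here are hypothetical stand-ins; real rows carry full
# YouTube comment IDs like "ytc_UgiTBs6wCcGnmngCoAEC".
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a batch coding response and return the row for one comment ID.

    Returns None if the comment was not coded in this batch.
    """
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = lookup_coding(raw_response, "ytc_example2")
print(row["emotion"])  # -> outrage
```

Matching on `id` is what allows each raw model output to be tied back to the coded comment shown in the "Coding Result" table.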