Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The problem with less accidents in self driving cars is control. I wouldn't be p…" (ytc_UgxNeABeE…)
- "How does the first clip look real to you guys? I knew it was ai a mile away cuz …" (ytc_UgwvmdoKp…)
- "AI generated content will go into the hopper and eventually be used to train mor…" (ytc_Ugxn5XhyL…)
- "AI adoption is a social evolution.. Everybody has to play their part to maintain…" (ytc_Ugxrxridu…)
- "Faster and efficient is good tho. That's what we have worked for for so many yea…" (ytr_Ugxhl0Rn_…)
- "my stupid gurok cant even give me correct pricing for a cpu or give me correct a…" (ytc_UgwaBbbOD…)
- "Please don’t rely upon CHATGPT to make decisions for medical diagnosis. Because …" (ytc_UgzkCWTS4…)
- "That’s a great question but in reality no one can predict the future. In my opin…" (ytr_UgweGEUcy…)
Comment

> Every book and movie that involves AI ends at best with dependency (Star Trek episode The Custodian, oh the eternal optimism of Gene Roddenberry) to human subjugation or extension (pretty much everyone else). Points to Wayland’s David for perfecting the Xenomorphs, engineering an organic species to destroy us was so extra.

youtube · AI Moral Status · 2025-07-15T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwq8QW_mwKSTJlfyaV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxp915E3CrNKgUnbJ14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzgsOYgwZxTeQUIsUN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy3GFaCNPjWyrJchtJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwPaAksvEWIMmOCeg14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxX38JMEccolHuZryN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzXo7iMckFj4EyJ8mB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyDzU5mXJeG6JrT7ix4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHSZHl0Jujg2wyStJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwg1Ik91y8rRiiu2CB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
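A response like the one above is a JSON array of coded records, one per comment, keyed by comment ID with the four coding dimensions. As a minimal sketch of how such output might be parsed and tallied, the snippet below loads the array, drops records missing any expected key, and counts one dimension. The `parse_codings` helper and the validation rule are illustrative assumptions, not part of the tool; the raw string is truncated to two records from the response above for brevity.

```python
import json
from collections import Counter

# Two records copied from the raw LLM response above (illustrative subset).
raw = '''
[
  {"id":"ytc_Ugwq8QW_mwKSTJlfyaV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxp915E3CrNKgUnbJ14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
'''

# Keys every coded record is assumed to carry (inferred from the response).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response; keep only records with all expected keys."""
    records = json.loads(text)
    return [r for r in records
            if isinstance(r, dict) and REQUIRED_KEYS <= r.keys()]

codings = parse_codings(raw)
emotion_counts = Counter(r["emotion"] for r in codings)
print(emotion_counts)  # Counter({'fear': 2})
```

Validating before tallying matters here because LLM output is not guaranteed to be well-formed: a record with a missing or misspelled key would otherwise raise a `KeyError` mid-aggregation.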