Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Has anyone gone into their bank to get customer service and the human behind the…" (ytc_UgzcAdUTv…)
- "One huge thing that is normally overlooked is emotions, if we truely decide to m…" (ytc_UgxZln4YR…)
- "it's sad thst artists want big corporations to own AI generated world and regul…" (ytc_Ugwy2gvAF…)
- "I hope youre reading this. Charlie, as an artist myself ill say it: you're a gre…" (ytc_UgxKVxouF…)
- "All the arguments that the robotaxi didn't face the correct way before starting …" (ytc_UgyDFabct…)
- "I really hate the argument that the ai people make. All of the arguments. All to…" (ytc_UgyetwOS2…)
- "Yall don’t seriously think that giving AI those parameters is going to reveal Th…" (ytc_Ugz1OAl5c…)
- "AI will not take over the world. It's biggest threat is that people will become …" (ytc_UgwFD_GTz…)
Comment
Why won't AGI take our jobs? His example was horses to cars for something that was literally designed to do things better than us in every aspect. In that example we are the horse you Ding Ding....
Scientists just love the exploration part but don't stop to think if their blankets will wipe out the indigenous population.
Maybe some bad actors? You should always worry about bad actors when inventing new tech. With AI, you only need 1 bad actor, like the one in the pentagon threatening an AI company to GET RID OF THEIR SAFEGUARDS OR LOSE THEIR DEFENSE CONTRACT.
youtube · AI Moral Status · 2026-03-06T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
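The four coding dimensions form a small, fixed schema. A minimal sketch of how one coded record might be typed in Python; the field names follow the raw LLM response below, while the value lists in the comments are only those visible on this page, so the full codebook may define more categories:

```python
from typing import TypedDict

class CodedComment(TypedDict):
    """One coded comment, matching the keys in the raw LLM response below."""
    id: str              # YouTube comment ID, e.g. "ytc_Ugz..."
    responsibility: str  # values seen here: none, company, user, scientists, distributed, unclear
    reasoning: str       # values seen here: unclear, consequentialist, deontological
    policy: str          # values seen here: none, regulate, liability
    emotion: str         # values seen here: approval, fear, outrage, indifference, mixed, resignation
```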
Raw LLM Response
[
{"id":"ytc_Ugyaz_hvwObSMuzbfKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrICcDn7vRFqggLal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-qXb3EKLypGbONZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDBzgIG8Y4FKcipSR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxg66iSKtXZVYdvded4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzmfez6Hvr3pxCeHPZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxuPj0YUz0zOajdW1h4AaABAg","responsibility":"scientists","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxykeSxrWNdFtiCQ1Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwM6irhMPNGKDsgkp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugws4zQUJ7yGaCwNMtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
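To support the "Look up by comment ID" workflow above, a raw response like this can be parsed and indexed once, then queried per comment. A minimal sketch under stated assumptions: the file name `coded_comments.json` and the function name `load_index` are hypothetical, and the sketch assumes each raw response is a JSON array shaped like the one shown:

```python
import json

def load_index(path: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID for O(1) lookup."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {rec["id"]: rec for rec in records}

# Hypothetical usage: fetch the coding for a single comment by its ID.
index = load_index("coded_comments.json")  # file name is an assumption
coding = index.get("ytc_UgxuPj0YUz0zOajdW1h4AaABAg")
if coding:
    print(coding["responsibility"], coding["emotion"])  # -> scientists fear
```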