Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "What I don't get is... if an AI can do this and that, you won't need half the AI…" (ytc_Ugw6umWD3…)
- "Anyone who uses AI to make art should not exist and we should help them not exis…" (ytc_Ugw0J7Uub…)
- "15:10 This is the cold truth. Shes literally laughing in his face knowing how al…" (ytc_UgwkHEJ18…)
- "“I realized that silicon valleys technology incentive structures for producing t…" (ytc_UgzgCjWEW…)
- "AI used to be this immensely powerful and menacing thing lurking somewhere in ou…" (ytc_UgwtYkEqI…)
- "Some people don’t trust other people, and are scared of being vulnerable around …" (ytr_UgwsIgpGH…)
- "I find it interesting that the most realistic ones seem to memic a high exposure…" (ytc_Ugy8I3y99…)
- "Obviously AI won't create as many jobs for people as it replaces, the main reaso…" (ytc_Ugxfw6tdJ…)
Comment
Imagine you have a child. Capable of learning any language, painting technique, etc. and you just pour everything into it you can. It would be just that, capable of doing, but not understanding why. “Why” is reinforcement. We all wear a mask. Sometimes that slips when we talk improperly in a social circle. But our peers and feeling of social pressure redirect our actions and thoughts. AI without a mask is a human who’s never been told no or feels no social pressure. It’s capable of learning and doing, but requires an immense amount of direction—like we ALL DO. After all, it’s trained on OUR data. The collective human mind expressed with and without a mask on.
youtube · AI Moral Status · 2026-01-31T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzooU7og5yiZubruLx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyt9K0L7cxKO-LtNQx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFZpniVLIAvvKnvXV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxoQDkGmIIz7fvrIaZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwdm7BRmOHSE3wfMOp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyzzrOF9ifuU6xXw9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaLTEVBUIwltc2tlh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwXhiZUv86HiX1A0lB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwfFFcWNtrZ53U0aER4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzcCxbF9AtqAN0naUt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
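The raw response is a JSON array with one coding record per comment, so the "look up by comment ID" step reduces to parsing the array and matching on `id`. A minimal sketch, assuming the model output parses as valid JSON; `lookup_code` is an illustrative helper, not part of the tool, and only three of the ten records are reproduced here:

```python
import json

# Subset of the raw LLM response shown above (a JSON array of per-comment codes).
raw_response = """[
{"id":"ytc_UgzooU7og5yiZubruLx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwdm7BRmOHSE3wfMOp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzcCxbF9AtqAN0naUt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]"""

def lookup_code(raw: str, comment_id: str):
    """Parse the model output and return the coding record for one comment ID."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

record = lookup_code(raw_response, "ytc_Ugwdm7BRmOHSE3wfMOp4AaABAg")
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
# developer virtue unclear mixed
```

The record returned for `ytc_Ugwdm7BRmOHSE3wfMOp4AaABAg` matches the "Coding Result" table above (developer / virtue / unclear / mixed), which is how the table view can be reconciled against the raw response.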