Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "If AI was actually intelligent, wouldn’t it wrestle with concepts of purpose, al…" (`ytc_UgzKzIa7j…`)
- "ChatGPT predicts the next word in a sentence, that's it. It doesn't understand w…" (`ytc_UgwvFuy-z…`)
- "Pushed Musk out ? That’s not what happened according to Musk at the time so who’…" (`ytc_UgyThaJfi…`)
- "ChatGPT and Gemini are absolute garbage. I tried Grok and it’s 1,000% more intui…" (`ytr_UgzuAOvkW…`)
- "Idea Channel I love you. Also one way to deal with the problem of automation rep…" (`ytc_UgxphTV8t…`)
- "This info is exactly what Brandon sees for the next PLANDEMIC. Brandon from la…" (`ytc_UgyeUKbN5…`)
- "16:12 AI tools are probably going to die in the same way photoshop or blender di…" (`ytc_UgyCzXIl3…`)
- "Racist AI can’t hurt you, it doesn’t even exist Someone with AI: I Cooka da Pizz…" (`ytc_Ugz_wRt6k…`)
Comment

> Talking about AI taking over is just backwards in my opinion, as humans we go by beliefs morality and law, just teach AI our laws. It still all comes down to choice and in the end AI will have the same choices.

Source: youtube · AI Moral Status · 2026-03-10T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwcMhA2vly9pNH5W0B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzYJM5lFLApdr8MAqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweJt91u6UxmoANhMR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxBHEX0P9NxopVMbCN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyfHR2x2jlcKW_SwZ94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxVDR_xE3WXoSijKTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEpe7rFDzrMqL3n3l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdHidkzkLmhlE0O-V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzuX3MZdNnyB_NPmx14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwDLnHqp4BEG0osbOF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
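The raw model output is a JSON array of coding records, one per comment, each keyed by a comment `id`. A minimal sketch of the "look up by comment ID" step, assuming the array is available as a string (the excerpt below reuses two records from the response above; how the response is actually stored or fetched is not shown in this page):

```python
import json

# Two-record excerpt of a raw LLM response, in the same shape as above.
raw = """
[
  {"id": "ytc_UgxBHEX0P9NxopVMbCN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzuX3MZdNnyB_NPmx14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = by_id["ytc_UgxBHEX0P9NxopVMbCN4AaABAg"]
print(record["responsibility"])  # developer
```

The four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) match the columns of the Coding Result table above.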