Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
13:00 The reason this contradiction exists is that the AI doesn't have consciousness or memory. The contradiction is only surprising if you mistakenly believe that the chatbot has a mental model or some form of memory or state, which it doesn't. All it does is predict what words most often come after other words.
Source: YouTube · AI Harm Incident · 2026-01-17T01:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxBqO0R8QuL8Ml_x5x4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzPs7737AsV0Bg0B054AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgzcSdIJsa9cLxjBXeF4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_UgxovrNLRtUhcrXY1tR4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgxqtUHXr7K2MssRzUx4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_Ugzeb3D813Za7H3OdRV4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgywaBVI4QJ6KmncnSl4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgwQAotZBA51NnHwqft4AaABAg", "responsibility": "distributed", "reasoning": "contractualist",   "policy": "liability",     "emotion": "mixed"},
  {"id": "ytc_Ugz1k7G6D3GCWVRzXNh4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwQF739Oht0zaarH454AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"}
]
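A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the four coding dimensions shown here; the allowed value sets are inferred only from this sample output, so the real codebook may contain additional categories.

```python
import json

# Allowed codebook values, inferred from the sample response above.
# Assumption: the real codebook may define further categories.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "mixed", "approval"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and validate every coded comment."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Example with the first record from the response above.
raw = ('[{"id":"ytc_UgxBqO0R8QuL8Ml_x5x4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
records = parse_coding(raw)
print(records[0]["emotion"])  # indifference
```

Validating at ingestion time catches malformed or off-codebook model output (a common LLM failure mode) before it silently skews downstream tallies.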