Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We're not just "training" the AI; our interactions train how we interact with others. If we train ourselves to be rude to electronic servitors, we can't just switch that off when we deal with flesh-and-blood humans. When you've trained yourself to bark commands or be rude to these computerized servants, it will come out when we deal with our fellow human beings. Kindness costs nothing.
youtube AI Moral Status 2026-03-11T13:1…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugz6qf3Z1h4JRlRXMFZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZeqUmjqeZj7IlUHZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxJglQ8FfSTBW3LYkB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-Hjea3Ov4UCcZ6DR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxAfQGU1SlOvCaiPah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwcWRXKJGxwgASULX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-hhvgsRHkQ1pLmK54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAVaEGOp0DYmzJWo14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy5LAY73S9lP_7B9Zh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxaO5WMBoIXKBj4-cF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
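When inspecting raw model output like the array above, it is easy to miss an out-of-vocabulary label. A minimal validation sketch, assuming the value sets observed in the response are the full codebook (the real scheme may define additional values), could look like this:

```python
import json
from collections import Counter

# Value sets per dimension, inferred from the raw responses shown above.
# Assumption: the actual codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "company"},
    "reasoning": {"unclear", "deontological", "virtue", "consequentialist"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside ALLOWED."""
    return [
        (rec.get("id"), dim, rec.get(dim))
        for rec in records
        for dim, allowed in ALLOWED.items()
        if rec.get(dim) not in allowed
    ]

# Two records copied verbatim from the raw response above.
raw = """[
  {"id":"ytc_Ugy5LAY73S9lP_7B9Zh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxaO5WMBoIXKBj4-cF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]"""

records = json.loads(raw)
print(validate(records))  # → [] (every value is in-vocabulary)
print(Counter(r["emotion"] for r in records))
```

A batch that produces a non-empty list from `validate` would indicate the LLM drifted outside the coding scheme and the affected comments should be re-coded.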