Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What AI tech are we talking about here? If it’s LLM based, this guy does not seem to know what he’s talking about. If it’s something else, please say what it is. LLM’s do not have thoughts
youtube AI Moral Status 2025-11-03T16:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugzn4o4kur6Mq40hp8J4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyZYMYBBuYPpyMbEsx4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_Ugyydiy5p7thZnzDyLN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwJkGKA1HuK8JYFSAx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw8bLDwfL6RPvTyPPV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxjXAlvQdUVYZMGGJp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwaw-i9NShXvt0dDwJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzbp_kNVUdGzWnwPA94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwa4AHICQA_czmXW-N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxjc_l5VyqsVvq9kXh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
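As a minimal sketch (assuming the raw response is always a JSON array of per-comment codings with the shape shown above), one way to verify that a given comment's coding matches the raw LLM output is to parse the array and index it by comment id:

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Truncated here to one entry for illustration; the real response
# contains one object per coded comment, as shown above.
raw = (
    '[{"id":"ytc_Ugw8bLDwfL6RPvTyPPV4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"outrage"}]'
)

codings = json.loads(raw)

# Index codings by comment id for quick lookup.
by_id = {entry["id"]: entry for entry in codings}

# Cross-check the coding table for this comment against the raw output.
coding = by_id["ytc_Ugw8bLDwfL6RPvTyPPV4AaABAg"]
print(coding["responsibility"])  # developer
print(coding["emotion"])         # outrage
```

This makes it easy to spot mismatches between the summarized coding table and the model's actual output for any comment id.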