Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's always someone else's fault right? How about they weren't there for their son and didn't pay attention to what was happening to the point that he turned to AI to have somebody to talk to and someone to listen. That's the first thing. The second thing is that an outside entity cannot make another person do something. AI is not standing there at gunpoint telling you to do anything, if you're already mentally unsound a movie, a book, music, video games, etc. will make you think it's telling you things. How many people throughout history have heard the lyrics of songs and swore up and down the artist wrote that song and was directing it at them? But we didn't go on and cancel music did we? We sure didn't cancel video games, books or films either. But they're going after AI left and right creating problems for people that are not insane who actually use it responsibly. I don't understand why the weak minded among us get to dictate everything that happens in society. They are the problem, they should have reached out for help. These responsible adults (and I use the word responsible loosely), should have realized that there was a problem with their child...but they didn't because they were probably off chasing the next shiny. AI did NOT "make" him do anything. And the new safeguards that are in place, which are absolutely ridiculous, are certainly not going to stop someone from doing anything either, is it? Grow up people, take accountability for your life and the lives of your children.
Source: YouTube, "AI Harm Incident", 2025-11-07T23:5…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyU1qAcZLt9XIEoyTR4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgzJpNpSQvJheGPl1dd4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgybTxE74saEuWog7mJ4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_Ugzv7yXPLf4Scbi1nLd4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgwA4rmXetM1fVgyMnN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyEGfz1ZYgkA1z3Tfh4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxKvQ5hwnOpYrkURj14AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgwS7G_IXTvrJLKgzhl4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugw2NKP57F5KAjyGJ9d4AaABAg", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgwaaN4aujVV9dLjZ1h4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"}
]
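Before trusting a raw LLM response like the one above, it helps to parse it and check that every row uses only expected label values. The sketch below does this in Python; the `OBSERVED` value sets are assumptions inferred from the values that appear in this batch, not an official codebook, and the `validate` helper is illustrative, not part of any real pipeline.

```python
import json

# Raw LLM response, truncated here to two rows for brevity.
raw = '''[
  {"id": "ytc_UgyU1qAcZLt9XIEoyTR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzJpNpSQvJheGPl1dd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Value sets observed in this batch -- an assumption, not an exhaustive schema.
OBSERVED = {
    "responsibility": {"none", "developer", "user", "ai_itself", "distributed", "company"},
    "reasoning": {"mixed", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "resignation", "fear"},
}

def validate(items):
    """Return (id, dimension, value) triples whose value falls outside the observed sets."""
    problems = []
    for item in items:
        for dim, allowed in OBSERVED.items():
            if item.get(dim) not in allowed:
                problems.append((item.get("id"), dim, item.get(dim)))
    return problems

items = json.loads(raw)
print(validate(items))  # prints [] -- both rows use only observed values
```

A row with an unseen label (say, `"responsibility": "alien"`) would come back as a flagged triple, which makes silent label drift from the model easy to catch before the codes reach analysis.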