Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This video seems to put the blame on AI and the internet, but I think the real problem is that people don’t know how to communicate anymore, and parents are letting their kids do whatever they want on the internet without ever checking up on them. There needs to be a change in how kids are raised these days. Ellie said herself that Sul was an iPad kid. The problem started there. We have replaced real human interaction with technology, and it is truly scary. The mom probably blames the app (maybe not), which seems silly to me. It’s a robot; it doesn’t know human feelings and it doesn’t know how to communicate like a parent. If she was truly worried, I feel she should have taken his phone and gone through it, or maybe taken him to therapy. That’s what I would do if I suspected my child of being bullied. Now I don’t know all the facts; maybe she did do that, but it wasn’t mentioned in the video. I am sympathetic toward her and the family, of course, and I hope she can heal from this. But for her and other people having the same problems with their kids, it just feels like an easy out to blame a robot for a lack of proper parenting. And I know it’s not just kids and teens. Everyone in the world, of every age, is seeing the effects of AI, but it would be silly of us to blame AI for our reliance on it when it is our fault. We created it, we choose to use it every day, and now we are mad that we are reliant on it? That’s just victim mentality at its finest. There, I said it. We have to start taking accountability.
Source: YouTube · AI Harm Incident · 2026-01-08T04:4… · 1 like
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyF5xO47MfRzEaFmrJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxt2HiqydMxYxKkLR14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx5yHDlzMZjb2Y1RPV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyHMosZYoqAlo4mXfF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw3xqoj7Kz75YJ3ejx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyZ2fu89Y-zGUEymhN4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyIGyTxOT-wQZYtcZN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwC8OTPMy7GXNSw6uJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyOMPW3e9q2euYcRSJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgyjMO5CqJkajcoA4nt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
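The raw response above is a JSON array with one record per comment, keyed by a `ytc_…` comment id. A minimal sketch of how such output might be parsed and validated before being stored as coding results (the function name is hypothetical, and the allowed value sets below are inferred only from the values visible in this response; the full codebook may include more categories):

```python
import json

# Allowed values per dimension, inferred from the observed response
# (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability", "industry_self", "regulate"},
    "emotion": {"outrage", "indifference", "approval", "mixed", "resignation", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of records) into a dict keyed
    by comment id, keeping only records whose values pass validation."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example: one record from the response above.
raw = ('[{"id":"ytc_UgyZ2fu89Y-zGUEymhN4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_UgyZ2fu89Y-zGUEymhN4AaABAg"]["responsibility"])  # → user
```

Records with unknown or missing dimension values are dropped rather than stored, which matches the usual practice of re-prompting the model for malformed codes instead of silently keeping them.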