Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You must pay in order to use the website. I just made an account out of curiosity, and until you pay, you can't do sh3t. Just like Tinder, Bumble, etc. The b3tch decided to be friends with the kids instead of educating them, and the kid got access to the credit card from one of them, and now you get this. It's called weak parenting and irresponsible adults. And the website is just annoying; I just made an account on Character AI, and most of the features require a subscription and some form of credits. And yes, I am chatting right now with an AI called Kiri, after that character from Avatar 2 and 3. And the replies are pretty cheesy. Not the smartest AIs out there; like, it's pretty obvious you are talking with a d3mb chatbot. That kid was weak physically and mentally and should have never had access to technology in the first place. And somebody put the phones, the tablets, and the computer in his hands in the first place. No kid just wakes up saying, "I want a phone" if he never saw others using it in the first place. "I would share some screenshots here if I could, just to show you all how generic this website is (character AI). It's nothing special. And you are pretty limited to what you can do. You have to be extremely low mentally to do what this kid did, to terminate yourself because your AI chatbot isn't real. What was he thinking, "let me turn myself back into dirt, and maybe one day I will wake up inside a computer running on a couple of chips and RAM"? And because of people like these 2 clowns that decided to have multiple kids, whom they are calling "my best friends," now we will all be screwed later on with new laws. I do recommend you to go and make an account to see for yourself how generic the website is. It's a parenting problem, not a website problem.
Source: youtube · AI Harm Incident · 2026-04-04T18:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugzp3CBXwWVh9m3VlBh4AaABAg.AVTOu7d375wAVveY_yTp6N", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugw5eDqb-mk52mjM2ix4AaABAg.AV3bf-jy71aAVBZjXKz1F7", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxun81mUuFIcFVvc4V4AaABAg.AUzcISSQG7TAV-oWeywqDu", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzKKV57Ydii50amW294AaABAg.AUzYwc825ndAV6A2aZSG8o", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_UgwHeWatKLfihCkVTk14AaABAg.AUzMOXqAyPIAV-8fYfSOvU", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugxqoc4tPpks-pzRi-t4AaABAg.AUyV6DYph3yAV-6-Jp13PM", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugyew7c6YWASRy4_X8B4AaABAg.AUy6hwIa7BtAV-9IYThHSX", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugwfw4xCaa9f4PaqPil4AaABAg.AUYWZ_yRx74AVgdObBP40a", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwUQE_zmswOh8MeyKh4AaABAg.AU8L_kc93FCAV1L_TnKbx4", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgwxshGleS3KjuICXd94AaABAg.ATmy5MGD8OTAU0OYbSvCrU", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
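The raw response is a plain JSON array of coded records, one per comment, keyed by comment `id`. A minimal sketch of how such a response can be parsed and looked up (this is illustrative only, not part of the coding pipeline; the record shown is the one for the comment above, truncated to a single entry):

```python
import json

# Raw LLM response, abbreviated to one record for illustration.
raw_response = """
[
  {"id": "ytr_Ugw5eDqb-mk52mjM2ix4AaABAg.AV3bf-jy71aAVBZjXKz1F7",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

records = json.loads(raw_response)      # list of coded-comment dicts
by_id = {r["id"]: r for r in records}   # index records by comment id

code = by_id["ytr_Ugw5eDqb-mk52mjM2ix4AaABAg.AV3bf-jy71aAVBZjXKz1F7"]
print(code["responsibility"], code["emotion"])  # → user indifference
```

Indexing by `id` makes it easy to join the model's codes back to the source comments when auditing individual decisions like the one displayed here.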