Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you ever do something like that or anything write it on your body somewhere with a sharpie. Just one word. Whatever the thing is, because you might not be able to say it just like this guy. And if you're lucky, they have a cure. That's if they have a cure for the thing you ate. Even if you don't actually get hurt you just think so, it's better to have the words "ghost pepper salsa" on your arm and go to the hospital for nothing and look silly than it is to get poisoned by anything. Even eating just too much of something that's not poisonous or drinking too much beer or anything. Call an ambulance and tell them over the phone if you can but if not you've got it written on you. Or tell someone first before you eat it and then when they find you sick on the floor they know exactly what's up. That AI is not gonna speak up out of your phone, dude. The fact that he did this alone is even worse. Put that bromide package that says sodium bromide on it and put it in your pocket or something at least. That way they at least have a clue when you get to the hospital and bam, you're treated within like an hour of getting there hopefully. They literally cannot detect some things even when they're looking for them and especially if they're not looking for something specific... Somebody interested in chemistry should probably know that... This guy is just unbelievable. Hard headed but doesn't even know anything about chemistry and that was his downfall. Thinks he smarter than us all and all doctors and everybody. Even smarter than the chemistry books. And people told him he was wrong! Real people, not robots. And still didn't listen. Why would you put yourself at jeopardy even if it's a science experiment? Ever heard of using a bug or a mouse or guinea pig or something first that way you survive??? Wow. Maybe he did and the dead rat just pissed him off lol. So hard headed.
youtube AI Harm Incident 2025-12-18T20:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwrMzrM1Ry7fQrudf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7NvcGperRvEBEE_14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwkaaJgUOQ-s4d_gpR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyR4tJR38wyxuvZUvd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgydSDdXHwUO_PwRKDx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyNzpK7RNJErkNXGjt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbaOliPlbUm8oz4ZN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxV9D6ntJ1w3Jgd8ot4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4izSyDEfw6WTsj-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugw8go61NJrpBPJKI8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
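The raw response above is a JSON array with one object per coded comment. A minimal sketch of how it might be consumed downstream; variable names are illustrative and not part of the original pipeline, and the assumption that the id `ytc_UgydSDdXHwUO_PwRKDx4AaABAg` belongs to the comment on this page rests only on its codes matching the Coding Result table:

```python
import json
from collections import Counter

# The raw LLM response shown above: a JSON array, one object per coded comment.
raw = """[
  {"id":"ytc_UgwrMzrM1Ry7fQrudf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7NvcGperRvEBEE_14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwkaaJgUOQ-s4d_gpR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyR4tJR38wyxuvZUvd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgydSDdXHwUO_PwRKDx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyNzpK7RNJErkNXGjt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbaOliPlbUm8oz4ZN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxV9D6ntJ1w3Jgd8ot4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4izSyDEfw6WTsj-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugw8go61NJrpBPJKI8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

codes = json.loads(raw)

# Pull one comment's record by id (presumed to be this page's comment,
# since its codes match the Coding Result table).
this_comment = next(c for c in codes if c["id"] == "ytc_UgydSDdXHwUO_PwRKDx4AaABAg")
print(this_comment["emotion"])  # fear

# Tally how responsibility was attributed across the whole batch.
by_responsibility = Counter(c["responsibility"] for c in codes)
print(by_responsibility)
```

In this batch the tally comes out to five "user", three "ai_itself", one "developer", and one "none" attribution, which is a quick sanity check that the response parsed completely.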