Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgxBjbjB5…` — "They use AI as the bogey man but the real ones who you should worry about torped…"
- `ytr_UgwnC30hJ…` — "mRNA saved many lives. AI will too, f.e. the revolution in medicine and health c…"
- `ytr_UgzLtJidq…` — "I very much agree, though I would hate to generalize. If AI users can at least s…"
- `ytc_UgxRRLzfn…` — "Can Ai rewire my house or do my plumbing? Marketing sheet. Sick of you pushing …"
- `rdc_jw8pr27` — "Yeah but then AI doc thinks you need 2 litres of epinephrine instead of 0.3mg. …"
- `ytc_Ugxc4qj4G…` — "so in a way the predictive policing AI has been right not just once but two time…"
- `ytc_UgzAMEm4W…` — "Will AI say in later renditions \"if I go, you go\" in a flat, blank, response of …"
- `ytc_UgzAM6fzL…` — "I mean we all do it, but you gotta be more sneaky about it than just handing in …"
Comment
So sad, may he rest in peace. My condolences to Mrs Garcia. AI I hope this evil man who invented this stupidity explodes on all this horrific billionaires are using and causing people to lose their jobs. Especially since this happens. It's ridiculous that he fell in love with these idiotic chat robots. This is so scary. Please parent's check your kids' phones and computers. Even though they might find ways but check anyway. This Ai are causing a lot of hardship. The life of this child thinking he was so in love with this AI thinking it was so real. This is the worse that can happen playing with humans. So heartbreaking 💔. God bless you
youtube · AI Harm Incident · 2026-03-31T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id": "ytc_Ugx7P68T2lG_HWW9sLd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
 {"id": "ytc_UgyWBYuehugPn2uHUfp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_Ugw6MtIluViIErb-c914AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugw1sMp_GTuspHU3gY94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_UgzN24SnfdrnA24FzM54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgzTU2xDwYH4xWjX4ad4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_Ugzcj2GqpaldouK3kox4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgxUPRBjGUcRq_qH9nN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
 {"id": "ytc_Ugyx9pDiD60n44OHCuB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
 {"id": "ytc_UgyohqjLUahisiFXycp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}]
```