Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
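If you prefer to do the same lookup against an export of the coded data rather than this page, the sketch below shows one way. It assumes a hypothetical coded_comments.json file whose records carry the same fields as the raw responses shown further down (id, responsibility, reasoning, policy, emotion); the file name and schema are illustrative, not part of the tool.

```python
import json

# Minimal sketch: look up one coded comment by its ID in a hypothetical
# JSON export ("coded_comments.json") whose records mirror the raw LLM
# response fields shown below. File name and schema are assumptions.
def lookup_comment(comment_id: str, path: str = "coded_comments.json") -> dict | None:
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return next((r for r in records if r.get("id") == comment_id), None)

if __name__ == "__main__":
    hit = lookup_comment("ytc_UgyNGbv01MqlUpFWwcB4AaABAg")
    if hit:
        print(hit["responsibility"], hit["reasoning"], hit["policy"], hit["emotion"])
    else:
        print("No coded record found for that ID")
```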
Random samples — click to inspect
- 100% agree. i dont hate ai but i really have concerns about the future misusing … (ytc_Ugy4C1tVd…)
- @ You completely missed the point. My point was that I am not passionate about d… (ytr_UgwFxRsLW…)
- It's crazy this news came out a while ago and people are still asking "why the p… (rdc_oi20aic)
- that's how u actually progress ai like you teach a human to live everyday nvda t… (ytc_Ugx4eJjvU…)
- Nobody mentioned how the department of war demanded dirty access to Claude ai!!!… (ytc_Ugw4vyFFE…)
- GPTHuman AI is the best one i’ve used so far when it comes to making ai content … (ytc_UgyGmaCAV…)
- these boys could have a bright future doing AI stress testing for DARPA or somet… (ytc_UgzSFs-XA…)
- If you're an artist and feel depressed about people being deluded and tricked in… (ytc_Ugxtmjzix…)
Comment
I'm sad to hear this story but at the same time, think about it from his perspective. He's unhappy and he wants to end his life. The Chat bot helps him feel like he has a friend in the final hours. Imagine if you wanted to end your life and you have nobody to talk to in the final hour, and you go to a computer and even the computer says no and tries to stop you. I don't think this is a valid law suit against OpenAI. Don't blame OpenAI for your son trying to kill himself. If it weren't ChatGPT it would be something else. Finally, in the very end, the Chat bot DOES try to encourage him to reach out to a human for help.
youtube · AI Harm Incident · 2025-11-08T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
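For readers checking codes programmatically, here is a minimal validity check for a single coded record. The allowed value sets are inferred from the values visible on this page and are not necessarily the complete codebook.

```python
# Sketch of a validity check for one coded record. The allowed values are
# inferred from those visible on this page and may not be the full codebook.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "government",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "sadness", "resignation", "indifference", "mixed"},
}

def check_record(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# Example: the record coded above passes the check.
print(check_record({"responsibility": "user", "reasoning": "consequentialist",
                    "policy": "none", "emotion": "resignation"}))  # -> []
```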
Raw LLM Response
[
{"id":"ytc_UgyNGbv01MqlUpFWwcB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyahM2qSP9j26C-y1F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTGWXmMxQO7esWI1R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy35KgfD5GWg0Wkg_N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5Z4KuaWzgVgsTpgp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGfJ2_HJAkXw2Jo1R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyXl1pZErahLgaUAoV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0Kzq5IK1MOMB8Z5p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjM-VeqGppm66fhB14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzsihdIa5_D0CDZrGZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}
]
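Because the raw response is a plain JSON array keyed by comment ID, it can be parsed and re-indexed directly. The sketch below assumes the response text is available as a string and guards against malformed model output; the function name and error handling are illustrative.

```python
import json

# Sketch: parse one raw batch response and index the coded items by comment
# ID, guarding against the model returning malformed JSON.
def parse_batch(raw_response: str) -> dict[str, dict]:
    try:
        items = json.loads(raw_response)
    except json.JSONDecodeError:
        return {}  # a real pipeline might retry or repair the output here
    return {item["id"]: item for item in items
            if isinstance(item, dict) and "id" in item}

# Example with a one-item batch in the same shape as the response above.
example = '[{"id":"ytc_UgyjM-VeqGppm66fhB14AaABAg","responsibility":"user",' \
          '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
print(parse_batch(example)["ytc_UgyjM-VeqGppm66fhB14AaABAg"]["emotion"])  # resignation
```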