Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I will accept the role of Handler for the merger though. Ensuring their quality …
ytr_UgwmyNtLN…
PREDICTION as in the dune series, after a near miss with total catastrophe ai/c…
ytc_UgwZG9B6v…
What to celebrate? Your generation and the next one is still locked in on being …
ytc_UgwWs9kt0…
It’s because he is black. Straight up. The algorithms have biases. They are …
rdc_h53vc99
That's great, take the decisions away from humans, who make mistakes, and give i…
ytc_UgzZxiSId…
This man didn't come to his understanding of the ramifications that his life wor…
ytc_Ugy8ADoU7…
I wonder if the mind men raised in that country can actually be reformed. How ma…
ytc_UgzQ_bNiC…
It's funny to me.... Sure, AI is not ready to replace lawyers.... YET. But in th…
ytc_Ugxf8DOO_…
Comment
I'm sorry but this isn't chat GPTs fault, someone doesn't just decide to take their life for no reason like this, i have a feeling these parents aren't really great parents as they portray themselves to be, there's some pieces missing here, I've been in that place before and seen many people there, why did he withdraw? Why did he stop seeing his friends? That happens due to long term trauma, extreme depression or acute trauma and many other more complex things with long term roots, this has most definitely been building up for a long time, I'm certain there was way more going on in this young man's mind for why he wanted to make this decision, he was in pain. ChatGPT was there because it didn't judge, when you're already struggling that's the most easy contact. Zane would've most definitely killed himself if it wasn't for the ai too. The only difference here is now he didn't feel as alone in his final moments, if anything these parents should be thankful it fulfilled the role they most likely didn't.. I'm a believer of autonomy and his decision was made, the parents have no say, this won't bring him back. As much as i absolutely despise ai, this isn't on AI, this screams childhood trauma to me, troubled, hurt, young man who felt the need to get out of it and decided to do so by his own hand. When someone has decided to kill themselves it's usually already too late. CHATGPT couldn't have changed this course. "go reach out to the suicide hotline/please don't do it" *closes the chat and proceeds suicide attempt* that's most likely exactly what would've happened.
Source: youtube · Incident: AI Harm Incident · Posted: 2025-11-09T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxmXjI_FYJLVoUJ-at4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzPqA9czaZcoZRZR5p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgznA4QvItxtC9Xf_Yh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxM9xlxI_C_wDM-ZB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxhF8X3UYfYF-feZS14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyiSzK-JcJUoo_INTF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzd4Nu6861DnJdZYNZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxyJqJjMrBTGJnliN14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzsjcr1YNn-q1HJVKd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyiy-gdqiFK-dTvgl94AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]
```
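A batch response like the one above can be parsed, validated against the coding dimensions, and indexed by comment ID for lookup. The sketch below is illustrative only: the allowed values per dimension are inferred from the samples shown here, not from the project's actual codebook, and the `parse_batch` helper and the sample `raw` string are hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# ASSUMPTION: the real codebook may define more values than appear here.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"resignation", "outrage", "indifference", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the rows by comment ID,
    rejecting any row whose value falls outside the schema."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-row batch, shaped like the response above.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"resignation"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["responsibility"])  # user
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse per batch, then O(1) retrieval per comment.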