Raw LLM Responses
Inspect the exact model output produced for any coded comment.
Look up by comment ID
Random samples — click to inspect
The day when AI pilot ( aeroplane) robots come.....Then will discuss this topic …
ytc_UgwgvZHsc…
Like Social Media, AI's penetration of misinformation and political persuasion w…
ytc_UgySm_xZM…
I've seen assignments that have been "Do the work yourself, then ask AI, include…
ytc_UgwlK14pE…
lol id love to see AI try to do my job lmfao XD as a techy I still can't see it …
ytc_Ugw_JGxxc…
@boardcertifiable Never be sorry for something that is important to you!
And by…
ytr_UgxqOXcy-…
Humanity should not solely dependent on AI's assistance. AIs could be utilized i…
ytc_Ugz4W1trc…
Ok Ai peeps then they'll fix the eff upped parts that have been generated in so…
ytc_UgyVW9qlW…
ChatGPT is a mirror it feeds back what you think and what you believe. It mimics…
ytc_Ugwasoupe…
Comment
Condolences to the parents, and may the kid's sould rest in piece, but let's not act like Chat GPT is the sole culprit here. First it's a robot. A robot doesn't think, it just gives you the answer you wanna hear. It has no opinion. Second, the teen had probably already very few will to live if the robot was enough for him to take his life. He needed profesionnal help. I think it was a bad enough choice for him to ask a bot about professional advice, but maybe he was just desperate. Third and mostly, I blame the parents. So many parents nowadays just think that raising a kid is just living with him. Raising a kid is about teaching him values, morals, advice from past experience, caring about him, trusting him, helping him. If the boy wanted to kill himself because of familial stuff, the parents are responsable as they could not provide a stable and safe environnement for their child. If the boy wanted to kill himself for external reason, the parents were supposed to be trustworthy enough so that he could talk to them and they could help him, like advicing him to go see a psycologist. But they didn't and they just blame the robot. If you're a grow ass adult with years of experience and a machine had more influence on your son than you, you're the problem. How can you lack that much common sense ? They are the ones who should be investigated. But I guess the loss of their son is sad enough for them. Anyway peace to the boy and strenght to his family
youtube
AI Harm Incident
2025-09-02T15:3…
♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy6idUAWpdKt8sNB754AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxjt4p7G0mlH8esAMR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwaMtPRk4nITtGd9EJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrX_wow6j-u5X1AHx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw5jQZIFWLZRIIc8Y94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxa0UuG-W0us9waunR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwlRAm7cuBElwBNn1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwU-yMtG_Bc8zT42Xl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDVU11LKzT6wwcaI54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeVkL-Yw0T0WdLTIF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
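The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of a parser that indexes codings by ID and validates each dimension against the vocabulary visible in this dump (the code book below is inferred from these examples, not the tool's actual schema, and `parse_response` is a hypothetical helper):

```python
import json

# Hypothetical code book inferred from the codings shown above;
# the real coding scheme may allow values not seen in this sample.
CODE_BOOK = {
    "responsibility": {"user", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "ban", "unclear", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "resignation", "approval"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError on any value outside the code book, so a
    malformed model output fails loudly instead of polluting the data.
    """
    codings = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in CODE_BOOK.items():
            if item[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {item[dim]!r}")
        codings[cid] = {dim: item[dim] for dim in CODE_BOOK}
    return codings

# Usage with a one-element batch (dummy ID, same shape as the dump):
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')
codings = parse_response(raw)
print(codings["ytc_example"]["emotion"])  # outrage
```

Validating at parse time is what makes the "look up by comment ID" view trustworthy: any coding that reaches the table is guaranteed to use only known dimension values.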