Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugzhum10D…`: That's great and all, but most kids aren't going to be like "Well I need to lear…
- `ytc_UgwFivyPn…`: There is a difference between self-awareness and being conscious in the AI conte…
- `ytc_Ugz4hPPem…`: And then we get Level 6: Knight Rider (except for the turbo-boost and other phys…
- `ytc_UgxO8GFKe…`: Waymo saved the kids life. Unlike what happened in burlingame killing a 4 year o…
- `ytc_Ugz_nNPTa…`: AI will wipe out the working class. At this moment, the 2nd largest healthcare …
- `ytr_UgzQ-n03d…`: Hey there! We get that Sophia can be a bit eerie, but her insights are definitel…
- `rdc_ich04w6`: I think that we are potentially close to having conscious AIs, but this AI is no…
- `ytc_UgyIbYyV3…`: The machine follows the programmer. If the programmer is racist, sectionalist, …
Comment
> Is it chat gpt's job to give people psychological adivce? Doesn't seem like it to me. If you want someone to talk you off the ledge, get actual help. This person was suicidal, it's not chatgpt's job to clock that. It's a chatbot, it doesn't reason nor is it sentient like another person. Get professional help if you are struggling mentally, chatgpt is not the move in that situation, either way. I hope they win and feel relief, but this kid was playing with fire asking a chatbot if he should do it. They're nothing more than learned language models, they string words together as cohesively as possible without any thought, that's all they do.
>
> They are not smarter than us, they can just sort through a lot of data much faster. But that data can include cynical human activity as well. An ai can model someone who is maliciously complying, but it's also not actively trying to make you harm yourself. Nor can it change your mind if you already intend to. You can probably emotionally manipulate another person into caring for you and investing themselves into the conversation, but not an LLM. The ai doesn't care either if you sue sam altman. Suits it fine. You know how many people are just looking for a friend and stumble onto chat gpt, but it's not it's job to be that.
>
> It's a tool. If someone with a dead skin mask uses a chainsaw to go on a massacre in texas, are you gonna sue dewalt for making it too sharp?
youtube · AI Harm Incident · 2025-11-12T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgydCcYtE-6y-R09zw54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"heartbreaking"},
{"id":"ytc_Ugy_XC7jOzB9daSjb454AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsvYIZH4qRf2-3EcJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzL-1CHjiAc1fS9kZd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwN2KzvwYjdMRilB3t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxkQVybRzQ4scgixkt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYwUoKpjbFyqXg6sp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyudHZsoYX241Wr40R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZg8uCwNtO1Zbo0ud4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxfxRL9gNFGcOuk7z94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
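The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal parsing sketch, assuming the dimensions and allowed values shown on this page (the full codebook may define more categories, and `parse_batch` is a hypothetical helper, not part of the tool itself):

```python
import json

# Allowed values per dimension, inferred from the codes visible on this page.
# "emotion" is left unvalidated: the observed labels (heartbreaking,
# indifference, mixed, resignation, outrage) suggest an open-ended set.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting
    any record whose constrained dimensions fall outside the known values."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# Example lookup by comment ID (the ID here is illustrative):
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
codes = parse_batch(raw)
print(codes["ytc_example"]["responsibility"])  # -> user
```

Indexing the parsed records by ID this way supports the "look up by comment ID" view above, and the validation step surfaces any off-schema value the model emitted before it is written back as a coding result.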