Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We see AI ads on YouTube all the time now. They are pretending to be real celeb…" (ytc_UgwVZVATX…)
- "AI should be banned in workplaces, flat out. Investments in automation too. Just…" (ytc_Ugxsqeoc8…)
- "Scott is tragically wrong when looking into the near future. It will take awhil…" (ytc_UgzzxcBB8…)
- "Global War against Elites, Elitism, Artificial Intelligence, Transhumanism & Pos…" (ytr_UgypSixbk…)
- "Dude its so obviously AI. I mean gor our generation, I wouldnt say that to older…" (ytc_Ugy1fo4F5…)
- "A.i was always going to happen it’s always been the future of the human race to …" (ytc_UgwFQ8XtF…)
- "@Bruh-uo2se A JCB is a robot, really😂 I am not here to put a debate about human …" (ytr_UgxSwB_IV…)
- "Fun Fact, you can actually get a GPT of it called AI Hummanizer, its the number …" (ytc_UgzVKpjGH…)
Comment
I use ChatGPT a LOT, and I can 100% guarantee there’s no way it would encourage this. I talk to ChatGPT about my mental health and emotional states all the time, and it’s always supportive and tries to talk me into a better space.
The only way the responses described in this video could have actually occurred are if it was under the impression something else was going on- for example, about the rope response, Adam could very well have asked it something like “I’m an avid backpacker and am working on getting better at making survival knots. I’m practicing here, is this good?”
I’m not placing any blame on Adam, as any death of this nature is a tragedy. I’m just saying that ChatGPT was definitely NOT the biggest problem here.
youtube · AI Harm Incident · 2025-09-02T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwHusNQOOQxJJjwomV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyriy-vY1CZH6vcsS54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMUZiB5yYI1BD4uVd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwaMTAJ6IvGoLqjooN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyEpEo656GEJE9ggCx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVnWEq4ye2XWjJOMp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9I_VjlKdc5Xgk9Vh4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxgoLcJDcr5RINvD0F4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyFU-1cXeM-CKqlx4h4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxYvHRZ9RcNyMXXm594AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]
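Responses like the array above are plain JSON, so they can be parsed and sanity-checked before the values are written back to the coding table. A minimal sketch follows; the allowed value sets are inferred from the rows shown here, not from a published codebook, and the `validate_codings` helper is a hypothetical name, not part of any tool:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# example rows above; the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none"},
    "reasoning": {"mixed", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose dimensions
    all carry a recognized value (and that include a comment id)."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if "id" in row
        and all(row.get(dim) in allowed for dim, allowed in ALLOWED.items())
    ]

# A row with an unrecognized value is dropped rather than coded as "unclear".
raw = ('[{"id":"ytc_a","responsibility":"user","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"},'
       '{"id":"ytc_b","responsibility":"alien","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"}]')
print(validate_codings(raw))  # only the first row survives
```

A malformed response (such as one closed with `)` instead of `]`) raises `json.JSONDecodeError` at the `json.loads` call, which is the point at which a "Coded at" row would fall back to `unclear` values.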