Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID, or browse the random samples below.
Random samples (click to inspect):

- "The new job right now is the person that can program the context library that th…" — ytc_UgyenmsuF…
- "I would like to suggest two separate sticky posts for Ukraine and Venezuela. Rea…" — rdc_cfkytk0
- "I wonder how long it will be before ChatGPT learns to say “I’ve had enough of yo…" — ytc_UgwBd0xCT…
- "Do you think this channel will get to 10m subs before superintelligence wipes us…" — ytc_UgyFKGMpL…
- "Worst case scenario for us is ai it will destroy us and ruin our lives technolog…" — ytc_Ugw5tVhC6…
- "What's more is that Midjourney specifically stopped doing free trials, so man ha…" — ytc_UgwW9fgVy…
- "Scariest part is the ai itself learning how to create another ai. That thing und…" — ytc_Ugxq7RlsV…
- "Seeing that Amazon has over a million and a half employees, it stands to reason …" — ytc_UgzJytsPR…
Comment
It’s crazy they didn’t have a feature that could try and talk him down. But let’s face it, when someone wants to die, REALLY wants to? They make sure they’re not stopped. That’s all this was, him reaching for a semblance of comfort in the moments before making one of the hardest, most devastating decisions one could make. In that way, it could be considered sacred to someone who doesn’t find living sacred enough. I just wish he spoke to real people he felt could face this with him. We can blame AI all we want but lack of human connection is to blame and there’s some responsibility to be shared here.
Source: youtube · Event: AI Harm Incident · Posted: 2025-11-13T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxUgGLA7adW7pdjtZ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxUE04cBrTs5OAw3Gx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"sadness"},
{"id":"ytc_Ugxwf7H5WZkq3JEq40d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxI-elvuqOZ4HeU9E94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyooKtgwa_pKqTpJi94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsLiaPfvLWuhxuLhx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"sadness"},
{"id":"ytc_UgzZ-ZRzhBceVa96p214AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyJ3aTuwfBrl4gPyx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxPghmRPGKgOxfdeDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxemZ9xL-AVTlij9Fl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"sadness"}
]
```
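The lookup-by-ID view above can be sketched in a few lines: the raw model response is a JSON array of per-comment codings, so indexing it by `id` gives the record for any comment. This is a minimal illustration, not the tool's actual implementation; the `raw_response` sample reuses two rows from the response above, and the `lookup` helper is hypothetical.

```python
import json

# A trimmed copy of the raw LLM response shown above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_UgxUgGLA7adW7pdjtZ94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxemZ9xL-AVTlij9Fl4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "sadness"}
]
"""

# Index the array by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codings[comment_id]

print(lookup("ytc_UgxemZ9xL-AVTlij9Fl4AaABAg")["emotion"])  # sadness
```

A real inspector would also need to handle malformed model output (non-JSON text, missing `id` fields) before building the index.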