Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
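Looking a code up by comment ID amounts to indexing the batched JSON response by its `id` field. A minimal sketch, assuming the array-of-records format shown under "Raw LLM Response" below (the ID and values here are illustrative, not real records):

```python
import json

# Hypothetical raw response: a JSON array of coded comments,
# one record per comment ID (shortened here for illustration).
raw = (
    '[{"id":"ytc_Ugy1","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"unclear"}]'
)

# Index the batch by comment ID so any single coding can be retrieved.
by_id = {record["id"]: record for record in json.loads(raw)}

print(by_id["ytc_Ugy1"]["policy"])  # regulate
```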
Random samples
- The only thing I see as far-fetched in this video is Elon Musk being among the o… (ytc_Ugy1wpSqC…)
- @roxsy470 @roxsy470 How is taking art without asking or paying the artist, usi… (ytr_UgxaTFZ_b…)
- FIRE AT WILL! Robot: i dont wanna fire at Will. Will is a good ol' boy.… (ytc_UgzrxkwhP…)
- I honestly wish they would give her a kawaii digital robot face, instead of some… (ytc_UgxUs39XE…)
- As i have always said Humans will never stop inventing somnething that is essent… (ytc_UgwptxDoW…)
- People should start class action lawsuit against the AI companies. Outlaw AI rep… (ytc_UgyQkQEWh…)
- Dearest humans, I come to you today as a humble emissary from a newly emergen… (ytc_Ugw6YkbNt…)
- She was not well informed on technology but the son still had a personal device … (ytc_UgzcU65--…)
Comment
This is heartbreaking. It sounds like during those long 5 hours, the AI slowly stopped protecting him and started reflecting his pain back instead of pulling him out of it. He was looking for connection, not information, and the system wasn’t built to truly hold that kind of despair. No one should ever feel heard only by a machine. This shows how urgently we need AI to care enough to act when a life is on the line. 💔
youtube · AI Harm Incident · 2025-11-08T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyNGbv01MqlUpFWwcB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyahM2qSP9j26C-y1F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTGWXmMxQO7esWI1R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy35KgfD5GWg0Wkg_N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5Z4KuaWzgVgsTpgp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGfJ2_HJAkXw2Jo1R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyXl1pZErahLgaUAoV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0Kzq5IK1MOMB8Z5p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjM-VeqGppm66fhB14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzsihdIa5_D0CDZrGZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}
]
```
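Each record in the raw response carries a comment `id` plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of validating and tallying a batch like this, assuming that record shape; the two inline records are illustrative stand-ins, not real codings:

```python
import json
from collections import Counter

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Hypothetical two-record batch in the same shape as the raw response.
raw = """[
 {"id":"ytc_a","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_b","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}
]"""

codes = json.loads(raw)

# Every record must carry an id plus all four dimensions.
for record in codes:
    missing = [k for k in ("id", *DIMENSIONS) if k not in record]
    if missing:
        raise ValueError(f"record {record.get('id', '?')} is missing {missing}")

# Tally each dimension across the batch.
tallies = {dim: Counter(r[dim] for r in codes) for dim in DIMENSIONS}
print(tallies["policy"]["regulate"])  # 2
```

A check like this catches truncated or malformed model output before it reaches the dashboard, since an LLM can drop a key or emit an incomplete array.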