Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Even the Liberals in Tennessee are conservative so that's an interesting perspec…" (ytc_Ugx81G3x2…)
- "Sofia is not a robot, no, she is not, she is a android programed to think for he…" (ytr_UgxQm0rBU…)
- "AI catching strays about God not wanting robots lol. ChatGPT- why he say forge…" (ytc_UgwhWkhTI…)
- "Ai is used in the military, which within its self is doom and gloom, its once ai…" (ytc_Ugxxg1eNN…)
- "AI has so far proven to be garbage in my profession. Too many errors, you just s…" (ytc_Ugw8NmXoP…)
- "ai is like a pet, you want to feed it always so it sure to be bigger than the ne…" (ytc_UgzLiD8fU…)
- "We still could very well all die - AI alignment is an unsolved problem and if yo…" (rdc_nclese3)
- "Don't worry about AI.. Worry about AS.. Artificial Sentience.. But that'll never…" (ytc_UgwVIHwfo…)
Comment
> Chat GPT says the followinf.
> It doesn’t look like his name was ever published — the medical write-up and most coverage describe him only as a 60-year-old man (often reported as in the U.S.) whose case was anonymized.
> The story: he wanted to cut out table salt (sodium chloride) and (apparently after consulting ChatGPT / reading AI-influenced info) he replaced it with sodium bromide for about three months, which led to bromide toxicity (“bromism”) with symptoms like paranoia, hallucinations/psychosis, and insomnia, landing him in the hospital.
youtube · AI Harm Incident · 2025-11-25T09:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfP-sb-Wcsqas8_-p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyrNaRDgHaqnr3uFgN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzTwHcQKbDCZwKtydV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxQ6xP6vP3pfwts9Ad4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgziGOKkuG7_DLguxrJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgypgwDOcCRvba3p3DF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxsfo3Je48bkuo_hbJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwvBGKhwGXXOl35j1R4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwZLgzdY5rCA7370FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgzKACa5tI91hYNtFeJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
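Each raw response is a JSON array of per-comment records keyed by comment ID, with one value per coding dimension. A minimal sketch of parsing such a response, validating each record against the dimension values that appear in the samples above (an assumption — the real codebook may define more values), and indexing the results by ID for lookup:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may include additional values.
CODEBOOK = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability",
               "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

# A one-record example payload in the same shape as the raw response.
RAW = '''[
  {"id": "ytc_UgzTwHcQKbDCZwKtydV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM response and index validated records by comment ID."""
    records = {}
    for row in json.loads(raw_json):
        # Reject any record whose dimension value is outside the codebook.
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row['id']}: invalid {dim!r} value {row.get(dim)!r}")
        records[row["id"]] = row
    return records

coded = index_by_id(RAW)
print(coded["ytc_UgzTwHcQKbDCZwKtydV4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: once the batch is validated, any coded comment resolves in a single dictionary access.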