Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "wrong, it just does what they tell it to do lol, based ai, it does it exactly be…" — `ytc_UgxbTnIN3…`
- "Gave power to life on the image. Thats exactly what AI does. You can make a huma…" — `ytr_UgxYJDDJF…`
- "@garyestcourt2377 _if the Tesla needs a guy sitting in the passenger seat wtf is…" — `ytr_UgwsECeLH…`
- "It’s simple really. The man was secretly a box of vegetables disguised as a man,…" — `ytc_UgywJx1tA…`
- "This is a good discussion. AI is hugely dangerous. The people like Sam Altman a…" — `ytc_UgyG4IuOQ…`
- "Robots are AI and aren’t able to feel human emotion which includes, but is not l…" — `ytc_UgysyxFtl…`
- "All this energy, noise and water to generate AI garbage. What a depressing time …" — `ytc_UgyYkAKF_…`
- "> We are looking at a future where AI-driven combat systems and drone swarms …" — `rdc_ohvpz0k`
Comment

> Literally caught a doctor prescribing me the incorrect medication due to AI. Went to another doctor and they confirmed that it was the wrong medication for the infection and they also gave another medication that was contraindicated with the antibiotics they prescribed which could have led to sepsis… it’s not black and white.

youtube · AI Harm Incident · 2026-01-22T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwrhhXFt70_E7ad25d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugxn5797sryiA-qEcxh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugz01P4JHO_knrNuSch4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgwDR5g7_wFyEKX7XyN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_Ugwd_nZUbQsa0go-O4N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzKr-ahiqdVy7Pm2fh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzmyrdSJNTXnWYwIQt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugxx2_5b0dT60jFov-t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgyMK7ZIce0Hhh3dJVt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzbgLZQSKR82r1DOzZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
```
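A raw response like the one above can be parsed and indexed by comment ID to drive the lookup shown at the top of the page. The sketch below is a minimal, hedged example: the `SCHEMA` of allowed values is inferred only from the responses shown here (the actual codebook may define more categories), and `index_codings` is a hypothetical helper, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the responses above.
# Assumption: the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "ban", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "outrage", "mixed", "resignation", "indifference", "fear"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) and
    index it by comment ID, dropping rows that fail schema validation."""
    coded = {}
    for row in json.loads(raw_response):
        cid = row.get("id")
        valid = cid and all(row.get(dim) in allowed
                            for dim, allowed in SCHEMA.items())
        if valid:
            coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Look up one coding by comment ID (sixth object in the array above).
raw = ('[{"id":"ytc_UgzKr-ahiqdVy7Pm2fh4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"outrage"}]')
coded = index_codings(raw)
print(coded["ytc_UgzKr-ahiqdVy7Pm2fh4AaABAg"]["policy"])  # liability
```

Validating against the schema before indexing means a malformed or hallucinated row from the model is silently skipped rather than corrupting the coded dataset.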