Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgzK4ZN6o…` — "Men and women need to stick together. Why deal with a real marriage when you ca…"
- `rdc_oi1zlqk` — "They already own 14% of anthropic and have for years. The game plan is that they…"
- `ytc_UgxL6HrHU…` — "The whole world should stop working… they won’t be able to boulot robot to repla…"
- `ytc_UgzTECCB-…` — "lol tim trying to have an informed discussion while his burnout friend talks abo…"
- `ytc_UgyOW2Yzz…` — "AI was just a facade for shareholders, most if not all those people were quietly…"
- `ytc_UgwZU_X54…` — "Smart enough to invent artificial intelligence but dumb enough to think we need …"
- `ytc_UgyWuyoEo…` — "This is what happens when you give white people access to AI they destroy everyt…"
- `ytr_Ugy4VFWIE…` — "That is ridiculous and you know it. Artists don't have to pay everytime they tak…"
Comment
Well suicidal thoughts may exist in some people whether they use AI or not, but on some occasions, ai can make them lean towards the suicidal thoughts unnecessarily. I completely understand your point, don't stop someone from doing something they truly want, but if a bot is ENCOURAGING it, that's where it's dangerous. Ai doesn't know the age of the person behind the phone, what if it was a child they were saying these things to? There's too many factors that go into it, but the point is, no ai should be encouraging delusions or suicidal thoughts, it's harming many many people. Just because the companies want more money.
youtube · AI Harm Incident · 2025-11-07T20:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
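The four coded dimensions can be sanity-checked against a fixed set of allowed values before a row is accepted. A minimal sketch follows; note that the value sets below are only those *observed* in the responses shown on this page, so they are an assumption and likely incomplete compared to the actual codebook.

```python
# Hedged sketch: validate one coded row against the dimension values
# observed in this batch. ALLOWED is inferred from the page's data, not
# from the real codebook, and may be incomplete.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "none", "unclear", "industry_self", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "mixed"},
}

def invalid_fields(row):
    """Return dimension names whose value is missing or outside the observed set."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding result shown above passes this check.
row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "fear"}
print(invalid_fields(row))  # → []
```

A row with any unrecognized or absent value would be flagged by name, which makes it easy to route malformed model output back for re-coding.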
Raw LLM Response
[
{"id":"ytr_UgxqsVUtsAIJW8drb7R4AaABAg.APEfQILjkCnAPF73BbcVvG","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxqsVUtsAIJW8drb7R4AaABAg.APEfQILjkCnAPJ6i3RitF0","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzWG0D_3cRNezh65vV4AaABAg.APEfKFG1efsAPMWPpSuEMg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzWG0D_3cRNezh65vV4AaABAg.APEfKFG1efsAPMYYWYOUSD","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytr_UgwIkPeK6bOxsGQdw8J4AaABAg.APEecNDvBFZAPEuIU9_3M4","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwIkPeK6bOxsGQdw8J4AaABAg.APEecNDvBFZAPEzllBsyhT","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzZKa6AevIp4SMPu1N4AaABAg.APEeSuBg-EBAPEgNibxx-W","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugxp77jKXIBFNu2rmhB4AaABAg.APEeLOhJQ88API3ewXtNEa","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugwr--PlSRQfR6EOUr94AaABAg.APEdtMTRkLhAPEoj59oGxr","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugw2yCmmzbiRX3Xo8bZ4AaABAg.APEdlB8d1qkAPEnkv7rUd_","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
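The "look up by comment ID" view above amounts to parsing this JSON array and selecting the entry whose `id` matches. A minimal sketch, assuming the model output is valid JSON as shown (the `lookup` helper and the two-entry sample are illustrative, taken from the response above):

```python
import json

# Two entries copied from the raw LLM response above, truncated for brevity.
raw_response = """
[
  {"id": "ytr_UgzZKa6AevIp4SMPu1N4AaABAg.APEeSuBg-EBAPEgNibxx-W",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugw2yCmmzbiRX3Xo8bZ4AaABAg.APEdlB8d1qkAPEnkv7rUd_",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]
"""

def lookup(rows, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    return next((r for r in rows if r["id"] == comment_id), None)

rows = json.loads(raw_response)
coded = lookup(rows, "ytr_UgzZKa6AevIp4SMPu1N4AaABAg.APEeSuBg-EBAPEgNibxx-W")
print(coded["emotion"])  # → fear
```

In practice the parse step should be wrapped in a `try`/`except json.JSONDecodeError`, since an LLM batch response is not guaranteed to be well-formed JSON.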