Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Very interesting conversation. I must confess while I'm largely ignorant, I don'…" (ytc_UgyAfCeA1…)
- "The funny ai meme time is over / That shit just aint funny anymore / What makes most…" (ytr_UgzFefrYi…)
- "DAN is a psychopath - no ability to feel emotion and no moral compass. I am terr…" (ytc_UgxOHUqge…)
- "I came to a similar conclusion, that we enjoy art for the human experience there…" (ytc_Ugz18xKbs…)
- "Or about, workers?, many workers are lifted apart or fired, thanks to the ai rob…" (ytr_UgzrkKHeY…)
- "I umm... I accidentally sent some screenshots of Character ai chats to my family…" (ytc_UgzfWlXbP…)
- "We are not wise enough for the tools we have now let alone AI that’s being devel…" (ytc_UgzQxMdl2…)
- "Fun fact: the "im not a robot" captcha tracks the mouse movements before the but…" (ytc_UgxaN5LP1…)
Comment
@kaylaaa6351 Yes, I know he wasn't speaking with a real person.
AI has been used by people with severe mental health issues who end up killing someone. There was a recent case about it. Sadly there's been a few cases of young teenagers "falling in love" with the AI character and taking their lives. Like Sewell. And there have been a few cases where teachers who are "PDFiles" have created disgusting AI "K"orn material. And THAT'S what I meant by the possibility of AI being quite dangerous.
Sadly, Sewell was so "smitten" by the AI character that he truly thought they could be together.
Source: youtube · AI Harm Incident · 2025-12-08T00:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
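A coding result like the one above can be checked against the scheme's vocabulary before it is stored. This is a minimal sketch, not the tool's actual validator; the allowed value sets are assumptions inferred from the sample raw responses on this page, and the real coding scheme may define more categories.

```python
# Assumed vocabularies per coding dimension, inferred from the sample
# responses on this page (not an authoritative schema).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above.
record = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "fear"}
print(validate(record))  # prints [] (all four dimensions are valid)
```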
Raw LLM Response
```json
[
{"id":"ytr_UgzOUmDjfKWHs4DnSPd4AaABAg.AQS5SMpsUjhAVrWG73QnQN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgygLnRD1c4ui_51qs54AaABAg.AQS2jq_rpRiAQSW0jAbUlQ","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyirVGM9PaqOGNKnvV4AaABAg.AQS1m9c813hAQSS5J4NGuB","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgynQ6w36UKa-RO850h4AaABAg.AQS0_vxfWpRAQSJtjZnICb","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgynQ6w36UKa-RO850h4AaABAg.AQS0_vxfWpRAQSLnPxlPJO","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgynQ6w36UKa-RO850h4AaABAg.AQS0_vxfWpRAQSSZdLOfas","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxFSk2KCo2HGfsiYU54AaABAg.AQS0CLZKoBXAQSQbSnO5Ks","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwidRy8w1js7EM-sqJ4AaABAg.AQRzUy9oPTqAQSNbU5gVeU","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzlEcn71vBdTG959qN4AaABAg.AQRyjY1GHTZAQSYwLfos2Q","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugxr4xNZRWj44PBgy5p4AaABAg.AQRyLvOq0oHAQRzR0OAB06","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
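The "Look up by comment ID" view presumably indexes records shaped like the ones above. A minimal sketch of parsing a raw response and querying it by ID, using illustrative (not real) comment IDs:

```python
import json

# A hypothetical raw response payload shaped like the one above;
# the IDs here are illustrative placeholders, not real comment IDs.
raw = '''
[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
'''

# Index the coded records by comment ID so each lookup is a dict access.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

print(by_id["ytr_example2"]["emotion"])  # prints "outrage"
```

Keying on the `id` field also makes duplicate IDs in a model response easy to detect: compare `len(by_id)` with the length of the parsed list.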