# Raw LLM Responses

Inspect the exact model output for any coded comment; look up a record by comment ID, or browse the random samples below.

### Random samples
- "These huge companies that are sacrificing workers, to save a few billion are act…" (`ytc_Ugy9rYGXD…`)
- "what if combat video games are being used to teach AI how humans fight in order …" (`ytc_UgzwsRX4a…`)
- "i hate the that's a complex topic the problem is the a.i. has to do that or the …" (`ytc_UgyPbLc_t…`)
- "It's great to see a beautiful woman thinking beautiful thoughts. So often beauty…" (`ytc_Ugz6Ke4ut…`)
- "I love fast food, but I hate fast food workers. The sooner they automate the sys…" (`rdc_j3zl4jd`)
- "*me who just uses character ai for normal and funny rps*: ...wtf is wrong with y…" (`ytc_Ugxqme-Bl…`)
- "@zinxderobo lmao what an insanely unhinged thing to say to a random person, may…" (`ytr_Ugzlap5q4…`)
- "I know how AI would be most dangerous...but if i say it, someone will use my ide…" (`ytc_Ugyl7Xs8N…`)
### Comment

> @jamesbryson9542 The guy was mentally ill, so much so that he used an AI to treat his loneliness. In the end he used that AI for one more conversation before his final action. Men taking their own life is a huge problem, you not acknowledging the real reasons for the suicide doesn’t help with the numbers.

Source: youtube · AI Harm Incident · 2025-11-10T14:4… · ♥ 1
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
### Raw LLM Response

```json
[
  {"id":"ytr_UgyiKVzOsDScaRM6aM14AaABAg.APLod_ZtsbeAPlDEqyump5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwDzaWZUxwAt90ZeIp4AaABAg.APLoZ2rHVDBAPLsZFKLkM-","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyU6yVOi9EZGsTqfvp4AaABAg.APLnfXhBRIcASodz19XK_B","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugw0DKvdnoLRHDpwJMh4AaABAg.APLmmAz8167APZ742QTEcL","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyWIMioRFzIiutTKVJ4AaABAg.APLmWqmxZMjAPLzl88DfSF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugx6islBUQxnx7wsimN4AaABAg.APLmPvCj7y4APLoEjlbZtz","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugx6islBUQxnx7wsimN4AaABAg.APLmPvCj7y4APLtp9ndef3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx6islBUQxnx7wsimN4AaABAg.APLmPvCj7y4ASoe8d7BBXN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgzIA-pemSZGiFRqkch4AaABAg.APLmMzNrTHnAPlGthCKfeX","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxD8sKec1sUK-ZQaOV4AaABAg.APLlPC9JXu9APLludI9Uoi","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
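A batch like the one above can be sanity-checked before it is stored: parse the model's JSON, then drop any row with a missing `id` or a label outside the coding scheme. A minimal sketch in Python, assuming the label sets visible in these samples cover the codebook (the real schema may include more categories):

```python
import json

# Allowed values per coding dimension, inferred from the samples shown
# above (assumption: the actual codebook may define additional labels).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "unclear"},
}

def validate_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a comment ID to join back to the source comment.
        if "id" not in row:
            continue
        # Reject rows with a missing or out-of-vocabulary label.
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical example input, shaped like the raw response above.
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"resignation"}]')
print(len(validate_rows(raw)))  # → 1
```

Rows that fail validation would then be queued for re-coding rather than silently written with the bad label.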