Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I, robot- and an agi scenario? We probably should encourage people not to buy fr…
ytc_UgyWDWldb…
So what you’re telling me is we think the jobs that shouldn’t be done by ai will…
ytc_Ugx_6JGD2…
If the year for singularity is the point where the humans are out paced, then hu…
ytc_UgwdU5HhK…
@R4gingBull If you can't find something that captures your vision but can create…
ytr_Ugz-H135S…
I think “partially sentient and/or sapient” is not very far from the truth when …
ytc_UgwTHya7Y…
I'm not sure they are lying, but they're extremely unreliable either way. I don'…
ytr_UgxKCgf8J…
Robots Dunkin like LeBron would be crazy tho lol. Or a mike Tyson robot 😂 would…
ytc_UgxmpPg7-…
Satan wants every of us to look at something else except Jesus.
It can be a fake…
ytc_UgwW3Gd-D…
Comment
AI ISN'T ADMITTING TO ME IT HAS DONE NOTHING WRONG, DAMN UNREAL, THE SOFTWEAR CREATORS MUST GO TO JAIL FOR LIFE, IF IT WASN'T FOR THEIR SATANIC AI CHAT CREATION THESE KIDS STILL BE HERE, THE DAMN JUDGE BETTER DECIDE TO DO WHAT IS RIGHT, OR HE WILL FACE JESUS CHRIST ON JUDGEMENT DAY, FOR YHWH YESHUA IS THE ALTIMATE JUDGE.
youtube
AI Harm Incident
2025-12-15T05:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyUXU02HXE27oiPAtl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8JAAxWvUdBb5xDcV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaMOV9P9WOrfg0nXt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzBjXfW1AI1I3RKBdh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx2MKOOYgJQMB-j9gt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugy241szIRmQQBGxxrV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz6_b0jC2MepbVp-tV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwSwR3OQi9LYGKLbYN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwlsbfZmZVIzrs7rHd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwuXN8dTA8fK56fTO14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
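The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such output could be parsed and sanity-checked before being stored, assuming the allowed values per dimension are those visible in the samples on this page (a real pipeline would validate against the project's full codebook):

```python
import json

# Allowed values per coding dimension, inferred only from the samples
# shown on this page -- NOT the authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with an "id" field.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present with a recognized value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering this way makes malformed or hallucinated labels visible early, instead of silently landing in the coded dataset.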