Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgwHFKsgN…`: "And the curve reaches 100 percent by 2100😢 , Humans needs to regulate AI usage a…"
- `ytc_UgxNAXhes…`: "WOOO! And right after I delivered a speech to my class on AI-generated “art”—wh…"
- `ytc_Ugzs3Go3S…`: "So I regularly use Chat GPT and other chat bots for my work, Yes it has replaced…"
- `ytc_UgxLEcG5v…`: "There is no way that the writers and actors can prevent companies from using AI…"
- `ytc_Ugw6E6pQz…`: "Is it only me or do the first robot she showed us looks like james charles 😭?…"
- `ytc_Ugyis1Aat…`: "Everyone is so gullible these days. “The police did racist stuff, and used a ra…"
- `ytc_UgzrPBH6F…`: "If companies can replace individuals for AI people can replace companies. Why w…"
- `ytc_UgwEpUWIQ…`: "Claiming that a person is owed copyright overall a learning machine when human b…"
Comment

> dude once there starts being a dozen or more confirmed deaths by these robots thousands of anti robot extermination organizations are gonna pop up and were gonna have server warehouses being burned down, computer component factories being sabotaged, maybe even some car bombings in robot assembly factories, its gonna be absolute mayhem, the -United- Corporate States -Government- Regime is gonna REALLY regret not stripping away our 2nd ammendment rights sooner...

youtube · AI Harm Incident · 2025-08-22T08:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyWqX-ODSEaVaLAUpl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyopT-nteuXnWerEOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx9tRauzg-5lLnCALt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwmEOPzNNkBUmYKIGt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYVoVoBXPdkAeXL054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJGzyHzVsudnVSI5x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgymoIS9Ls7gjRM-_-B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzSYT0VUUPWq-kO4ll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDTy8QLNB1drl3JsR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgySTMtMdvyc-h3spgZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
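A raw response like the one above would typically be parsed and validated before the labels reach the coded-comment table. Below is a minimal sketch of one way to do that; the allowed label sets are inferred from the sample output shown on this page, not from a published codebook, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the real codebook may define more labels).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "none"},
    "emotion": {"outrage", "fear", "approval", "mixed",
                "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array, keeping only records whose
    labels all fall inside the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: a single well-formed record (hypothetical id) passes through.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"outrage"}]')
print(len(parse_coding_response(raw)))  # 1
```

Filtering rather than raising keeps one malformed record from discarding a whole batch; rejected records could instead be queued for re-coding.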