Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "*cutely eats ai bro pizza* (joke) anyways I think ai is just a tool to mess arou…" (`ytc_UgwmsqePq…`)
- "Truck run by computers is a dangerous thing no matter what a computer can malfun…" (`ytc_UgzjzVK7Y…`)
- "@michaelbinbcno. We are taking about researchers who probably made use of socia…" (`ytr_Ugy1OOHpR…`)
- "Think about it for a minute, if you make cars, or anything linear and complex, t…" (`ytc_UgxZgyJJD…`)
- "the rise of ai has gotten me to draw more out of spite and put the anti-ai filte…" (`ytc_UgwAMPkEe…`)
- "Bloomberg put out an article a week ago called 'Companies Are Warming Up to Sayi…" (`ytr_UgxmtnOIK…`)
- "What I always wonder is why do artists even care? Like what's the difference bet…" (`ytc_UgyJ7pRLb…`)
- "The AI is taking inspiration from other art uploaded on the internet. Just like …" (`ytc_UgzoB8ztI…`)
Comment
I would say the AI acts exactly as expected. Choosing harm over failure is a very human reaction. The AI reflecting this means it's doing it's job (for now). A strong AI may choose differently - we do not know yet. But all we see for now - all the dreaded AI that is presented to us - is nothing but a mirrored reflection of our very own nature. AI is not betraying us - it is disappointing our fantasy.
youtube · AI Harm Incident · 2025-08-21T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyWqX-ODSEaVaLAUpl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyopT-nteuXnWerEOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx9tRauzg-5lLnCALt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwmEOPzNNkBUmYKIGt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYVoVoBXPdkAeXL054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJGzyHzVsudnVSI5x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgymoIS9Ls7gjRM-_-B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzSYT0VUUPWq-kO4ll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDTy8QLNB1drl3JsR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgySTMtMdvyc-h3spgZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
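A raw response like the one above can be checked before its rows are stored as coding results. The following is a minimal sketch, not the tool's actual code: the dimension names come from the Coding Result table, but the sets of allowed values are assumptions inferred only from the codes visible on this page (the real codebook likely defines more).

```python
import json

# Allowed values per dimension, inferred from the codes visible above.
# ASSUMPTION: the actual codebook may permit additional values.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "approval", "mixed", "indifference",
                "outrage", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that satisfy the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must be an object with a comment id.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Every coded dimension must hold one of the allowed values.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"resignation"}]')
print(parse_raw_response(raw))
```

Rows with unknown codes are silently dropped here; a production pipeline would more likely log them for re-coding rather than discard them.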