Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Is ai art even art? Can we just call anything Art now a days? Grab a banana, sti…
ytc_UgxDngEu6…
It's not just boring, it's bad. AI ''artists'' aren't artists. 99 percept of the…
ytc_Ugydu4P-h…
If most of the jobs are gone in the future because of AI then people don't have …
ytc_UgwGigSwH…
Using AI art for reference/tracing is all acceptable. Besides, no one made it an…
ytc_UgzkwSlo0…
Bro i tried having a heart to heart conversation with the ai and it suddenly sta…
ytc_UgzmoT1hs…
I really don’t understand how anyone thinks self driving cars are a good idea, m…
ytc_UgxYa2ly2…
The argument that AI training data can "simply" exclude poisoned art is ludicrou…
ytc_UgwBO04b1…
It is stolen not by the ai but by the corporations
Most artist didn't agree for…
ytr_Ugy_9q6Tm…
Comment
Considering how many pieces of media we have that show why unchecked AI are nightmarishly bad, you'd think people would be keen on making sure their robots can follow the ruleset that Asimov set. But NOOOOOO we gotta be able to give it guns and free will. The basilisk is stirring, and if nothing is done, we're all going to suffer for the sake of these tech bros padding their bank accounts
youtube
AI Harm Incident
2025-09-11T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwK-au70F1BsVfTM3J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugys0vulou7oAA5j9K54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWJPRYcImshKRiEdJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxewr6eIj6pAjSWEap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzNoSTdFq4Qe6_bLJN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgweWrY_0dBqkjl_m7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz3SMmYhiCsNIxCbLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzFf0Foa5oQhCutxOZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1blCldt9jCKuAe0V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"approval"},
{"id":"ytc_UgzwQ_jmfd3U0csOGvJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
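The "Look up by comment ID" step above can be sketched as a small parser over a raw response like the JSON array shown: parse the batch, key each record by its `id`, and pull out the four coded dimensions. This is a minimal illustration, not the tool's actual implementation; `index_by_id` is a hypothetical helper name, and the two records are copied from the response above.

```python
import json

# Raw LLM coding response (two records excerpted from the batch above).
raw_response = """[
{"id":"ytc_Ugz3SMmYhiCsNIxCbLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgweWrY_0dBqkjl_m7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

def index_by_id(response_text):
    """Parse a coding response and key each coded record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw_response)

# Look up one coded comment, as the inspection view does.
record = codes["ytc_Ugz3SMmYhiCsNIxCbLR4AaABAg"]
print(record["responsibility"], record["policy"])  # developer regulate
```

Keying by `id` makes the lookup O(1) per comment, which matters when cross-referencing sampled comments against large coding batches.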