Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
I don't have a problem with Greta, but I don't understand how there isn't someon…
rdc_fap7j26
So why use ai in the firstplace? Shut that ish down so actual PEOPLE can continu…
ytc_Ugyu8Tdiy…
Also false equivalence since using ai graphics on a handmade character isnt the …
ytr_UgwA7iE_D…
For now AI can't be controlled by itself, it needs a human to do something. The …
ytc_UgyfzuhFC…
Coming automation wave? Automation has been going on for my whole life, I'm 53. …
ytc_Ugx7K-w3N…
Let's talk about how overleveraged AI is. As much I wish an AI company would bit…
ytc_Ugy6vZRvy…
I just let the AI play the part of a very knowledgeable friend- its my AI friend…
ytc_Ugy4YJktw…
On the point of artists taking elements from their past and lives and putting it…
ytc_UgxWQ21nz…
Comment
As somebody who dabbles a fair bit in machine learning algorithms, WHY IN THE FUCK ARE YOU USING IT TO MAKE ARRESTS? We are not at that stage yet dude. They are good for business and convience but by no means are they ready to be used for our criminal justice system because they simply are not accurate enough for that yet
youtube
AI Harm Incident
2021-04-29T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy0OU1nw5GJoGq7Qfl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqDyFa2aFxTRHo5QZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw6pm3nPVfjzfQVNlx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzZuGN6n4ZbIO3CqaF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyuiv4z4S4Omguq6VF4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw9b4HJE6EGBCDZecp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzdt-Fz_BZRqkGQ1LB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz9LxkWlNtSQ4Yvi6N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy1KB6KYkvea8T_qqt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyxFo0UYgFW936V7vV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
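The raw response above is a JSON array of per-comment records, one object per comment ID, with each dimension constrained to a closed set of categories. A minimal sketch of how such a batch response could be parsed and validated — assuming the allowed values are exactly the ones visible on this page (e.g. `developer`, `company`, `government`, `ai_itself`, `unclear` for responsibility); the function name and value sets are illustrative, not the tool's actual API:

```python
import json

# Allowed category values per dimension, inferred from the responses shown
# on this page (assumption: this is the full codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Hypothetical one-record response in the same shape as the array above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

Validating against a fixed codebook like this catches the common failure mode of the model inventing a label (e.g. `"anger"` instead of `"outrage"`) before the record reaches the coding-result table.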