Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_UgxlZoyPk…: "Ai "art" is like ordering a burger, but the chef steals all the ingredients from…"
- ytc_UgyDjXoF7…: "I'm very much in the middle and am not entirely sure what's wrong with the "AI i…"
- ytr_UgyZkyPKa…: "I am 100% certain that it is entirely edited. Most of ChatGPTs responses were in…"
- ytr_Ugzrly6KA…: "AI communities are already aware of poisoned datasets and software like Nightsha…"
- ytc_UgyWW_eXo…: "My english teacher in school once said something similar. She said that paragrap…"
- ytc_Ugy-KSRXL…: "Obviously, the man is staged to make the robots look intelligent in comparison. …"
- ytr_UgxRLOXYk…: "@AxurNuvae oh god- I'm sorry then I thought you were talking about the title ab…"
- ytc_UgzZPYT09…: "I'm sure the most common intelligent life in the universe is artificial. Every o…"
Comment
Amazon (as part of Amazon Web Services) develops facial recognition software and gives it to police departments. AWS has full knowledge of their technology's faults but tells police departments to experiment with the software so improvements can be made in the future. Police, however, are improperly trained to deal with faulty software, improperly trained on dealing with people, already have bias, and - for lack of a better word - trigger happy. Amazon (AWS) has now added another strain on policing, police officers, and their potential victims.
youtube · AI Harm Incident · 2021-07-08T10:0… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyqaHWC_OhpA95P4vJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwuABhwv3PXD1EGKJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5MKco1AR0w2E0LwB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwo5SKShzU7ktUZbj54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzrf2Q6Bj4ORtuEcRp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzsrli7sCiJzUcjTgN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwY8KcKJHK9wtv1Yqd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzMh2JqOrkfp_rezpx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxoScWst3B-lXQ9sv54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytSszXNmw26UX6uqJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
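A response in this shape can be consumed programmatically. Below is a minimal sketch of how one might parse and validate such a raw LLM response before storing the codes; the allowed dimension values are only those observed in this section (the real coding scheme may include others), and `parse_response` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# section; the actual scheme may define additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "indifference", "mixed"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments)
    and reject records with missing ids or out-of-scheme values."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

# Example: one record taken verbatim from the response above.
raw = ('[{"id":"ytc_Ugzrf2Q6Bj4ORtuEcRp4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]')
coded = parse_response(raw)
print(coded[0]["policy"])  # → regulate
```

Validating against a fixed value set catches the most common LLM coding failure mode, where the model invents a label outside the scheme, before it silently enters the dataset.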