Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
You just forgot to mention that not only people are using AI to help them to get…
ytc_UgxeJ40G8…
@GhastlyCretin
I don’t know if they always do their job right. AI is very new …
ytr_Ugy5PX2wW…
Wait. Why don't we just act with superintelligent AI as we are with 200+ IQ huma…
ytc_UgwcgEHAq…
I think history has already given a verdict: whenever a country makes laws restr…
rdc_gnazfnv
what i always say ab AI is that when they try doing 3d stuff it just looks way t…
ytc_UgyQ3bA93…
Were going to make a bunch of ai trucks follow one human driver in a set road pa…
ytc_Ugy6DD9eJ…
And why is this a good thing? The guy has 140 000 likes on his ‘artwork’ and all…
ytc_UgyurbWf5…
My God, it's AI grow up it doesn't matter. I like art, but jeez, just let the gu…
ytc_Ugxutsi1E…
Comment
The goal of this short film project is to STOP smart drones developing, or it could end like this horror.
https://www.imdb.com/title/tt7659054/?ref_=ttpl_pl_tt
"In response to growing concerns about autonomous weapons, a coalition of AI researchers and advocacy organizations commissioned this disturbing dystopian film. It depicts a disturbing future in which lethal autonomous weapons have become cheap and ubiquitous. This video was launched in Geneva, where AI researcher Stuart Russell presented it at an event at the United Nations Convention on Conventional Weapons hosted by the Campaign to Stop Killer Robots." - IMDb
youtube
AI Harm Incident
2018-12-19T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwxt0ZYo_8UPxhmMeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwzrvj0NqodLMa21Ft4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx75cHjhyOWxs_QjaJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy7kqa4H6EtheQE54p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyeymu0Ei4-PrBjm5h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMdf4OLHL9QnNF-w94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaLE2xshherIxxKNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2pkXyhE-U4r1TRct4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyTu7Z0jdApHPa2U7R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugzqi_jhUKJQ-9pRT1t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
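A batch like the one above can be validated mechanically before it is stored. The following is a minimal sketch, assuming the four coding dimensions and the value vocabulary visible on this page; the real codebook may define additional categories, and `validate_batch` / `ALLOWED` are hypothetical names, not part of the pipeline:

```python
import json

# Allowed values per dimension, inferred from the examples on this page
# (assumption: the actual codebook may permit more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}: {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting unknown codes at ingest time (rather than at analysis time) keeps a single malformed LLM response from silently skewing the dimension counts downstream.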