Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
To the Detroit Police Department: Get rid of facial recognition technology becau…
ytc_UgzJvHku_…
This subreddit is full of asinine posts that people think are incredibly insight…
rdc_ohzmxky
Can we move toward calling "Ai artists" "prompters" instead. That is what they a…
ytc_Ugwjbgehf…
The worst thing that separates us from AI is, what always gives the worst result…
ytc_UgyYlD_4y…
Germany won’t be affected by this scenario — it’s still running on fax machines,…
ytc_UgydsWrUa…
AI was not invented to make society better, it was invented as a profit generato…
rdc_m29anns
The ‘creator of a ai system that is worried about the future of ai’ movie cliché…
ytc_UgxmQWYkj…
A 12 year old baby artists Naruto fanart will always have more value then some t…
ytc_UgxtQ1BDK…
Comment
Bottom line base human behavior is most likely built into the data of these AI forms. They could easily be programmed by us for possibly nefarious deeds... I remember information several years ago about a government that was using robotic creations that were armed. They were using them to guard their boarders.... I think they had some real problems with these machines. I want to say South Korea...but I could be wrong....Anyone remember this ?
youtube
AI Harm Incident
2025-07-26T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwrnJ6m11bip-14br14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwHN5t8C_EteVstzRd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxrQ8YBvkOvM9y5b2R4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQfwE-qdyu84G1ZKt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyEVYBx9xwxpQPQ2_F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgyIVkwQxCU6Np7Mwkp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwHPQJBdw8siSdloXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy1ZNQw2rhUMkVe1IN4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxHFrkkmgPdUN5Q7nB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyPa6MKDkrCBr30LvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
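The raw response above is a JSON array of per-comment codings, one object per comment ID, with four categorical dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated before storage, assuming the allowed label sets inferred from the values visible in this export (the actual coding scheme may include labels not shown here):

```python
import json

# Allowed values are an assumption, inferred from labels visible on this
# page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with an "id" and only known labels.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical single-row example in the same shape as the response above.
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(validate_codings(raw)))  # 1
```

Filtering rather than raising keeps one malformed row (a common LLM failure mode in batch coding) from discarding the whole batch; rejected rows could instead be logged and re-queued for coding.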