## Raw LLM Responses
Inspect the exact model output for any coded comment: look a coding up directly by its comment ID, or click one of the random samples below to inspect it.

Random samples — click to inspect:
- "AI is a solid product to help justify cuts and boost stock value while people ge…" (`ytc_Ugwxp_Tfg…`)
- "It doesn't make sense why they're using facial recognition, for a petty misdemea…" (`ytc_UgzieXAT-…`)
- "AI might look awesome today but will rapidly lose its sheen because of making it…" (`ytc_Ugwf-8gUB…`)
- "The entire concept of Art being a interpretation of your own mind is the reason …" (`ytc_UgwaZibGM…`)
- "Regulations but zero investments in AI... Just to ensure a future of complete de…" (`ytc_UgzpsmYBb…`)
- "\"The further you go, the less you know.\" When contemplating the implications of …" (`ytc_Ugz9cwoNS…`)
- "We have a policy against creating sentient A.I. sooo. Thats just the formal c…" (`ytc_UgxsnCP1j…`)
- "I see you 👀 trust CEOs that are selling you AI that it is capable of replacing j…" (`ytc_UgyQXwxSQ…`)
### Comment

> Well congratulations! AI is already being used by Israelis to kill Gazans in an automated fashion. That's why we see so many collateral damage, because for AI everyone on this planet is a human shield. AI has a target, and civilians are just annoying noise on the image that is treated as a shield, so it decides to penetrate the shield in order to get to the target. All people are goyims. No empathy.

Platform: youtube · Topic: AI Governance · 2025-07-13T07:2…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response
```json
[
  {"id": "ytc_UgwHe7ctuH9a2XohJh14AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwOZrJpWk8IHJ3Qzn54AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzJxsiCa2qhaGwSL3B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzNQChLKZ4EWAOlji94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxYGPjNvkmXjhdt_qF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw4opzYdNUOKXzQaIh4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzSDMTgjWMqavXcJ7R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy1oWoVM24iVVn6kGJ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-L7ubsFRUkkEE54x4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxCNlhm6t8T6VnkZnF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
```
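The "look up by comment ID" step described at the top of this panel can be sketched as a small parser over a raw response like the one above. This is a minimal illustration, not the tool's actual implementation: `lookup_coding` is a hypothetical helper name, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken directly from the JSON shown above.

```python
import json

# A raw LLM response is a JSON array of per-comment codings,
# mirroring the "Raw LLM Response" block shown above.
raw = '''[
  {"id": "ytc_UgzJxsiCa2qhaGwSL3B4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzNQChLKZ4EWAOlji94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def lookup_coding(raw_response, comment_id):
    """Return the coding dict for one comment ID, or None if absent."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

coding = lookup_coding(raw, "ytc_UgzJxsiCa2qhaGwSL3B4AaABAg")
print(coding["policy"])  # -> regulate
```

In a real pipeline the same lookup would run against the stored response for whichever batch contained the comment; here the response is inlined only to keep the sketch self-contained.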