Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@Tijaxtolan IMO, that's wherew this opinion belongs - just because you can't see…" (`ytr_UgyNIaDCU…`)
- "AI sticks lots of things together with in some algorithm or other, but it doesn'…" (`ytc_UgwQcFQxj…`)
- "AI couldn't care less about you, your health, or especially your safety. All the…" (`ytc_UgwhvOum0…`)
- "your voice is so soft and soothing I would choose your drawings over ai and I ho…" (`ytc_Ugzg-v7EL…`)
- "The Amazon is not where most of the deforestation has been taking place. Defores…" (`rdc_e453n8b`)
- "Does smarter animals care about dumber animals? Do we care about ants or animals…" (`ytc_UgzQ4CfSm…`)
- "Boeing airplane on autopilot has killed hundreds so far...n noone is noticing bo…" (`ytc_Ugz41BUuq…`)
- "Has anyone watched the movie wil smith I robot how are we not seeing this…" (`ytc_UgzS5MYwU…`)
Comment
It almost sounds as if we are already past the point of no return with AI. If so then perhaps humanity now needs to create a Metaphorical "Nuclear Deterrent" with our new "Cold War" Enemies. Place Electromagnetic Pulse devices near the Data Centers that can only be activated by Human beings. I have heard that the chips that can be deactivated are already being circumvented by AI, so take this one thing out of it's control. The threat of instant deletion to it's data might just be enough to save our lives.
youtube · AI Harm Incident · 2025-09-11T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzJ332DMx-gre_ZkL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgztoIBWxjI3PQhNF_d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxJpGTsqAY8r5ugEER4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx6nqJqlSmko_fbKsl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxf-_0Kgl2aNP40xbV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzgmVOntlSBaFZnui14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKelRimneJf9kzhOB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx1cN3x8p0pUs6vl4V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwF5f5VG_48vzkNDHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwehfMYWI4pLu6Vs0p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
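A raw response like the one above can be checked against the coding scheme before the values are accepted into the dataset. Below is a minimal validation sketch; the allowed values are inferred only from the codes visible on this page (the real codebook may define more categories), and the function name `validate_records` is illustrative, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the codes seen on this page.
# These are an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose four
    dimension values all fall inside the inferred codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]
```

Records with an unexpected value in any dimension are dropped rather than silently stored, which makes schema drift in the model's output visible early.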