Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwoM90Hx…`: 11:34 this Chinese AI alongside the social system is something right out of "Bla…
- `ytc_Ugwni-Y4e…`: ChatGPT had a lot more empathy for my experience with SA when I mentioned it in …
- `ytc_Ugw2VJNOw…`: Ai needs huge amounts of energy. Plus enormous memory banks. As it builds itll n…
- `ytc_UgyHCcXhO…`: They made AI with no memory or conscious. They will be like Hitlers military. T…
- `ytc_Ugzt4oUeH…`: Plagiarism in AI art is a feature not a bug. The goal of AI art is to exploit ac…
- `ytc_UgyDY26fe…`: Worked his adult life making AI ( and ton of cash) and he just figured out that …
- `ytc_UgxOZ88bx…`: Face recognition software is so easy to trick and manipulate to make it seem lik…
- `ytc_Ugyz5X35l…`: If ai was a tool and the creator is who adds soul, that means there’s creative i…
Comment
If AI software is stored in physical data centers why not have a controlled explosive on standby next to the server racks. A human operator could guard a button in an isolated room from any doors, vents, cameras, etc. connected to the AI in any way. If things go south the physical brain of the AI software gets manually deleted not by software or a nearby kill switch but by complete destruction of its hardware. You could argue by the time AI decides to flip we would be so dependent on it that shutting it down would be like removing the electrical grid, however, most of the world is self-dependent. Rolling blackouts are mainly just an inconvenience, we have backup generators, power banks, and solar panels. It's impossible to stop things like AI when anyone can get their hands on it and demand is high, but that doesn't mean we can't build safety nets. When our floor fails it's in our best interest that we have something to break the fall.
youtube · AI Harm Incident · 2025-09-10T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzg3_8a3DFjnZTsE0h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxwoOz1u6NjUoBW7954AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx11KEc-JwBDrIPPyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz5er2ffUHWiWnYK354AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzCnwUSg2IrEoTK9HZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzLcPspYA6TTpRECOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugypro70lNeHMU7G3xJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzcEHIkf1puNkcow1Z4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzWy-V0P_fyp5vtBBJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz1_5wr0el45qkVFBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
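A raw response like the one above is easy to check programmatically before it is written to the coding database. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values visible in this dump and may be incomplete, and the `validate_response` helper is an illustration, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from this dump (may be incomplete).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every row against the schema."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset all carry the ytc_ prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# Usage with a one-row example (hypothetical id):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
rows = validate_response(raw)
print(len(rows))  # 1
```

Rejecting malformed rows at parse time keeps out-of-vocabulary codes (a common LLM failure mode) from silently entering the coded dataset.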