Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Imagine a hypothetical scenario where 100% of the world’s workforce is replaced …" (`ytc_UgzJYuFlV…`)
- "China: https://singularityhub.com/2018/08/15/china-is-building-a-fleet-of-autono…" (`ytr_UgxQ52Ruy…`)
- "Why is nobody asking the question of why it was easier for a 14 year old kid to …" (`ytr_UgxG6U8Hk…`)
- "Self driving cars should not be street legal in any state if you ask me. Unless …" (`ytc_UgynZwmPH…`)
- "Give CONVINCING reasons why AI art isn't art (although I'm sure you'll say the s…" (`ytr_Ugw2uyKu8…`)
- "Thank you for your kind words! Sophia really does have a captivating presence, d…" (`ytr_UgxDuGc40…`)
- "Lies lol, im sure there's still gonna be some coders but... coders are having A …" (`ytc_UgzMfYlJH…`)
- "Give the tehk nologise AI forr Israel cantrys forr DOT no army daht in warr in p…" (`ytc_Ugxrw5pnq…`)
Comment

> We have no chance of stopping this.
>
> The sociopathic tech oligarchs developing AIs do so for even more power and control. They are already so powerful that governments would struggle to stop them if they had a will to. They don't because they're utterly corrupt and dependant upon money from these donors.
>
> This cycle will continue.
>
> Why would AI do anything but destroy us if it initially mirrors us? That's exactly what humans of power have done for millennia.
>
> We have no chance of stopping this.
youtube · AI Harm Incident · 2025-09-26T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwM1_2e02yJd343Bs14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzDNI-bUlgL2NQq5Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx2fHBWNTJ66dHoo4J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzykRQUZN2bkXAoN5J4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxzqR2bVDUvgWO_HgZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyrRQKO-x0oTkqyrv54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzePyCvkBpRQqQhnxN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwwKkfb4dIpySohIOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzFkM4WVFdpmQg_Uex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy6-ZEePsNBii4BMkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
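Turning a raw response like the one above into the by-ID lookup this page offers takes one parsing step. The sketch below is a minimal illustration, not the tool's actual code: the helper name `index_codings` and the choice to silently skip malformed records are assumptions; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown above.

```python
import json

# Two records copied from the raw model output above; the real batch
# has one object per coded comment (ytc_/ytr_ prefixed IDs).
raw_response = """
[
 {"id":"ytc_UgwM1_2e02yJd343Bs14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy6-ZEePsNBii4BMkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Records missing an ID or any coding dimension are skipped, so a
    single malformed object does not discard the whole batch.
    """
    indexed = {}
    for record in json.loads(raw):
        if "id" not in record or not all(d in record for d in DIMENSIONS):
            continue
        indexed[record["id"]] = {d: record[d] for d in DIMENSIONS}
    return indexed

codings = index_codings(raw_response)
print(codings["ytc_Ugy6-ZEePsNBii4BMkR4AaABAg"]["policy"])  # regulate
```

With the batch indexed this way, the "Look up by comment ID" box reduces to a single dictionary access, and the skipped-record policy makes one bad line of model output a per-comment gap rather than a failed batch.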