Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples:

- `rdc_crxm1nr` — "The current self-driving car behavior is \"If I don't know what to do, stop.\" H…"
- `ytc_UgzBVY7GF…` — "If an AI is programmed to say it is not sentient and to even argue when you say …"
- `ytc_UgxPNkxi4…` — "If you believe in God, and that there will be an antichrist, then it would be un…"
- `ytc_Ugh2_dcFz…` — "Read Bill Joy's Wired article \"Why the Future Doesn't Need Us\" this is the guy w…"
- `ytc_Ugz02NbzB…` — "43:16 blockchain decentralized technology. Iotex for automation, tracking and de…"
- `ytc_UgyBJlnwP…` — "I mean idrc, ik that can offend people who do ofc, but if you want to get images…"
- `ytr_UgxSa5H6p…` — "disabilities are also part of being human, art made by someone colorblind seems …"
- `ytr_UgweGjhaD…` — "We are \"trusting\" self driving cars because humans are a greater source of error…"
Comment
I don’t trust the government with everything but I trust that it’s not going to let corporate America take advantage of them and I don’t ever see an America where there is only a rich and a poor class. We as citizens wouldn’t allow it. The public has already stormed the capital once, recent attempts and assassinations. We’re craving justice and civil war now days. It’s 2026 and the general public owns automatic weapons, tanks and tannerite. I’m not saying we can take on our own government but I’m definitely saying we could destroy AI and the geeks behind it if we ever needed to.
youtube · Cross-Cultural · 2026-03-20T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxec3av6iTuxfOdwaV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz5T4krNw7eUs87HSt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyyJjaKMHn49ziTfu94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzSdKuP--kZ1lQlY-h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxr0w8-cuHmXa4P--V4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw54__SrI2okjNPot14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyQo6qY22klM_asYCt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwbV34LiRizsHHJZKt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzyqncV0eEBbMx7L6N4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzJ1JHC0fLrqlTOOLx4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]
```
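The by-ID lookup described above can be sketched as a small function that parses the raw response and returns the coding row for one comment. This is a minimal sketch, assuming the raw LLM response is stored as JSON text in the shape shown above; the `lookup_coding` name and the single-row sample payload are illustrative, not part of the actual tool.

```python
import json

# Illustrative sample: one row in the same shape as the raw response above.
raw_response = """
[
  {"id": "ytc_UgzJ1JHC0fLrqlTOOLx4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding row whose "id" matches comment_id, or None."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = lookup_coding(raw_response, "ytc_UgzJ1JHC0fLrqlTOOLx4AaABAg")
print(row["policy"])  # → regulate
```

Because the model returns one JSON object per comment ID, a lookup is just a scan of the array; a tool indexing many batches would typically build a dict keyed by ID instead.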