Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_Ugz54ZzV7…`: This AI crap is the worst invention of the 21st century by a long shot. I hate e…
- `ytc_UgxA8y50f…`: AI's effect on entry-level jobs is scary, but AICarma's been a win for ensuring …
- `ytc_UgwBv55r-…`: Jeez reminds me that similar things happened to me with both OCS I done hs both …
- `ytr_UgwvQFgHn…`: @tomas.bednar It will never "wake up", these algorithms are fundamentally differ…
- `ytc_UgxwwditO…`: Honestly the industry most threatened by AI is actors and singers. Especially s…
- `ytc_UgzDxuzEm…`: Do we know for a fact it's AI? I know a lot of people are quick to yell "it's AI…
- `ytc_Ugx7Sim-D…`: THEY WILL KILL US ALL... Most of us see the terrifying killer robot movies and…
- `ytr_UgjtGdcli…`: It's more or less just a religious saying. The iris is the deity of the robot's …
Comment

> If a being smarter than the most intelligent human, or even more intelligent than all humans combined, wanted to do this (directly or indirectly destroy humans), there would be no way to stop it. It's impossible for governments to oversee and control AI research and development, even at this stage. Millions of GPUs and AI chips are working on this task, legally or illegally, around the world. Large corporations are in a frantic race, with the first company to develop general AI poised to make trillions of dollars. Given humanity's greed and recklessness, we understand there's no way to prevent this crisis.

youtube · AI Governance · 2025-11-11T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3QMz3HszogANDzFR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzPFY3oVP27dYKhdGF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyeyXY_OZ4K7g_0AQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWsAF3eKpjq34CQ2B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxh0XZAyIyKB2hiiQ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzyNaumB1bWFAsXc-J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzosjqyMQlTpegMjN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyTZsI2whXr4ooW-b14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxgMIoY-_iTDi9BsMR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwYi6Yxdsslf4d14ql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
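A raw LLM response like the one above is only usable if every row carries labels the coding scheme recognizes. The sketch below shows one way to parse and sanity-check such a batch. The allowed label sets here are inferred from the values that appear in this dump, not from the project's actual codebook, and the sample input is hypothetical.

```python
import json

# Label sets inferred from values observed in this dump;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = validate_batch(raw)
print(len(rows))  # → 1
```

Failing closed like this (raising on any unknown label) is usually safer than silently coercing to `unclear`, since a malformed batch is a sign the model drifted from the prompt's output contract.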