Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "So wait… if ai takes over the world, we would be ruled by Robot Nazis?…" (ytc_Ugwk8O2CC…)
- "I feel like it's important to mention that this analogy tackles not the problem …" (ytr_UgwlPLNsf…)
- "AI will happen. If we try to ban it, only the compliant will stop. The ones who …" (ytc_UgzFJqvAH…)
- "The one who are making this ai generator are the biggest company like openai, go…" (ytc_UgwZ7qC5Q…)
- "Dude just imagine getting your ass beat by a robot. One dressed like a bitch…" (ytc_UgyXQkqtr…)
- "It's the AI copyright infringement and outright plagerism that is the problem. A…" (ytc_UgxZETqjf…)
- "This is exactly the same i think when i talk to chatgpt and use words carefully …" (ytc_UgyZs_IAy…)
- "Robots are big computers they only can do what they are programmed to do. Sophi…" (ytc_Ugz_q1lXg…)
Comment
It just further clicked for me why Elon wants to establish a Mars colony. If there's such a species-ending crisis with AI launching a series of 'attacks' on humanity, and our number dwindle fast, in a matter of weeks falling to under a billion, then under a million, etc. then the only way of surviving is being literally off world ... for a time. That must be Elon's real motive, human survival. OK, I'm starting to see how dire the AI threat could be.
youtube · AI Governance · 2026-04-16T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlZ3jwYjiWRTxHcYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxgjqfoipLlKOdcjWF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxySsOOm9GeARlnWmJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzRW87vofgwxye7pbF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyPl-kPcX7YmVGj7K14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxD4C73oSlR-1NkQHV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxK8OBb34KxjMHuWGp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxiBlQBuaydq1HEPpZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfEUfhuE4I8XztJwp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw8KRY7kipqHH7--T14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
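A response like the one above can be checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are only inferred from the codes that appear in this sample (the real codebook may define more), and `validate_batch` is an illustrative name, not part of the tool.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# This is an assumption: the actual codebook may contain additional values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate", "industry_self"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; keep only rows with an id and
    in-schema values for every coding dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be stored
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Two hypothetical rows: the second uses an out-of-schema responsibility code.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"martians","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_batch(raw)))  # 1: the second row is dropped
```

Rejecting out-of-schema rows at parse time keeps hallucinated codes from silently entering the coded dataset.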