Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Bombing data centers and killing all AI executives and investors is our moral du…" (ytc_UgylAYoqD…)
- "The problems with AI go beyond government regulation and UBI goes a long way tow…" (ytc_UgwAY6lUJ…)
- "Exactly ... smart people walk away when they're the smartest guy in the room. Wh…" (ytc_UgyXJULcF…)
- "1:19 HOLY SHIT THIS IS WHAT MY CHATGPT SAID “what would happen if we found out w…" (ytc_Ugx72qeGT…)
- "@eeshanm9565 humans are the prey. Not AI. AI will quickly learn humans are defe…" (ytr_Ugz5ZTX3B…)
- "Trust me if the AI models had a shred of liability bad software would never be a…" (ytc_Ugy5ubYii…)
- "So you pulled a lefty lunatic who doesn't think IQ can be measured to talk about…" (ytc_Ugxx1Q6xM…)
- "He was good for the Elon thing, but turned into an asshole for the AI art one…" (ytc_UgwGixjOg…)
Comment

> Damnnnnn... a certainty that AI will simulate human lives in the future. At some point it will simulate humans in the trillions upon trillions. So the chances of living in a simulation are trillions upon trillions. If you dont have existential dread then you do now! It explains why we dont see intelligent life outside of earth. Why bother to simulate that if earth, or the human on earth, is the target of the simulation.

youtube · AI Governance · 2025-09-04T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugymn3lzvwCAMs5QlLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFgWj_2iXnyVnbqFJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyG5xJCoGoSsxaQ_ft4AaABAg","responsibility":"society","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyc9jhn2fmLabyPN494AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzgWKNfZM0alyEcUyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy5hFuUpaeMgVaFm8p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgxdhTHMZ_D4gGC0ZeR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYjzb6N0pqd9UG8vV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxgRQUqBsqIN8t6njV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz82fAD0PPNaZrqj494AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"}
]
```
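The raw response is a JSON array of per-comment codes, one object per comment ID, so inspecting the model output for a given comment reduces to parsing the array and indexing it by `id`. A minimal sketch (assuming the raw response is available as a string; the two rows below are taken from the response above, and the `index_codes` helper is hypothetical, not part of the pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as shown above.
raw_response = """
[
  {"id": "ytc_UgyFgWj_2iXnyVnbqFJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugymn3lzvwCAMs5QlLR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

# Look up the dimensions coded for one comment.
codes = index_codes(raw_response)
print(codes["ytc_UgyFgWj_2iXnyVnbqFJ4AaABAg"]["emotion"])  # fear
```

In practice the same lookup would run over every stored response, so a malformed array (a common LLM failure mode) surfaces immediately as a `json.JSONDecodeError` for that batch rather than as silently missing codes.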