Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Obsessed with efficiency that it is a blinker for human needs?... You know what …" (ytc_UgzzeMXZ5…)
- "So does that mean that when self driving cars are going too fast, Google gets th…" (rdc_czxnkv9)
- "This is the dumbest video I've ever seen. Yes UBI its "an excuse for companies …" (ytc_UgyndKB2S…)
- "i agree. These trucks arent set to be released for sale to companies for at leas…" (ytr_Ugj0GNvXA…)
- "Fr like I hate when the ai bros say “Well this is like photography!!!” Like no? …" (ytr_UgxoDuh5c…)
- "I may not have a complete understanding of what this is, and more than likely th…" (rdc_icifnqo)
- "“B-B-But human art has too many mistakes!!” So does AI art! There are going to b…" (ytc_UgzMR0AYI…)
- "THE ROBOT THAT LOOKED AT THE CAMERA LOOKED LIKE SHE WAS GONNA KILL YOU OMG…" (ytc_UgzyF35BY…)
Comment
For now I’m more worried about human making something we will regret using AI than AI making on his own something we will regret. AI don’t need to do shit, to hurt human as human is already doing a pretty good job at doing that.
All AI we are using now are purely stateless and only have an illusion of reasoning or memory. As long we are using Transformers architecture, human is still the most dangerous creature on this planet with the potential to erase the human race from the surface of the earth.
youtube · AI Governance · 2025-09-09T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwjrtwIaNfUlo_xNlJ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwc338fNtkLV2PgfsV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZKSiKKqtovTV6COF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzLZZFpMyMv43uBto54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxirJ1Ww3V3oY5uRV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxNeEAChN9rMPChuq14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwwDp5Yu8umx_c7tM54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDRG90NIPjdEMaMP94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyMSyepzaAZjtqqm8h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwMCg-aPo0hSI37RKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
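The batch response above is a JSON array with one object per comment, keyed by `id` and the four coding dimensions. A minimal sketch of how such a response could be parsed and indexed by comment ID is below. The label sets in `DIMENSIONS` are assumptions inferred only from the values visible in this response; the real codebook may define additional labels.

```python
import json

# Two rows copied verbatim from the batch response shown above.
raw = '''[
  {"id":"ytc_UgzZKSiKKqtovTV6COF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxNeEAChN9rMPChuq14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Assumed label sets, inferred from values visible in the response;
# the actual codebook may allow more values per dimension.
DIMENSIONS = {
    "responsibility": {"none", "developer", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "resignation", "indifference", "unclear"},
}

def parse_codings(text: str) -> dict:
    """Parse a batch response and index valid rows by comment ID.

    Rows missing an id or carrying a label outside the known sets
    are dropped rather than silently stored.
    """
    rows = {}
    for row in json.loads(text):
        valid = all(row.get(dim) in allowed for dim, allowed in DIMENSIONS.items())
        if valid and "id" in row:
            rows[row["id"]] = row
    return rows

codings = parse_codings(raw)
print(codings["ytc_UgzZKSiKKqtovTV6COF4AaABAg"]["emotion"])  # fear
```

This is how a "look up by comment ID" view like the one above could be backed: parse once, then fetch the coded dimensions for any comment ID in constant time.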