Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
52:58 The host describes how if there was a 1% chance that he got in a car that … (ytc_Ugz41kDRG…)
@steamboatwill3.367 I agree with this video that it's use is not to be on every… (ytr_Ugx6S8VqC…)
Ironically I use an AI sharpness filter plugin which makes it a lot easier to te… (ytc_UgzV2MVda…)
The fact that it's literally safer for women to trust a self driving car over a … (ytc_Ugx64LBJg…)
I'm sorry, but the government and civilians are not going to let truck drivers k… (ytc_Ugzu4bDn9…)
AI chatbots have shown abherant behaviours, caused people to kill themselves, wo… (ytc_UgzHab0iv…)
Tip of the iceberg. No truck drivers means no truck stops and no jobs for anyone… (ytc_UgzNiYqzj…)
Its been said that ai will be humanity last invention and these billionaires are… (ytc_Ugwvy0yDb…)
Comment
1950s — Nuclear annihilation (Cold War fears)
1960s — Missile crisis & communist expansion
1970s — Overpopulation & resource collapse
1980s — Nuclear war escalation & environmental threats
1990s — Crime waves & Y2K collapse
2000s — Global terrorism
2010s — Climate collapse & global instability
2020s — Pandemic & artificial intelligence fears
Hmmmmm. A society in fear = a compliment society = a society that generally sees the government as a potential necessity (so nobody will really feel confident in the idea of overthrowing or challenging the entire government. BECAUSE - EVERYONE - SO - BUSY - THINKING - ABOUT- IMPENDING-NONSENSE
youtube
AI Governance
2026-03-30T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNngpzvlmoRsOWrnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwLMkER-lkYjrrp7yF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxNG4cAC3mORElpNj54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwED9KricIAGGO1qDp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugzn4WCcvlKv4jpjab54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwmEyWsObVRtGMM0bx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzlccIz6vK7aFrNf3J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxjdl0NlaaLlMwkfV14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzIVvK1exYgvem-Vpt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6PQjYL3RWe12Qex94AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
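A response like the one above can be consumed programmatically. The sketch below parses a raw coding response and normalizes any value outside the codebook to `unclear`. It is a minimal illustration, not the tool's actual pipeline: the allowed code sets are inferred only from the values visible in this sample (the full codebook may include additional codes), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# This is an assumption -- the real codebook may define more values.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"resignation", "outrage", "indifference", "approval", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    replace any out-of-codebook value with "unclear"."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                rec[dim] = "unclear"  # fall back rather than keep an unknown code
    return records

# Usage with a made-up record id:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded[0]["policy"])  # → liability
```

Falling back to `unclear` mirrors how the coding-result table above displays dimensions it could not resolve, which keeps downstream tallies from silently counting invented categories.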