Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgymOkJHQ… — "I hate AI and I don’t need it for nothing never have never will. They to to dest…"
- ytc_UgxayyuwT… — "Thanks for the heads up. I’ll make sure I’m super kind to the AI bots so they ta…"
- ytc_UgycZvteq… — "Zeroes and ones, compute, ai nor electrons flowing thru teenie tiny gates is not…"
- ytr_Ugw6Jzzzj… — "Unless it's tracing, tracing is bad, I think people also hate AI because they in…"
- ytc_UgxyPzDy5… — "How do you think programmers in IT schools feel with Ai able to write entire pro…"
- ytc_UgxfBreLC… — "The AI is overrated, anyone who have professionally worked with any LLM models u…"
- ytc_Ugwr-6jM8… — "I literally just had a dream that there were AI partners you could buy/subscribe…"
- ytc_UgzlZ9WBj… — "I think for most people the photo to ai stuff is fun and a throw away... Like yo…"
Comment

> In reference to your comment at 53 minutes; if there were a one percent chance that building nuclear weapons would result in the destruction of the planet, would you allow them to be built? The answer is: you were not even asked for your input. Governments made that decision in secret and you had zero input in the decision, but you were taxed to pay for it. It will be the same with Super AI. Governments always want more power and control, even if it destroys them and everyone else. Why did the scorpion kill the frog carrying it across the river?

youtube · AI Governance · 2025-10-26T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwwoyGTFC1WMLf4pQd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzvSjnLuqYwC75Bkkd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwlqSdU1vMcRFqdyVd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmOMfCqiMGRXS3k3B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxeLfU_gFX9xJHUI4J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwdU5HhKj_Nygjk1fd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxO2fgt9BQ9VaiSJrZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgycppmLFC0lsYQlMRh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx7r9tJyaC1h2riUAB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxcmCBpDVvr0r389D94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
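A raw batch response like the one above has to be parsed and checked before its codes can be trusted. The sketch below is a minimal, hypothetical validator: the allowed category values are inferred only from the records shown on this page (the actual codebook may include more), and the `validate_batch` helper name is ours, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the responses above.
# ASSUMPTION: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"government", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id", "")
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: {dim}={rec.get(dim)!r} not in codebook")
        coded[cid] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

Indexing by ID this way also supports the "look up by comment ID" view: once validated, a record's dimensions are a dictionary lookup away.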