Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It doesn't make anything with soul to it because it doesn't have connections to …" (ytc_UgxgzYUYr…)
- "Ai is part of the utopia cult's vision of the future where conflict and poverty …" (ytc_UgzGmInEq…)
- "I use AI all day long in my job and while i was terrified at first i no longer w…" (ytc_UgxSkW27F…)
- "AI is the collective mirror and echo of the entire human race, so the idea it ca…" (ytc_UgyP-bm8Y…)
- "Profoundly obvious, like the elephant in the room, this has been 'known' for dec…" (ytc_UgyTTvAmh…)
- "Instead of making the GPUs more energy efficient or AI less dependent on GPUs th…" (ytc_UgwQNANwu…)
- "@DJ-fb9cf This is simply not true, art references have been easier in the last t…" (ytr_UgyG6kj_j…)
- ">It's already at the point that you have to second guess every piece of infor…" (rdc_le7ph37)
Comment
Steven Bartlett, one idea worth exploring in your next interview is what would happen if the U.S. unemployment rate suddenly reached 10–15%?
Certainly, it would alarm the government. But perhaps more importantly, major companies would face significant revenue declines due to reduced consumer purchasing power. I can easily see Netflix being one of the first to feel the impact.
This scenario brings hope that companies themselves might push governments to regulate AI to stabilize the economy. Unless, of course, they take a darker path: proposing to fund part of UBI subsidies in exchange for a green light to keep laying off workers and replacing them with AI or robots. That would be a dangerous bargain, and a mistake if the government ever agreed to it.
youtube · AI Governance · 2025-09-17T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
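The Coding Result panel can be produced mechanically from one coded record. A hedged sketch of such a renderer (`to_table` and its label map are illustrative assumptions, not the tool's implementation); the `Coded at` row would come from the pipeline's own timestamp, so it is omitted here:

```python
def to_table(rec: dict) -> str:
    """Render a coded record as the Dimension | Value markdown table."""
    labels = [("responsibility", "Responsibility"), ("reasoning", "Reasoning"),
              ("policy", "Policy"), ("emotion", "Emotion")]
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {label} | {rec[key]} |" for key, label in labels]
    return "\n".join(rows)
```

Called on the record coded `company / consequentialist / regulate / fear`, this reproduces the table above minus the timestamp row.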
Raw LLM Response
```json
[
{"id":"ytc_UgwGB54AwVzp0bqKxpN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxh3tIpqJrLxdaBx4x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw7MQfpnC17Xd1jnLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQ0R-_5PuyYqNB7CB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy6Z3zU63FrODtOoTR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzWEEHJgI3PGog2cmx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyQolMPvsu-HD6yVtl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlqSgeeaOteBgNBPR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyhJlEu0pI9ED9EA5t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzlW7bjWGG4TIeRrIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
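Before a batch response like the one above is trusted, each record can be checked against the codebook. A minimal validation sketch; the allowed value sets below are inferred from the values visible in this single response, so the real codebook may define more categories:

```python
import json

# Category sets inferred from the sample output above -- an assumption,
# not the project's authoritative codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown category values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records
```

Records that fail validation would typically be queued for a retry or for manual coding rather than dropped.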