Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If a handful of companies are going to lead AI and rule the world, that's fine. And when you say that people will lose their jobs to AI, that's understandable. But then who is going to buy products or services from these companies when we don't have jobs? When consumer spending shrinks, or at some point becomes negligible, these large companies are going to lose all the people who use their products or services.
The only end state is that a large chunk of people become slaves or starve to death. But factually, these large companies aren't going to make any money if there is no one around to buy or use their services.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-07-29T02:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgySDD94WgTRCLyakN14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxkR3kTHdI4LsChX9B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy1o6lamKzTB2M4lsN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwY_lhQpLz_fUas9VJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxsRYzyLRdTw3XyLBR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwfQT58gbuqjqSor6Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyfJ0HPszl5e_mMI8R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwBDyfSwqrGJzrw1SV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugx3mpfI-VZPWt5BxGB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqG46PwMdqB1v0nhJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
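A batch response like the one above can be parsed and indexed by comment ID to recover any single coding result. Below is a minimal sketch: the `ALLOWED` sets are inferred only from the values visible on this page (the full codebook may include other labels), and `parse_response` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension — inferred from the responses on
# this page (assumption: the real codebook may define more labels).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "fear", "unclear", "mixed", "outrage"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment ID,
    skipping any row with an out-of-codebook value."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[row["id"]] = row
    return coded

# One row from the batch above, used as sample input.
raw = '''[
  {"id":"ytc_UgxkR3kTHdI4LsChX9B4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''
coded = parse_response(raw)
print(coded["ytc_UgxkR3kTHdI4LsChX9B4AaABAg"]["policy"])  # → regulate
```

Validating against the codebook before indexing catches the most common failure mode of batch coding: the model emitting a label outside the allowed set.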