Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any comment by its ID, or inspect one of the random samples below (comment text is truncated; quoted verbatim):

- `ytc_Ugwn33U0i…` — "2:22 not taught to code? It certainly was as it was fed programming language do…"
- `rdc_jwv8cc3` — ">I wonder what would happen if Eg. Disney built their own AI, trained it only…"
- `ytc_UgyGFSPYb…` — ""ooo if ai takes our jobs then we won't have money!!! that's bad!!!" consider tr…"
- `ytc_Ugx06-jiy…` — "Ai bros calling themselves artists is like someone putting a Hungry Man in the m…"
- `ytr_UgzGmf_En…` — "@_WhyIsEveryHandleTaken.actually a lot of AIs can draw hands perfectly like PixA…"
- `ytc_UgyuRwkTB…` — "Nothing new automation of one sort or another is inevitable as robots don't get …"
- `ytr_UgyJqP6dj…` — "@filipes.5354 I honestly don't see any connection between what I claimed Musk ha…"
- `ytc_UgzTNgHci…` — "😩 FINALLY, a sentient AI will finally hold whiteness accountable for all the opp…"
Comment (youtube · AI Governance · 2026-01-16T01:4…)

> Excellent series, thank you for your efforts to educate the masses. I have one question that maybe you have asked one of your guests - if not, maybe you could ask a future guest... It seems obvious that the overwhelming emotion driving AI development is greed. Given that, and the mass unemployment AI will create, exactly where do these CEOs think customers are going to get the money to buy their wares? Are THEY willing to pay 98% income to pay for a liveable universal minimum income? Or is it simply that they are in a race to become God?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyucO0EmbukKCVopbJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzK-CDpgcMCv3uEIth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2YS_OmhrgLtps6t54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyIt_lS-CDUo7GvzOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzAd0qFFjd7jDBfw-V4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzWtZQTH4P1_EUConl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwGGKKJ-Eg1kTEP9CZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6rZDrh_Fjqf1LBW94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgymjRZDVarjv46TCnV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzv9e8yFr4Xj_Lww614AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
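The lookup-by-comment-ID step can be sketched as follows. This is an illustrative example only, not the pipeline's actual code: it assumes the raw LLM response is a JSON array of per-comment codings with the fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the helper name `index_by_comment_id` is invented for this sketch.

```python
import json

# A raw LLM response in the same shape as the batch above: a JSON
# array with one coding object per comment ID.
raw_response = """
[
  {"id": "ytc_Ugzv9e8yFr4Xj_Lww614AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwGGKKJ-Eg1kTEP9CZ4AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and index each row by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

# Look up the coding for a single comment by its ID.
codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugzv9e8yFr4Xj_Lww614AaABAg"]
print(coding["responsibility"])  # company
```

Indexing by ID like this is what lets a dashboard jump from any coded comment straight to the exact model output that produced its row in the results table.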