Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I like the AI art thing that you can put images in and it will just make another…" (ytc_UgxisEYZj…)
- "I hear you, but this isn’t a live recording and technically, AI can already do w…" (ytc_Ugw2m1k6y…)
- "Or people can stop doing such boring life sapping jobs and let AI do it & people…" (ytr_UgxjkVy9I…)
- "These people are such jerks talking about the rules we have and the ethical prac…" (ytc_UgzijvHPf…)
- "🎯 Key Takeaways for quick navigation: 00:00 🚀 *Introduction to ChatGPT and how …" (ytc_UgzKc1k3B…)
- "Anybody with half a brain that's spoken with any LLM knows they are dumb as hell…" (ytc_UgwkAjKSG…)
- "Whoa whoa, you give Carlson far too much credit. Carlson is bought and paid for …" (rdc_jy0h564)
- "Frank Herbert: Ai is dangerous because the Elites, both Noblemen and Oligarchs, …" (ytc_Ugymj7Asw…)
Comment
So let's say it happens, AI takes over all jobs except for that 1% for some rich people who will perhaps prefer humans for some types of jobs. What happens to the 99% if they are unnecessary and have no income, they certainly won't watch that AI podcast on some topic because they will have the burden of basic worries about how to survive if they are even allowed to. In addition, if they have no job and no income, I start from the premise that there will be no universal income - some kind of social system because the powerful (capitalists) never liked that, for whom will it be produced and created, can such a system be maintained, because it contradicts itself?
Platform: youtube · Topic: AI Governance · Posted: 2025-10-12T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
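As a rough consistency check, a coded record like the one above can be validated against the value sets that appear in this page's samples. This is a hypothetical sketch: the full codebook is not shown here, so the allowed-value sets below are inferred only from the visible records and may be incomplete.

```python
# Allowed values per dimension, inferred only from the records visible on
# this page (assumption: the real codebook may define additional values).
CODEBOOK = {
    "responsibility": {"distributed", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_record(record):
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record from the Coding Result table above.
record = {"id": "ytc_UgzGgSBkHpATkvXO2Dl4AaABAg",
          "responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "fear"}
print(validate_record(record))  # → []
```

A check like this is useful before trusting a batch of codes, since LLM coders occasionally emit values outside the requested label set.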
Raw LLM Response
[
{"id":"ytc_UgzSECmRrse_ISGABkp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy7vLUFJZjdBs8h8Wp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzGgSBkHpATkvXO2Dl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzBSFAtL69M6Cc_AeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwUfBNgLJUuDb0zSUd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzDT4WjRO-NqE0CqN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwNZ5yopjg1oCls2pl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgybmJ4bqTWlCoLT-IR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwHALSkzFaET7uvr3d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyyuLAyzZZNkcalioR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
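The page's "look up by comment ID" step can be sketched as follows. This assumes the raw response parses cleanly as JSON (real LLM output may first need cleanup, such as stripping markdown fences); only two of the records above are reproduced for illustration.

```python
import json

# Raw LLM response as stored (assumption: a plain JSON string);
# two records copied from the response shown above.
raw_response = '''[
  {"id": "ytc_UgzGgSBkHpATkvXO2Dl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwzDT4WjRO-NqE0CqN4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "indifference"}
]'''

records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}  # index once, then O(1) lookups by ID

coded = by_id.get("ytc_UgzGgSBkHpATkvXO2Dl4AaABAg")
print(coded["policy"])  # → liability
```

Using `dict.get` rather than indexing means an unknown ID returns `None` instead of raising, which matches how an inspection page would show "no record found".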