Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "One thing not talked about is that AI runs on electricity. What happens when the…" (ytc_UgzOhiqZh…)
- "What some Ai-lovers (not all thx god) who love to spit on worried artists online…" (ytc_Ugy8Al5EH…)
- "Even then, \"a bad worker blames the tools\" or something like that. If anyone ca…" (ytr_UgyZKdHOo…)
- "I will never take a car without a driver. I think this is pushing it. Also it's …" (ytc_Ugztc7Ikr…)
- "Well hopefully if AI is smarter than all of us, and decides it doesn't need us a…" (ytc_Ugyypf6tu…)
- "The people who are screwed are the people producing this podcast, because the pe…" (ytc_Ugxcl0oJ7…)
- "Given that AI just follows algorithms, if no developer programmed it to self pro…" (ytc_UgwW-SFU5…)
- "This is exactly what ai should be used for instead of brainless youtube videos, …" (ytc_Ugy3EgPej…)
Comment
Yes, AI + robotics will eliminate any useful purpose we have. An authoritarian version of this world would be the perfect setting for the anti-christ. People not serving any useful purpose + dependency on the state for sustenance is a recipe for authoritarianism.
youtube · AI Governance · 2023-11-03T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyswzigu8sEkt4dCB54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyVQdig1SAvUWov8b54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDEwSyrtQROHqMTVd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz2rnS5_nx2DVOG5Vh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmPa7xp6mW9Qe35Ot4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzeiUgJuhs0VOxi8yh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzgoGsuh_LPfipT0Xp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwci-9RF93aTXynL0J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwoz_PBgkGSM-C1baV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwVm2hnJoey_pRjPul4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
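A minimal sketch of how a raw response like the one above might be parsed and validated downstream before it reaches the coding-result view. The label sets here are assumptions drawn only from the values visible in this batch (the project's full code book may differ), and `parse_coding` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Assumed label sets, inferred from the values observed in this batch.
# Substitute the project's actual code book before relying on this.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "government", "unclear"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear", "unclear"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    index the records by comment ID, rejecting any record with a
    missing or out-of-vocabulary value on any dimension."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Looking up a single comment's coding is then a plain dictionary access, e.g. `parse_coding(raw)["ytc_UgwmPa7xp6mW9Qe35Ot4AaABAg"]["policy"]` for the record shown in the Coding Result table.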