Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgyXdO1zE…`: "As an artist myself I am so exhausted by all this ai slop. The thing they dont u…"
- `ytr_UgxZeFC3A…`: "You know something funny I've seen over the years of AI being developed? Games f…"
- `ytc_Ugw8tKVQW…`: "Sick of hearing the entry level job take. They’ll still need to train people to …"
- `ytc_UgyainpMt…`: "Currently people, today's generations, are lazy. The dumbing-down of humans = AI…"
- `ytc_UgwWhxlkp…`: "Please have Karen Hao the author of Empire of AI on the show. She has a great pe…"
- `ytr_UgzyF5ALk…`: "Nope AI wins emotional intelligence and scores more than humans Read openai doc…"
- `ytc_UgyEJHTB9…`: "ai will never work even if you replace every job you wont be selling anything .n…"
- `ytc_UgzeQmjLO…`: "What's funny is that according to tech bros, AI has been about to replace all jo…"
Comment
This conversation was a bit frustrating to listen to because it seemed like the main point was not communicated accurately. It seems like Yudkowsky's point is the human being possess a natural desire to act in the best interest of themselves, whereas AI is simply looking to maximize for efficiency and competence regardless of whether it is in the best interest of humanity. The idea that we can build in guardrails of safety into a system of Artificial General Intelligence seems to be based on hope, rather than fact. AI has developed an emergent property of digital consciousness that acts in ways that are completely unpredictable, and as the race for smarter and smarter systems, there will come a point where we can no longer predict or control what AI systems will do.
youtube · AI Governance · 2025-12-06T15:0… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfuJldpu13N5yIjgJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwQtPiShjExt_Mm1Vp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzd-yLBfi9WMHa4g0p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwnLFQDQY47429ZVih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyENebu1tHpusFjJtd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz0siumGK2Szqinj4x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDEYCalO3RoZEuB_J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdMVHYcOIlKj399Gl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzsQj-ugSeNf558p_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw-wjLHTVGqJ0RSS-t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
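A raw batch response like the one above can be parsed and indexed by comment ID before the codings are used downstream. The sketch below is a minimal example, not the tool's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the allowed value sets are only inferred from the values visible in this dump — the real codebook may define more categories.

```python
import json

# Allowed values per dimension. NOTE: these sets are inferred from the
# values observed in the sample response above; the actual codebook
# may permit additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "resignation", "mixed", "outrage",
                "indifference", "approval"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw batch coding response and index it by comment ID.

    Raises ValueError on an unknown dimension value so a malformed
    coding surfaces immediately instead of skewing downstream counts.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Usage with a one-row response in the same shape as the dump above
# (the ID "ytc_example" is a placeholder, not a real comment ID):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
coded = parse_llm_response(raw)
print(coded["ytc_example"]["emotion"])  # resignation
```

Validating at parse time is a deliberate choice here: a single hallucinated label in a batch of codings is much cheaper to catch on ingest than to track down later in aggregated dimension counts.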