Raw LLM Responses
Inspect the exact model output for any coded comment.
Coded comments can be looked up by comment ID. A few random samples are shown below (comment text and IDs are truncated).
- "If the entire white collar class gets replaced by AI, who is gonna pay the plumb…" (ytc_UgysI1xPQ…)
- "AI doesn't seem scarry at all. What scares me are people who will abuse it. Hac…" (ytc_UgwIB0UmP…)
- "Sickening that so many smart people are caught up in the research and developmen…" (ytc_UgwfJlUnm…)
- "my biggest issue with AI is that it's soooooo so boring. There's never a story b…" (ytc_UgzczsJ0F…)
- "I would like to clarify something, StableDiffusion has no less scaling problems …" (ytc_UgwPM67J0…)
- "If computers are thinking, they think at exponentially greater speed than humans…" (ytc_UgwwGucn6…)
- "Why are we calling AI super intelligence AI is still just human intelligence. It…" (ytc_UgyKo5VOj…)
- "AI is rapidly replacing coding and software development jobs, coupled with outso…" (ytc_Ugzj-kf0w…)
Comment
After seeing your other content about pseudoscience I thought of you as a keen skeptic, which is why I'm a little disappointed you didn't examine the alternative case at all in this video. More specifically, you never considered the possibility that these AI CEOs might be lying about their capabilities and how close they are to ASI. They have plently of reasons to do so. They want investments, and the more important they can make their companies seem, the more those investments pour in. ARE we really going to be able to make ASI just by throwing more data at our current systems? It seems you have not considered it at all, since the consequences of ASI being created are existential. But this is just Pascal's Wager in new clothing.
youtube · AI Governance · 2025-08-26T17:2… · ♥ 159
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T19:39:26.816318 |
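Each coded comment carries the same four dimensions plus a timestamp, so the result maps onto a small fixed record. Below is a minimal Python sketch of that record; the class and field names are ours, and the value sets in the comments are only those observed in the samples on this page, not necessarily the full codebook.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, mirroring the table above (illustrative sketch)."""
    comment_id: str      # e.g. "ytc_UgwBvlYr2UcOhyTnDKF4AaABAg"
    responsibility: str  # observed values: developer, company, ai_itself, none
    reasoning: str       # observed values: deontological, consequentialist, mixed
    policy: str          # observed values: unclear, none
    emotion: str         # observed values: mixed, indifference, fear
    coded_at: str        # ISO-8601 timestamp, e.g. "2026-04-26T19:39:26.816318"
```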
Raw LLM Response
[{"id":"ytc_UgwLPbYl3CxGhro1c-d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxcc0ye11wLc5xsw714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVmjCEoB51o4NqDZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwBvlYr2UcOhyTnDKF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyb0g03augQBDBLU-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]