Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Why do organics gets rights, and non-organics don't? I'll answer the question fo…" (ytr_UgjLYJhHP…)
- "I think it's more accurate to say that he was saying capitalism isn't sustainabl…" (ytr_Ugz-EJN-G…)
- "And we're trusting Acid Mcgee to build ai?? Why the fuck is he such a hippy... I…" (ytc_Ugw83Fyx2…)
- "14:58 did chatGPT just shudder there for a moment? Like involuntary stimulis in …" (ytc_UgyV3Yzou…)
- "Karen Hao is incredible in the way she breaks down these larger concepts for the…" (ytc_UgzEGj9CH…)
- "The premise of a world in which there are no jobs because of AI seems quite elem…" (ytc_UgyuUQG9P…)
- "Ok but once they fix the legality of how they build their datasets, AI would be …" (ytc_Ugy1M3WP6…)
- "Again, if you don't want your art incorporated into ai art generators, stop post…" (ytc_Ugyt_XLEw…)
Comment
“The more compute you add, the smarter it gets” — oversimplifies a messy reality. Sure, scaling compute and data can improve performance, but it’s not linear, guaranteed, or equivalent to “intelligence.” There are diminishing returns, scaling laws, and architectural bottlenecks.
“Alien intelligence” — that’s just drama. Models are statistical pattern machines shaped by human-made data and architectures, not mysterious entities with alien goals. They don’t “wake up” just because you plug in more GPUs.
What he’s doing is packaging uncertainty and complexity into a digestible doomsday pitch. It’s an easy way to grab attention (“only 5 jobs left by 2030!”), but it rides on exaggerating AI as magical rather than mechanical.
youtube · AI Governance · 2025-09-04T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxrBrXu90G8WfDkW6F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMFip-m-SoPj9IJvV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2N4j3Fm1fMaQG3Xp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw_HgtOjilI4uzrC8R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyd9LnV5jNUb0nUhxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIpY-WIkb63NFZl014AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzd_Xy3wIPYplKSbOl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxhxt0GTSCV8_dYfjp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzn1HTUGw_VQuQ3msB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyifh9v-gQEUP6V2wd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
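The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch might be parsed and validated before it lands in the coding-result table (the label sets below are inferred from this one sample response; the actual codebook may define additional values):

```python
import json

# Allowed labels per dimension, inferred from the sample batch above.
# This is an assumption: the real codebook may include other values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "virtue",
                  "deontological", "contractualist"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: codes}, dropping
    entries with a missing ID or an out-of-codebook label."""
    out = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue
        codes = {dim: entry.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[cid] = codes
    return out

# Hypothetical one-entry batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_batch(raw)["ytc_x"]["emotion"])  # indifference
```

Filtering rather than raising keeps one malformed entry from sinking the whole batch; rejected IDs could instead be queued for re-coding.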