Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "3:48 I figured that AI had a use in some way but couldn’t think of any. After yo…" (ytc_Ugxp5vl7u…)
- "So we have birth rate plummeting but wait a minute AI's gonna replace billions o…" (ytc_UgwuvMeYz…)
- "Great question! Sophia's design, including her humanoid features, aims to make i…" (ytr_UgxOPn7mj…)
- "We really need to stop saying AI, it's AIP (Advanced Information Processing). Wa…" (ytc_Ugy1N7gCQ…)
- "Those who create AI should personally hand out the pink slips to those who loose…" (ytc_UgwprBFUZ…)
- "If AI takes away all of them, every person would live without ever needing to wo…" (ytc_Ugzk33O6n…)
- "I am still going to vote for democrats, however, I do not agree with their stanc…" (rdc_oi3uh5x)
- "Humanity barely looks after its own kind, so how can anyone expect an AI to keep…" (ytr_UgzWYN9-y…)
Comment
Roman Yampolskiy says that 99% of jobs will be done by AI and humanoid robots, which sounds impressive… until you remember we live on a planet with some pretty inflexible physical limits. Are we really going to flood the world with androids without worrying about where the energy will come from, how we’ll cool the data centers, or how to mine the rare materials needed to build their titanium bones? In these futuristic visions, logistics, ecology, and physics seem to take a back seat, but maybe AGI will invent magical energy and infinite batteries too?
And if by some technological miracle we do manage to create an intelligence that surpasses us in everything: from sweeping the floor to designing its own improved version, then it might ask a pretty logical question: why keep the humans around?
youtube · AI Governance · 2025-09-04T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw5b3Hhb1nVyCTnh4t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxstSypKtLzIKE40IV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxTwKnbM8krN4luia14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMxi4MOUKum6vtQJ14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzhLMdV5vV8QNd0vuV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwRkEdAWL4iQ7NTVpJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzp5-Q1tm3c8k-ocBN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwc5NgHOMDvWPICXcB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwAeeELcIBOGJ4c9Kp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyfiNQfYdYaPu4adkV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
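A raw batch like the one above can be parsed and checked before it is written into the coding table. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this sample response (the actual codebook may define more labels), and the function name `parse_coded_batch` is an illustrative choice, not part of the tool.

```python
import json

# Label sets per coding dimension, inferred from the sample response above.
# The real codebook may include additional labels; this is an illustrative subset.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def parse_coded_batch(raw: str):
    """Parse a raw LLM response and split records into valid and rejected.

    Returns (valid_records, error_messages). A record is valid when it has
    an "id" plus every coding dimension, each set to an allowed label.
    """
    records = json.loads(raw)
    valid, errors = [], []
    for rec in records:
        problems = []
        if "id" not in rec:
            problems.append("missing id")
        for dim, labels in ALLOWED.items():
            if rec.get(dim) not in labels:
                problems.append(f"bad {dim}: {rec.get(dim)!r}")
        if problems:
            errors.append(f"{rec.get('id', '?')}: {'; '.join(problems)}")
        else:
            valid.append(rec)
    return valid, errors
```

Rejected records can then be logged or re-queried rather than silently entering the dataset with out-of-vocabulary labels.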