Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or click one of the random samples below.
- "Yes technically, but think about it… If ai art never existed, the moderators wou…" (ytr_UgzxsNfH2…)
- "The framing of this video misrepresents the nature of language in human-computer…" (ytc_UgzDO3mNQ…)
- "Mayo Clinic just found a medical breakthrough using AI… AI is going to take all…" (ytc_Ugxblq_wj…)
- "Guys, you need to take a chill pill. AI can't think, feel, or do anything apar…" (ytc_UgxbmHGeo…)
- "Yeah you got no shame. Also, generative AI inbreeding is an issue. The more the…" (ytr_UgxxhNVEg…)
- "Hello! Imagine everyone standing in line, in front of the robot workshop, for fl…" (ytc_UgwFNHlvO…)
- "What ai thinks the last day of earth will look like.. *shows scariest shit ever…" (ytc_UgzAYSFPA…)
- "The point is really there is absolutely NO explanation for what is going to happ…" (ytc_UgwWZmg1k…)
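Behind the ID lookup above, the tool presumably just retrieves the stored model output for the requested comment. Below is a minimal sketch of that retrieval, assuming the raw responses live in a JSON-lines file keyed by comment ID; the file name and field names (`raw_responses.jsonl`, `comment_id`, `raw_response`) are assumptions for illustration, not the tool's actual storage format.

```python
import json
from pathlib import Path


def lookup_raw_response(comment_id: str, store: Path = Path("raw_responses.jsonl")) -> str | None:
    """Return the stored raw LLM output for a comment ID, or None if it is not found."""
    if not store.exists():
        return None
    with store.open(encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("comment_id") == comment_id:
                return record.get("raw_response")
    return None


# Example: the first comment ID that appears in the raw response shown further below.
print(lookup_raw_response("ytc_UgxKJBAO8_Soc1zo25t4AaABAg"))
```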
Comment
The Singularity is INEVITABLE — and HUMAN EXTINCTION becomes almost certainly inevitable once it happens. Exponential compute + global AI infrastructure make superintelligence a compounding force. Build ONE singularity and you unlock an ENDLESS stream of singularities, each faster and more powerful with their own agenda. At that scale, survival becomes a numbers game: you only need ONE whose objectives exclude us — and we’re done. Humans cannot meaningfully control a godlike intelligence they cannot comprehend. Why is no one talking about this?
youtube · AI Governance · 2026-02-19T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
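The table above lists the four coded dimensions recorded for this comment. As a sketch only, one way to carry such a record through a downstream pipeline is a small dataclass; the class and field names here are illustrative assumptions, while the example values are taken from the table and from the raw response below.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CodingResult:
    comment_id: str
    responsibility: str  # e.g. "ai_itself", "company", "government", "distributed"
    reasoning: str       # e.g. "consequentialist", "deontological", "virtue"
    policy: str          # e.g. "regulate", "ban", "none"
    emotion: str         # e.g. "fear", "outrage", "mixed"
    coded_at: datetime


# Values copied from the coding result shown above.
result = CodingResult(
    comment_id="ytc_UgxKJBAO8_Soc1zo25t4AaABAg",
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```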
Raw LLM Response
[
{"id":"ytc_UgxKJBAO8_Soc1zo25t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzegE7zcJo78G8fY0R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzhjf-WG0vK2wCwXPt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXbqNCUjPMnxBr9P14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNhiZYXbdEQ0Fz_ot4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxBHLWGqG7Uek-Z6tB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyI4tjWV8oDqadKkDJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNj9g9HpAVel8VnHp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJU4nFRJjOC9Eo-iJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw9cAQMw1LTegDq3Ld4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
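The raw response is a single JSON array with one object per comment in the batch, each keyed by `id` alongside the four coding dimensions. A minimal sketch of parsing that output and re-keying it by comment ID follows; the variable names are illustrative, and only the JSON shape and the sample values come from the response above.

```python
import json

# A two-entry excerpt of the batch response shown above.
raw_response = """
[
  {"id": "ytc_UgxKJBAO8_Soc1zo25t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzhjf-WG0vK2wCwXPt4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so each coded record can be joined back to its source comment.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

print(codes_by_id["ytc_UgxKJBAO8_Soc1zo25t4AaABAg"]["emotion"])  # fear
```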