Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We assume voters hate any sying in it... but i am afraid it is far beyond that s…
ytr_UgwD_B6lg…
Not gonna lie, I use Waymo pretty regularly and love it so much 😅
For context I…
ytc_Ugy7ew20O…
chatgpt is so stupid is it really that hard to just show a transparent image…
ytc_Ugx_MEvrw…
It’s not the AI, it’s who wrøte it.
It was written specifically to insert “díve…
ytr_UgyB9wmwT…
this ai definitely won't become conscious. it's just a fancy typewriter attached…
ytc_UgwZe-JO4…
After complaining that the big AI huys don't stop, he gets the opportunity to pu…
ytc_UgzkRZK1X…
DeGrass has little understanding of the nature of AI. It's not about AI becomin…
ytc_Ugz2WuX3Z…
Computer science professionals and majors seen this coming! AI is so scattered a…
ytc_UgyUpvOss…
Comment
One thing that isn't discussed enough, is that when AI can fully take over software engineering, it can take over anything. Because it can create/update systems on demand while filling in the gaps generally left to human intelligence. That stage is basically the recursive self-improvement stage. Even if the model is not improving itself, it will be able to expand its capabilities independently. There are already many jobs that only exist due to institutional inertia, but how long will that last without bolstering from the government?
Honestly, I don't see how any job survives this. Even CEOs can be replaced by this technology if the board or the shareholders decide they would rather keep that salary for themselves. The only real hope in the current system is if the stock market becomes an efficient wealth distribution mechanism.
youtube · AI Governance · 2025-12-30T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxePmJwhHuMWggNQS54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwI2PM-569qkSVAOlJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgygRBB64Gf4gvarMEJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyLwV9EUA0XH6jlIId4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz2p9zViB7CimWjGUF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzzUYBmsVtzDRzCrBp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlS2mnhrakR8O-ByN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_Ugz6xB0jWkolE4-jlX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw-PyAjE89rOoELSi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyVPRu5tcOp94-o_PB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
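The batch response above can be consumed programmatically: parse the JSON array, validate each record against the codebook, and index records by comment ID for lookup. A minimal sketch follows; the allowed values per dimension are inferred from the sample output above (assumption: the real codebook may contain additional categories), and the `parse_codings` helper and the sample ID `ytc_X` are hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define more categories than appear here.
CODEBOOK = {
    "responsibility": {"government", "company", "developer", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index validated records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Reject any record whose value falls outside the codebook.
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical one-record response, shaped like the array above.
raw = ('[{"id":"ytc_X","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_codings(raw)
print(coded["ytc_X"]["policy"])  # → regulate
```

Indexing by ID supports the "Look up by comment ID" workflow shown above: a coded comment can be fetched in constant time, and validation failures surface malformed model output before it reaches the dashboard.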