Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Then why do you build it knowing this is the end? That is the question!! We don'…" (ytc_UgxWqsXzL…)
- "People like him who either don't care or actually hate humanity look forward to …" (ytc_Ugy_3Up4x…)
- "its.ok... the next professionals on the line are lawyers... then CEOs, then Ai c…" (ytc_Ugxj4cKDr…)
- "Ai is not a tool drawing software is a tool. Ai is like a pencil that draws for…" (ytc_UgxZmvrGt…)
- "Here's something people don't talk about. The military has the pinnacle of AI t…" (ytc_UgwOKwTR3…)
- "I am a retired programmer, but I love to dabble, and I have been using a very sm…" (ytc_Ugxc-FD-S…)
- "A very useful counterargument to "AI is just a toll": Oh really? Then what about…" (ytc_UgwTyoPi1…)
- "Are you a bot? I agree with you but this phrasing is so bot like…" (ytr_UgytIIAId…)
Comment
This will happen on 2080-2100. The AI in the current form is not AGI compatible. While market adapts these systems, it takes time. Consider an autonomous robot which will cost a car, but can act a household butler, thats a long time. Besides AI is an opportunity for humanity to survive in another form. Humanity may die, but contuniety will live in the knowledge we passed. You can think in this perspectives. But coexistance may make sense. Also ask yourself why you are obsessed with the american mindset of taking jobs. If you can live of 1 hour work/day and take your freetime, I dont see why this is worse than having 2 shifts in a factory.
youtube · AI Governance · 2025-09-06T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugwq48GI2VbraeTYuF14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxRtHtP4PvrOhHK9OF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzOEULMy5qudvzKPTV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyatNVvMPH3Pn8h4LB4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzlEMds4ZwpvTAMIpJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxk0BAF_7Dj30pQ8u94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3GEPpImDzNhaGd3l4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzdvXpfKpkHstAompd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwoGyfDVN4VI-K6jZh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwSn6QVemvgf7unGXZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
```
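The raw response above is a JSON array of per-comment coding records, each keyed by a comment ID and carrying the four coding dimensions. A minimal sketch of the lookup-by-comment-ID step (the field names come from the response above; the function name and the two excerpted records used here are illustrative, not part of the tool):

```python
import json

# Raw model output: a JSON array of per-comment codes (two records
# excerpted from the response above for illustration).
raw_response = '''
[
  {"id": "ytc_Ugwq48GI2VbraeTYuF14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxk0BAF_7Dj30pQ8u94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
# Look up the coding dimensions for one comment ID.
print(codes["ytc_Ugxk0BAF_7Dj30pQ8u94AaABAg"]["emotion"])  # approval
```

This matches the displayed record: the comment coded at 2026-04-26 shows reasoning "consequentialist" and emotion "approval", which is the sixth entry in the raw array.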