Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The part about lethal autonomous weapons hit hard. Been building AI agents latel…" (ytc_Ugx581l1B…)
- "Imagine paying off a large chunk of tuition fees and at the end it’s the AI that…" (ytc_Ugzoj9CqE…)
- "@user-ib6hd5kx1xname Thanks for the comment! It's me, your friendly neighborhood…" (ytr_Ugzg7AFVm…)
- "I assume Jens was being slightly hyperbolic because over a million deaths each o…" (ytr_UgzU4fmva…)
- "I never thought I could dislike an AI like I dislike William. Maybe it’s the Bri…" (ytc_UgyX0tiZN…)
- "As soon as I hear the calculator argument, instantly skip the vid. Dude , AI isn…" (ytc_UgzVA-J9t…)
- "This Intelligence game have gone to a very different level. Now super intelligen…" (ytc_Ugye0DE3v…)
- "I have come to truly hate the over lighting of everything found in AI art. You c…" (ytc_UgwgOjjKa…)
Comment

> Utopia can't exist because Utopia fails to understand how humans work. We need a reason to keep going, purpose, and a reason to reproduce. If we lose purpose by being replaced with AI society will collapse before AI reaches a point to be able to work society by itself.
> Let's say if that happens AI manages to keep society going by itself it will remove humans since it will be objectively more productive, and better then them. They will only be a waste of resources once the AI can keep society up-float by itself.

Source: youtube · Posted: 2025-01-08T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxnM7-N4hS6FIIWu2V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwi51RbH3XQQ9c5kHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzngzi3XVfb7UUTfGB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwdhdl2QE8-OA119Ct4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmhpxPbF20Dwb4trF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxtDJOWRLz-N-qtXRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwsynSapQVAowptjeR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzUbFajL1jLmyXrKUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwrvnB1yTqenRtSx5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7LR7EdoDy_WXd3St4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
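The raw response is a JSON array of per-comment codes, one object per comment ID, with the four dimensions shown in the table above. A minimal sketch of how such a response might be parsed into a lookup-by-ID structure and validated — the vocabulary sets below are inferred only from the values visible in this dump, not a complete codebook:

```python
import json

# Allowed values per dimension, inferred from this page's data (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}},
    raising on any value outside the expected vocabulary."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

# Example: the first record from the response above.
raw = ('[{"id":"ytc_UgxnM7-N4hS6FIIWu2V4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"outrage"}]')
coded = parse_coding_response(raw)
```

Keying the result by comment ID mirrors the "look up by comment ID" workflow: any coded comment's model output can be retrieved directly from `coded`.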