Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> By my calculation, the moment we make AGI we go extinct.
> And there is no chance to stop: game theory means any frontier actor faces a choice of gambling between world domination and human extinction, and the world-domination race will be too enticing to discard even at up to an 80% chance of extinction. Only if the chance of benign AI is lower than 20% would we have a slim chance of coordinating a stop, since that would hit the biological threshold of gambling aversion; otherwise some actor will continue to the final consequence, extinction.

Source: youtube · Topic: AI Moral Status · Posted: 2025-12-28T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxnPCumyxMKK3717b14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaDvAX7AdH6u3IcaB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxxWTCQBKonLdaCkit4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXq2SejcE0YW9gOMR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYpc5lxwLzREOnoFV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyHRmya9YFRj9hM3WV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwNqOeHWmPWJprdE_B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJqb9bi9dvALKdQjl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwaO307HSeTYj5B0PV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzOlKg3eC76vmMR3Q54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
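A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, not the tool's actual implementation; the `CODEBOOK` values are inferred from the responses shown here, and the full codebook may define additional categories. Rows with out-of-vocabulary values are dropped rather than coerced.

```python
import json

# Allowed values per coding dimension, inferred from the raw responses
# shown above (assumption: the real codebook may include more categories).
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding rows) into
    {comment_id: coding}, keeping only rows whose values all fall
    inside the codebook."""
    valid = {}
    for row in json.loads(raw):
        coding = {dim: row.get(dim) for dim in CODEBOOK}
        if all(coding[dim] in CODEBOOK[dim] for dim in CODEBOOK):
            valid[row["id"]] = coding
    return valid

# Hypothetical example row, in the same shape as the response above.
raw = ('[{"id":"ytc_example123","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(parse_codings(raw))
```

Validating against a fixed codebook catches the common failure mode where the model invents a near-miss label (e.g. `"anger"` instead of `"outrage"`) that would otherwise pollute downstream counts.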