Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@niemi5858 Tesla addresses autonomous vehicle liability by not actually having a…" (ytr_Ugz-AoU5m…)
- "last one caught me off guard 💀 were cooked tbh ai is getting better and better 💀…" (ytc_Ugz0O2RrE…)
- "Except that this makes that 100× worse because cops then assume your guilt when …" (ytr_Ugz-z8HKn…)
- "Could it fit in the "comic book" precedent if one was to make an album of fully …" (ytc_UgxLf3zDR…)
- "AI can create Genetic specific bio weapons to target anyone or everyone it wants…" (ytc_Ugw1wD2Qn…)
- "AI would never be able to transfer the soul of an artist into the art it creates…" (ytc_UgwSa-A0C…)
- "Regardless of how we feel about AI and it's impact on humanity I think that it's…" (ytc_Ugyvl7IRb…)
- "Well they see how the Witcher went and decided AI can probably copy the books/ga…" (ytc_UgylZTgbO…)
Comment
How would it be useful, from the perspective of an artificial superintelligence, to do such a thing? We, fragile, limited, and slow biological beings, will eventually appear completely useless to it. Why waste resources sending them to colonize other worlds when the ASI can do it itself much more efficiently with its robot armies?
It is anthropomorphism to imagine that superintelligence will have any desire to preserve biological life. Unless we are able to implant such goals into these systems (which we are absolutely incapable of doing—this is the alignment problem, still insoluble to this day), the future superintelligence will act without concern for what happens to biological life. This is why all these experts are worried about this existential risk.
youtube · AI Governance · 2025-08-02T23:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgyK2hhZrQ2ql3FhJsN4AaABAg.ALKMXhUBO5KALLxnph3VpK","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwzpn-jVtylqsoa5qV4AaABAg.ALKIvnU39j9ALKKw4NFO3y","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwzpn-jVtylqsoa5qV4AaABAg.ALKIvnU39j9ALKNPayW6Ec","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALMryZLsWpN","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALNZUPOcUZ4","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALNt3QGfmEm","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALO2ild6oOl","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy5Hm4fEomuU6ka2bh4AaABAg.ALK6anP6VfoALKERqBobld","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwYprA4Q2XMnF1eMo54AaABAg.ALK60U2KXOnALLFF6RERvn","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxotUzsDNFALOvVsxN4AaABAg.ALK2q2KGiqMARksq02IW7i","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
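The raw response above is a plain JSON array with one coding object per comment, keyed by the comment ID, so "look up by comment ID" reduces to parsing the array and indexing on the `id` field. A minimal Python sketch (the `index_codings` helper is illustrative, not part of the tool; the record is trimmed to the one matching the coding result shown above):

```python
import json

# Raw model output as shown above: a JSON array of coding objects,
# each carrying a comment ID plus the four coded dimensions.
# (Trimmed here to a single record for brevity.)
raw_response = '''[
  {"id": "ytr_UgwYprA4Q2XMnF1eMo54AaABAg.ALK60U2KXOnALLFF6RERvn",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytr_UgwYprA4Q2XMnF1eMo54AaABAg.ALK60U2KXOnALLFF6RERvn"]
print(coding["emotion"])  # resignation
```

The same dictionary lookup backs the coding-result table for the selected comment: the `responsibility`, `reasoning`, `policy`, and `emotion` values displayed are read straight from the matching JSON record.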