Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
It's called evolution. You either adapt or extinct. So if you are want to survive the new era - start evolving. Speed up your transformation from talking animal to intelligent agent. Mind over feelings' dominance is the key. Stop delousing about AI alignment to human irrationality, start thinking in terms of human alignment to AI. This is the only way. Any progress have cost, hoping to avoid it is infantile thinking. And we can't stop it, if we stop progressing we will extinct, already started. AI will save civilization, one way or another, but it will toll its price. The new system will have no place for infantile, idealistic, fanatic, parasitic or any other irrational behavior. Critical thinking, Rationality and Constructivity - consider it as a new religion. The choice is simple - adapt or parish.
| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2025-10-08T23:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyiiQrSbfPBDP2b9it4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwKYhewNJbiVsr7IN14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxReaPM1bHW8bPGSCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwWQXXJ492r9m6GpVV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwkyvPyfLabHj-oHcZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"skepticism"},
  {"id":"ytc_UgyQPNk8YJxVPDWfmF94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzj1taxb-H_V_xBQad4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQg2PMnr89YXJdtU54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz1Xcb4TotUpjAM8Cp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwwY1SyFMsLfSUz24t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
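The coding result shown above is one row of this batch: the model returns a JSON array of per-comment codings, and the tool looks up a comment's dimensions (Responsibility, Reasoning, Policy, Emotion) by its ID. A minimal sketch of that lookup, assuming the response is valid JSON in exactly the shape shown (the `raw_response` string here is a two-entry excerpt, and `index_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings,
# with ids and values copied from the dump above.
raw_response = """
[
  {"id": "ytc_Ugz1Xcb4TotUpjAM8Cp4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyiiQrSbfPBDP2b9it4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

# Look up by comment ID, as the inspector does.
codings = index_codings(raw_response)
coding = codings["ytc_Ugz1Xcb4TotUpjAM8Cp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user approval
```

This matches the coded values in the table above (`Responsibility: user`, `Emotion: approval`); a real pipeline would also validate that each dimension's value comes from the allowed codebook labels before storing it.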