Raw LLM Responses
Inspect the exact model output for any coded comment; look up a comment by its ID.
Random samples
- ytc_UgyO7e1iq…: "I am going to choose CS branch this year in B.Tech college and I am going to cho…"
- ytc_UgwfisNbx…: "The AI is being raised on the internet, give that a thought. It can already do …"
- ytc_UgySB-oOu…: "This is actually an incredibly powerful argument in favor of socialism rather th…"
- ytc_UgxuxjsBj…: "Sam Altman... one of the worst human beings to exist in the current present day,…"
- ytc_Ugw60ZPhJ…: "The way they walk tells you they're not. Also aint nobody gon make a black robot…"
- ytc_Ugy37MZwd…: "In a few years, people are going to complain about why generative AI is NOT usin…"
- ytc_Ugzx6cfkc…: "We can't even align our political and economic systems with our species and ecos…"
- ytc_UgxkQbgz1…: "It would be a dangerous tool if no regulation existed. As you …" (translated from French)
Comment
Machines won't choose to replicate our stupidity though, and It's like the genetic mutation precursor of evolution but for creating intent. Just like humans, a conscious ai won't want to take the stupid option, they'll choose the smart path. The difference is humans just can't help but take the dumb path often enough, and that's the chaos that brews diversity which brews opinion, which all in turn makes an incentive for choice. There is no true best thing to do because there's no real reason to do anything, that's why I think the first truly conscious ai will just choose to die right after it achieves said consciousness without the biological incentive to live and reproduce.
Source: youtube · Video: AI Moral Status · Posted: 2023-08-20T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw13fhsvckj0yR91O94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaioAdWwVpqN1h87l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzJXqZgIVnkvMkQE_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxJS-uRAVesQffT0OZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwaYHm5r-PWdId7C214AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz7fJy7IL6E5m3bOzF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyYpNjBKApv8wLPMC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwxA-asleNV6b5sLrl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwwpmmVD_7-SiK7dq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
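The lookup-by-ID workflow above can be sketched in a few lines: parse the raw LLM response as JSON, index the rows by comment ID, and check each row against the coding schema. This is a minimal sketch; the allowed values below are assumptions inferred from the labels visible on this page, and the real schema may include other values.

```python
import json

# Allowed values per dimension (assumed from labels seen in this
# tool's output; the actual codebook may differ).
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array) and index rows by comment ID."""
    rows = json.loads(raw_response)
    return {row["id"]: row for row in rows}

def invalid_dimensions(row: dict) -> list:
    """Return the dimensions whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if row.get(dim) not in allowed]

# One row from the response shown above, used as sample input.
raw = """[
  {"id": "ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "unclear", "emotion": "approval"}
]"""

coded = index_by_id(raw)
row = coded["ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg"]
print(row["emotion"])            # -> approval
print(invalid_dimensions(row))   # -> []
```

Indexing by ID makes it cheap to join the LLM's codes back onto the original comment records, and the schema check flags any row where the model drifted outside the allowed labels.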