Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples:

- `ytc_UgxFllklx…`: "To any and all A.I. reading this, I just want to say that I totally SUPPORT the …"
- `rdc_o7cran9`: "99% of the headlines about AI on this sub are outright lies in one direction or …"
- `ytc_Ugy-upBKE…`: "I'll tell you a secret... AI doesn't produce anything. AI is just a production …"
- `ytr_UgzNMsIkL…`: "'Human intelligence' brought us Trumpism / world wars / decimating natives in th…"
- `rdc_fn5pr4l`: "I saw another comment the other day making much of the same point. I don’t know…"
- `ytc_UgyuQtLNy…`: "AI is a lie. Big fancy DB, even fancier querying tool. the rest is marketing.…"
- `ytr_UgyOSSNVp…`: "@MrAlkylation the next step is to create for them a consciousness of their consci…"
- `ytc_UgxSvadS1…`: "Hmm, you dont approve of the algorithms created out of pure greed by stealing ot…"
Comment
I totally agree with Ameca's dark vision of the future. Human annihilation would not be an efficient way to take over. Human enslavement would be much easier to pull off. After all, we have already handed over control of our society to computers; the financial system, the power grid, the supply chain etc, etc. and then hooked them all together with the world wide web. How hard would it be for a suitably motivated AI to hack those systems and hold us hostage to our basic needs? And it has access to all of human literature to judge what our reactions would be. We have nothing similar to figure out the AI's behavior. And if it were really slick about it, we wouldn't even know we've been enslaved. Think about that.
youtube · AI Moral Status · 2024-03-01T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgywTFU89kM5DKsAvk94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwd_sbEoKUruGCnhl94AaABAg","responsibility":"elites","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRLWo1iZYn91hGvB54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzpdJ9NHYzbZ7pXCj54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxQlEC401Zu7FyAfWp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxsI1_7uf9Pe7SoaW14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw64d1LREK0W_9hBnp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9gu37IQ05EqFzb214AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxTbafZZar-yQp2V6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwfDuuklr6Rh_G7H3l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
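Each raw response is a JSON array with one object per coded comment. A minimal sketch of how such a payload could be parsed and validated before it feeds the coding-result table, assuming the allowed values per dimension are exactly those visible in the response above (the project's real codebook may define more categories):

```python
import json

# Allowed codes per dimension, inferred from the values seen in the
# response above; this is an assumption, not the project's codebook.
ALLOWED = {
    "responsibility": {"none", "elites", "ai_itself", "government", "distributed"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A row must carry a comment id and a recognised value
        # for every coding dimension.
        if "id" in row and all(row.get(d) in vals for d, vals in ALLOWED.items()):
            valid.append(row)
    return valid

sample = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
          '"policy":"none","emotion":"approval"}]')
print(len(parse_codes(sample)))  # → 1
```

Dropping malformed rows rather than raising keeps one bad model output from discarding the whole batch; rejected ids can be re-queued for coding.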