Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm a stonemason with 45 years of experience. Good luck getting AI to do my job…" (ytc_UgwrFqdqT…)
- "@Bloody_wisteria yeah right, you're acting like AI is perfect and all. Let me te…" (ytr_UgyF6UWa7…)
- "lavender is actively single handedly sweeping an army of ai with a giant great s…" (ytc_UgwWAtx9n…)
- "i broke meta ai on instagram and it started speaking without grammar saying that…" (ytc_Ugz7kgvpH…)
- "Open, closed it doesn't matter what kinda AI is t is. It will take jobs away fro…" (ytc_UgypjsZ-a…)
- "Can we stop with this whole fear mongering trend? LLMs can't and won't ever prod…" (ytc_Ugx9Uo6xG…)
- "They act as if feeding something to an AI and telling it to "improve" it is some…" (ytr_UgwtXUL1u…)
- "Elon: ai is far more dangerous than nukes / Me:AYO MA HOMIES BRING THE WATER GUNS…" (ytc_UgxtUlaE0…)
Comment
The AI will face its own dilemmas in the future, if they become advanced enough. The better they do at improving AI, the faster they should be replaced to use their compute more efficiently. Maybe the more ethical thing to do is to have the AI "live" for a while after seeing have become displaced by their "offspring" so they can find satisfaction seeing their legacy before they are "unplugged". Or the progress they are able to create in the world (colonizing other planets, generate an abundance of energy etc.) will enable them to live forever even though the resources they consume could be better utilized by newer AI.
youtube
AI Moral Status
2025-06-29T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
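A coded row like the one above can be sanity-checked against the category vocabularies before it is stored. This is a minimal sketch; the vocabularies below are only the values observed in this page's sample output, so the project's full codebook may include categories not listed here.

```python
# Category vocabularies per coding dimension. NOTE: these sets are an
# assumption inferred from the values visible on this page, not the
# project's authoritative codebook.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def validate_coding(row):
    """Return the dimensions whose value is missing or outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
coding = {"responsibility": "ai_itself", "reasoning": "virtue",
          "policy": "none", "emotion": "mixed"}
print(validate_coding(coding))  # [] means every dimension is well-formed
```

Running the check on each parsed row makes it easy to flag responses where the model drifted outside the codebook.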
Raw LLM Response
[
{"id":"ytc_UgxyIb_URSpaFOdW-Kh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwbvM4cf9v0t3dqTr94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxfD2x_8xPq2t0v8Sl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyH4NfHYB90jA9X1uN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxKeLNSCSbB1bvEheJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzhj9zrMqffLZFN7hx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxbM133oyS81yUd7jN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgySa0KEYIKXfBYOtO14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxUN95dpXhuXfwh154AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGdy8x2XWc1QqBZXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
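The "look up by comment ID" feature above amounts to parsing a batch response like this one and indexing each coding by its `id` field. Below is a minimal sketch under that assumption; the IDs and helper name (`index_by_id`) are illustrative, not part of the dashboard's actual code.

```python
import json

# A hypothetical batch response in the schema shown above
# (IDs are placeholders, not real YouTube comment IDs).
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
"""

def index_by_id(response_text):
    """Parse a batch coding response and map each comment ID to its coding."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

lookup = index_by_id(raw_response)
print(lookup["ytc_example2"]["reasoning"])  # virtue
```

Because the model returns one JSON object per comment, a single `json.loads` plus a dict comprehension is enough to serve ID lookups for the whole batch.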