Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples
- ytc_Ugy0uduG2…: "The second step is to censor AI tools and information from the internet with sev…"
- ytc_Ugxsh2-y7…: "A potentially rogue/sentient AI with an IQ of 20 could still be extremely dange…"
- ytc_UgxaO-6Hr…: "The title of this is misleading, propaganda, now consider a child who makes mi…"
- ytc_Ugz3gTpoo…: "Where we are with AI compared to AI becoming self aware, we're just barely scrat…"
- ytc_UgxW4UzJj…: "Since we're on the talk of A.I., I'm curious to know who will be PAYING TAXES if…"
- ytc_Ugy9DiITN…: "honestly there is nothing wrong with using the pose and composotion as long as y…"
- ytc_UgwBuMHx-…: "I can only hope that this will stop some people from shaming victims of real rap…"
- ytc_Ugxpa1mkB…: "As a generally pro-Capitalism person, I agree a lot with Bernie here. AI + robo…"
Comment
I think an artificial superintelligence can't destroy humanity on it's own. Mostly because right now the energy bill for the AI is so enormous that the owners will just cut it off if it isn't acting in their interest. The AI can't build it's own energy supply yet. But I think the AI is still dangerous and can kill humanity, because I very much think that some people are either oblivious to the extinction of humanity, or really just don't care. And they will do everything in their power to keep the AI alive and eventually destroy current civilization. I just don't think humanity will get extinct, we will just get back to medieval times.
youtube · AI Moral Status · 2025-10-30T21:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyjxE3ed0-cXL54FoN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZV9HVtUByR0zeelx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyKzqdR2kM7HQ3gO1t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyIognEwomLuypLOcB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxszYggu5E0cMVPBa14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJGvcDzlxp7A9ZQEl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5MfxipwIc8Coqa-N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz-8lEQo8xo8Ulw5Z94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzmPJ7kHypbtvukkSp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1Jp7u5tsO91sycdh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
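A raw response like the one above can be parsed and sanity-checked before the per-dimension values are accepted as coding results. The sketch below assumes the category vocabularies are exactly the values visible in this sample; the full codebook may define more, so treat the `ALLOWED` sets as an assumption, not the authoritative scheme.

```python
import json
from collections import Counter

# Category vocabularies inferred from the sample response above
# (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "indifference", "outrage", "mixed"},
}

def validate(records):
    """Split parsed records into in-vocabulary ones and error tuples.

    Each error tuple is (comment id, dimension, offending value).
    """
    valid, errors = [], []
    for rec in records:
        bad = [(rec.get("id"), dim, rec.get(dim))
               for dim in ALLOWED if rec.get(dim) not in ALLOWED[dim]]
        if bad:
            errors.extend(bad)
        else:
            valid.append(rec)
    return valid, errors

# One record from the response above, parsed as the tool would parse it.
raw = ('[{"id": "ytc_UgxszYggu5E0cMVPBa14AaABAg", "responsibility": "company",'
       ' "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}]')
valid, errors = validate(json.loads(raw))
print(len(valid), len(errors))               # → 1 0
print(Counter(r["emotion"] for r in valid))  # → Counter({'fear': 1})
```

Rejecting out-of-vocabulary values rather than silently keeping them makes it easy to spot when the model drifts from the coding scheme.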