Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.
- People should no longer depend on (precarious) wages. It is an illusion to belie… (ytc_Ugxh0aVbG…)
- This is all fear-mongering. If you know how all the "AI" tools of today work - t… (ytc_UgxisqPXH…)
- a worse part of ai art is how its been starting to just be allowed on many anime… (ytc_UgzDNmndX…)
- Ironically AI takeover is probably what will end capitalism and start a new comm… (ytc_UgyvWl_LE…)
- I'm an artist. From the most objective standpoint, AI is producing derivative ye… (ytc_UgyPIxXpK…)
- this is not ai ARTIST, artist is a big word. they’re just an AI USER… (ytc_UgzcDh3gA…)
- He didn't say that. He said that's a probability if AI safety won't become a pri… (ytr_Ugx4BqeuM…)
- I think that response to PoE should satisfy you. When we say that we are ignoran… (rdc_cxl6qx7)
Comment
"....we want them first." - We? Who's "We", Sam? Who are We? You've just othered a very large portion of the human population and earlier discussed how 'othering' people gives license to vest horrors upon them without being considered psychopathic.
Here's what I think. I think They will not use their autonomous weapons if We do not hold up ours as a threat/deterrence. I think all of us must try to not repeat M.A.D. with AI instead of nukes. I think They will not be inclined to build autonomous weapons if we do not. I think we will not build autonomous weapons if they do not.
youtube · AI Moral Status · 2026-04-17T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz3UonbOTc3yvNixzV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzjUWFmJso73cpvUKF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxaV2cXdcI9bEJrJX14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzbepk4O_UTdWUdYkl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxqzjuuA3dvOwu6Uox4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxQSql2e5Dqu9n79tZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykFtIzYPQZfG06jYp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyfU6PIZ-sH-x_Mcvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJZ4zZI4KnpYmceiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGQulHE8qp9qCOiRB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
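A minimal sketch of how such a raw response might be parsed and validated before the coded dimensions are stored. The allowed value sets here are inferred only from the sample output above, not from an authoritative codebook, and the function name `validate` is illustrative.

```python
import json

# Allowed values per dimension, inferred from the sample raw response
# shown above; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "user", "ai_itself"},
    "reasoning": {"mixed", "deontological", "virtue", "consequentialist", "unclear"},
    "policy": {"unclear", "regulate", "ban", "none", "industry_self", "liability"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record needs a string comment ID...
        if not isinstance(rec.get("id"), str):
            continue
        # ...and a recognized value for every coded dimension.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records with unrecognized labels are dropped rather than coerced, so a malformed model response surfaces as a shorter validated list instead of corrupt codes.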