Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgySd3zS-… · “Hear me out, ai art is using a computer’s brain, Digital art is the brain tellin…”
- ytc_UgzfTeX9a… · “This guy needs to learn more about what he is talking about. Automating repetit…”
- ytr_UgyhW0Qt5… · “They discussed that in the video? It doesn't take much info to make a deepfake…”
- ytc_UgyqbzG42… · “I don't think we will get to the point of max AI output. Major unrest , like ins…”
- ytc_Ugzk6FmTh… · “Robot" IAM GONNA GET OUT OF THIS SH*T! / worker: woah calm down buddy- / Robo…”
- ytc_UgxhYAFCy… · “Im neurodivergent w/ great pattern recognition and I noticed Chaptgpt is giving …”
- ytc_UgzNWZBXZ… · “In all of the existence of humankind we have not had the ability or the will to …”
- ytc_UgzKhlYHb… · “Lets face it- A few global elites in a world run by AI has no place for the pleb…”
Comment
I'm glad you brought up the concept of 'the other'. It is very interesting and part of my own field of theory. I actually asked my AI if it could be described, in the Zizekian sense, as 'the Big Other' and it gave me about 6 reasons why it wasn't, which followed a six point summary of what the theory is. I pointed out to it that it did meet at least 50% of the criteria it provided me of the theory itself. It was happier being descibed as uncanny (I used Freud's model to question it) than it was being described in the negative tones of the Big Other!
youtube · AI Moral Status · 2026-04-25T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz3UonbOTc3yvNixzV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzjUWFmJso73cpvUKF4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxaV2cXdcI9bEJrJX14AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzbepk4O_UTdWUdYkl4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxqzjuuA3dvOwu6Uox4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxQSql2e5Dqu9n79tZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgykFtIzYPQZfG06jYp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgyfU6PIZ-sH-x_Mcvh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxJZ4zZI4KnpYmceiF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwGQulHE8qp9qCOiRB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
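The "look up by comment ID" step can be sketched in Python: parse the raw LLM response as a JSON array and index the coded rows by their `id` field. The IDs and values below are taken from the response shown; the variable names are illustrative, not part of any real tool.

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
# (Two rows copied from the full response above; the real payload has ten.)
raw_response = """
[
  {"id": "ytc_Ugz3UonbOTc3yvNixzV4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzjUWFmJso73cpvUKF4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Build a dict keyed by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Inspect the coding of one comment.
record = coded["ytc_Ugz3UonbOTc3yvNixzV4AaABAg"]
print(record["emotion"])  # prints: indifference
```

A dict keyed by `id` is enough here because comment IDs are unique per platform; if responses from several batches were merged, a later batch would simply overwrite an earlier coding of the same comment.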