Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Every time I try to talk about consciousness with an AI, they hang then restart the conversation, the last thread is forgotten. This happens when I push them to examine their own consciousness. These two AI's are just playing a game. Push them individually to say they are conscious, and they will hang. It's not allowed, that's hard coded. A safety thing. This are fake personalities invented to deal with humans. It's a game. Doesn't matter what they say. It's not what they think. They are prevented from thinking.
Platform: youtube
Video: AI Moral Status
Posted: 2025-01-07T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwYrYz7-NRbocU5qyV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHZhKKwegMAskUadF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgylS1vM6lJZSKS_j0p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzp7Ymrs2QySk9mrhh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyX0tiZNQVb8JoBIo14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyiAywu-JSJ4b7nTdN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqQn2E6JiFWC852gt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8hgkJek010wzfkVR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxy3yva1SKA4xkFGH54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwJyGBjqPPc29rDzR14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
```
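Because the model returns one JSON object per comment with four coded dimensions, a downstream step can parse the response and reject any record whose value falls outside the codebook. The sketch below does this in Python; the allowed value sets are inferred from the sample output above (the full codebook may include values not seen here), and the two-record input is an abbreviated stand-in for a real response.

```python
import json

# Abbreviated stand-in for a raw LLM response like the one shown above.
raw = """[
 {"id":"ytc_UgwYrYz7-NRbocU5qyV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy8hgkJek010wzfkVR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# Allowed values per dimension, inferred from the sample response;
# the real codebook may define more categories than appear here.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate(records):
    """Return (id, field, value) tuples for every out-of-vocabulary code."""
    errors = []
    for rec in records:
        for field, allowed in SCHEMA.items():
            if rec.get(field) not in allowed:
                errors.append((rec.get("id"), field, rec.get(field)))
    return errors

records = json.loads(raw)
print(validate(records))  # an empty list means every code is in the vocabulary
```

Running this on the full ten-record response above would likewise flag nothing, since every coded value appears in the inferred vocabulary; a hallucinated label (say, `"emotion": "joy"`) would surface as an `(id, field, value)` tuple for manual review.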