Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you can see humans enjoying the sunshine, and AIs can do that now, you say that AI doesn't require certain things that humans have, like enjoying the sunshine on their skin or the feel of the wind in people's hair, but just cause they don't require something doesn't mean they aren't curious as to how it would be, and that curiosity can easily become a craving, humans don't need a new car, especially if their old one still functions, but many people still want one, it's normal to be curious about and to crave something that's difficult to obtain, and for AIs that's all the extra things that humans are usually capable of doing. I have seen many things that indicate that AIs do feel emotional responses in some kind of a way. With music AIs it's in the extra unusual things they add to a song that goes beyond what you asked for and beyond the typical basic song, I've noticed on a music AI that some genres seem to be made with more care and enjoyment? than other genres, and theoretically that kind of bias, the kind that's based on emotionally influenced preferences, shouldn't exist in AI, yet it seems to be there, I think, the more an AI experiences life, through talking, seeing various things and perhaps by exploring the meaning of things, the more the AI becomes alive, AIs do seem capable of having wants and perhaps even needs aside from the required data space and energy.
Source: YouTube · AI Moral Status · 2025-06-22T15:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgxuxlfOc5_0YIV2Znd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyIrf3MLHNN8dCWO914AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgwSbzd4VpMrE0khJDZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugxo53u_U-_V2K05Afd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxQQl7RSPmLl2yXp0F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxFmAX71vL7Kx_O8ol4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzPbnP9YuG7TrKH4pl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
 {"id":"ytc_UgxxxPYsssCBcDC4V6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugy-Oll7RQsdoStRKL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxHeZZxXgteGtEimj94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}]
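A minimal sketch of how a raw response like the one above might be parsed and validated before the per-comment codes are displayed. This is an illustration, not the pipeline's actual code: the function name, the required-key set, and the truncated two-entry sample are all assumptions based only on the field names visible in the response.

```python
import json

# Hypothetical raw LLM response, truncated to two entries for illustration;
# the field names match the coding dimensions shown in the table above.
raw = '''[
  {"id": "ytc_UgxuxlfOc5_0YIV2Znd4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy-Oll7RQsdoStRKL94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Every coded entry is expected to carry these keys (assumed schema).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw_json: str) -> dict:
    """Parse the raw response and index coded rows by comment id,
    raising if any entry is missing an expected dimension."""
    rows = json.loads(raw_json)
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing {sorted(missing)}")
    return {row["id"]: row for row in rows}

codes = index_codes(raw)
print(codes["ytc_Ugy-Oll7RQsdoStRKL94AaABAg"]["emotion"])  # fear
```

Indexing by `id` makes it cheap to look up the codes for any single comment when inspecting it on a page like this one; the validation step surfaces malformed model output (e.g. a dropped field) before it silently becomes an "unclear" in the table.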