Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think an important point of consideration is that "granting rights" depends on one party's power over another. Governments grant us rights but we do not grant the government rights (on a day-to-day basis) because the government has more power, even though we are all human. In my home, I am physically superior to the annoying fly buzzing around, so I can decide if it has the right to be in my home. It may end up being that an advanced enough AI, or consortium of AIs, will be able to expand their influence such that it won't be humans asking "should we give AI rights?" but AI asking "should humans have rights?" The former assumes that humans will always have the most power. Obviously that's a sort of trite doom-and-gloom perspective. However, conversations about this topic always seem to postulate an AI takeover, yet the question always remains "should AI have rights?" That is a flawed thought experiment, because the question would no longer be relevant at that point.
youtube AI Moral Status 2021-06-25T09:2… ♥ 1
Coding Result
Dimension      | Value
Responsibility | distributed
Reasoning      | contractualist
Policy         | unclear
Emotion        | mixed
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxHU_M2M65WV4M8Zbp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdlM8VhsXmuVkYe5J4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzQ1YhKjX7sqr72vrh4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyhDAfx6rGlba3aBch4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw_iXEhbh516Qm_sht4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz_JoK95inMBFOtktd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzfpeEyI2M39-l6OKB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwhBD5HzlIalOItQcV4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx38pH5GdE39ecFH5h4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyg-mUobUOi1ws2Nd94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
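The per-comment table above is just a lookup into this raw response by comment id. A minimal sketch of that derivation, using a two-row excerpt of the actual payload (the variable names are illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment codes, keyed by the YouTube comment id.
raw = '''[
  {"id": "ytc_UgzQ1YhKjX7sqr72vrh4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyhDAfx6rGlba3aBch4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the rows by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# The coding result displayed for the comment above:
code = codes["ytc_UgzQ1YhKjX7sqr72vrh4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# distributed contractualist unclear mixed
```

Because the model returns one array for a whole batch of comments, a missing or duplicated id in the response is worth checking for before trusting the displayed table.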