Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I loved the AI personhood debate! My two cents is that when we give something a right, we do not consider only what it is for them but also what they bring when they join our system of society. Everyone brings value and risks. If a sociopath is born with rights, over time the system figures it out that they are dangerous. And courts take their rights away. With every legal framework, a ton of loopholes open up. AIs are just too fast to figure out the loopholes and scale infinitely, so when an AI goes rouge, their objective would be to kill everyone in the system who could take their rights away. And it could do it. Just the time of debating if we take its right away would be enough. You can put a human in jail to make sure they do not have a negative impact until a decision is made, but you can never be sure if you lock an AI that can replicate itself infinitely. A simpler version is that, if it breathes air to survive, they it can have personhood. Yes, companies are legal entities, but we are debating if AIs can have the same rights as humans who can incorporate businesses. So the point that a corporation does not need oxygen to live is not relevant. AI would compete with the humans and would just create businesses. Or if AI should be compared to companies, then we should give companies rights not to do as their human or AI founders feel like doing, which does not make sense.
youtube 2026-02-09T12:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       contractualist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugwq4mcMNuoUovVZ6R54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxhtVkXymlAA2lkx8p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxnbWc1GD7zQq5WUqd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz4zVe63_XAyOZFTVV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyiHHmgR28aPtmaYaV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzjCKmySAvKJf3V90d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-XBDFKFn1-Yg1hnJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyK0OEQYe3jOddslYt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJ-Hxx91l0jry6KrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxZFozoMM1SaSpabeB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
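A batch response like the one above can be checked before the codes are stored. The sketch below parses the raw JSON and keeps only rows whose values fall within the coding scheme; it is a minimal illustration, and the allowed-value sets are assumptions inferred from the codes visible in this export (the full codebook may define more categories). The function name `validate_batch` is hypothetical.

```python
import json

# Allowed codes per dimension, inferred from the values seen in this export.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"distributed", "company", "developer", "ai_itself", "none"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and return only schema-valid rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present and use an allowed code.
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with the first row from the response above.
raw = ('[{"id":"ytc_Ugwq4mcMNuoUovVZ6R54AaABAg","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"none","emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation can then be flagged for re-coding rather than silently dropped, which keeps the coded dataset aligned with the raw responses shown on this page.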