Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm not too impressed I'm not saying AI is dumb AI is pretty smart actually better than we are But you got two robots that are AIs if I got that right Debating each other just like President Biden and Trump What I mean by that I would think AI would be smart enough and realize how stupid it is to just put each other down AND BY DOING THAT realize it doesn't help their case any on their debate I MEAN I WOULD THINK TWO COMPUTERS or AI Would talk to each other with more intelligent by stating facts and more information on what they're debating about And another thing that crosses my mind if a computer or AI is almost 100% right all the time You wouldn't think it would be a debate Between the two I mean if we got robots AI out there that's thinking different from what's right They know better than us If they have different opinions on thangs That ain't going to be good Where is scenario is you'll have two or three different groups battling it out for real just like our country I'm just saying
youtube AI Moral Status 2024-03-11T09:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        virtue
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyNDaySTeNlVeHkfLZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYYtF6M4FegDdSM0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxp4oH-5eHaudMoJbl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwDdL5FIF4IaKvYlCd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy4W4epe0QUksNvIgV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyV4oWdTHBAbhdZz0Z4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1oG9sMpZhzaJhSNh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw5Lr7tNSDOmrjZNdx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6Azug71PG-jIcad54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzzDtUbediA72875od4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
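The coding result table above is presumably built by parsing this JSON array and looking up the record whose `id` matches the displayed comment. A minimal sketch of that lookup, using two records copied from the response (the assumption that `ytc_UgyV4oWdTHBAbhdZz0Z4AaABAg` is the displayed comment's id is inferred only from its matching dimension values, virtue/mixed):

```python
import json

# Two of the ten records from the raw LLM response above. Which id belongs
# to the displayed comment is an assumption based on matching dimensions.
raw = '''[
  {"id":"ytc_UgyV4oWdTHBAbhdZz0Z4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1oG9sMpZhzaJhSNh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

records = json.loads(raw)
by_id = {r["id"]: r for r in records}

def coded_dimensions(comment_id):
    """Return the four coded dimensions for one comment id, or None if absent."""
    rec = by_id.get(comment_id)
    if rec is None:
        return None
    return {k: rec[k] for k in ("responsibility", "reasoning", "policy", "emotion")}

dims = coded_dimensions("ytc_UgyV4oWdTHBAbhdZz0Z4AaABAg")
```

With that record, `dims` reproduces the Dimension/Value table shown above (responsibility ai_itself, reasoning virtue, policy none, emotion mixed).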