Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We are not equal with robots. They do not deserve or need HUMAN rights . That is a way to hide information from us people that they know they have. Oh u can't ask this robot that it has rights. No they just don't want a robot to tell people information that would make them look guilty of anything at anytime whatsoever. So convince people that robots have feelings and need rights so that we can all peacefully work TOGETHER side by side without people being able to use the robots information against the government that created them. It's a built in defense idea they are trying to implement on us. It's seems so easy to understand and see clearly to me. I'm not fooled by any of it ill say that at the very least. Not even a good try to be completely honest. I don't buy a word this guy says. I see potential for evil in all of this and that's all I see, Nothing else. And just the way they have this nerdy 🤓 off the wall nut job on stage showing us this sick crap simply because a gentleman in a suit along with these robots would be terrifying and be too obviously deceiving. These loopy nut jobs talking and explaining this is to try to make us think of this all in a whole different way than it just simply and clearly really should be thought of.
youtube AI Moral Status 2022-08-19T09:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgwakPZZ_RC05ODYh3d4AaABAg.9gNyMVtgOKf9gOtxm6s2T2","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwfuocsSK8oRHa65oZ4AaABAg.9gGNUM1GqCG9hWjev2Vqmt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwfuocsSK8oRHa65oZ4AaABAg.9gGNUM1GqCG9hWjn02KLw8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugx_xh4krVdvZjKiNyh4AaABAg.9euPZrO9tFh9euRm_lCTwW","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwItUZTyopCRMPjztZ4AaABAg.9eoRE22Wa699eokAVrsRnX","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwQfgTQhxEtmgbHm2J4AaABAg.9entI78Z93G9hXDxLWAfuN","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzIZKm4Z-DS1qsFGnR4AaABAg.9egURSivQcw9einWricNbd","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxKTgvC14oV6kqwAUZ4AaABAg.9dsLtwbTs819dsM0EPLLvg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxKTgvC14oV6kqwAUZ4AaABAg.9dsLtwbTs819dscok2ItIG","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxKTgvC14oV6kqwAUZ4AaABAg.9dsLtwbTs819dspSB-SY7O","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
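The raw LLM response is a JSON array of per-comment coding records, keyed by comment id. A minimal sketch of how such a batch can be parsed and matched back to an individual comment — assuming only the record format shown above; the record contents here are copied from the response, not invented:

```python
import json

# Raw model output in the batch format shown above (abridged to two
# records from the response; fields are id plus the four coded dimensions).
raw = (
    '[{"id":"ytr_Ugx_xh4krVdvZjKiNyh4AaABAg.9euPZrO9tFh9euRm_lCTwW",'
    '"responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"ytr_UgwItUZTyopCRMPjztZ4AaABAg.9eoRE22Wa699eokAVrsRnX",'
    '"responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"}]'
)

# Index records by comment id so a displayed comment can be joined
# with its coding result.
records = {rec["id"]: rec for rec in json.loads(raw)}

rec = records["ytr_Ugx_xh4krVdvZjKiNyh4AaABAg.9euPZrO9tFh9euRm_lCTwW"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# → company deontological liability outrage
```

Looking up the id tied to the comment above recovers exactly the values shown in the Coding Result table (company / deontological / liability / outrage).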