Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I do agree that the present ethics question primarily lies with how AI impacts humans, but the idea that robots will never be deserving of rights does not sit well with me. At the rate we are moving, we are likely to have sentient robots in my lifetime. Something that can speak my language, may be more intelligent than me, has a sentient mind and can be exploited should be granted some consideration. It is good that we consider these questions now so that we know how best to respond later. Yes, humans will always be the priority in this equation, but that doesn’t necessitate excluding robots from the ethics question. “Never” is not a good word to apply to ethical queries.
youtube 2025-09-17T11:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       deontological
Policy          liability
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxBj3x3fexkqMBB7rF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw6xrrd4pQjuRRKUaJ4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwPjOF4pCjxDJs9MbJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugxt0rcPiqk3meRbOU14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxRsXTuKzpkxRReh-d4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "disapproval"},
  {"id": "ytc_Ugy7xkEgfk0Y5S_Ecbh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxKGWBEHrm3URaGCXl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugw2MkU3h-SWESpz40d4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxQDL_rogUePHjHnc94AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugxf3PGO1q2CBhyQ5GB4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"}
]
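Since the model codes comments in batches, each entry in the response must be matched back to its comment by `id` to populate the table above. A minimal sketch of that lookup is below; the `coding_for` function name is hypothetical (not the tool's actual code), and the sample response is trimmed to the one entry shown in the Coding Result.

```python
import json

# Sketch only: parse a batch coding response and look up one comment's
# coded dimensions by its id. The id and dimension values come from the
# sample response above; the function itself is illustrative.
raw_response = """[
  {"id": "ytc_Ugxf3PGO1q2CBhyQ5GB4AaABAg",
   "responsibility": "distributed", "reasoning": "deontological",
   "policy": "liability", "emotion": "mixed"}
]"""

def coding_for(comment_id, raw):
    """Return the coded dimensions for one comment, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            # Strip the id so only the dimension/value pairs remain.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

print(coding_for("ytc_Ugxf3PGO1q2CBhyQ5GB4AaABAg", raw_response))
```

Matching by `id` rather than by position keeps the lookup robust if the model reorders or drops entries in its reply.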