Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't think we're remotely close to having sentient robots and so calls to think about robot rights are very much premature and silly right now, but I think it's pretty problematic to use that to justify claims that robots as a class simply could never have rights at all. If we had a C3PO or a Johnny 5 in real life, it'd be very clear to anyone interacting with it that it's more of a person than a toaster is. An LLM certainly isn't a person in any sense and that is technology that is being used to oppress real humans so I get it, but it just seems weird to categorically state robots should not have rights. It's just weird conflating AI as it stands now with sentient robots. They're nothing alike.
youtube 2025-10-10T20:3… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyHQdLBGQnbG9XrN254AaABAg", "responsibility": "none",      "reasoning": "deontological",    "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_Ugw2-U8V_1q-TWjUPq94AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_Ugzc_kbYPP3J64STzy94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_UgzEVCLRlTMLKUUwh214AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgxYcdap3hipnwL8NOB4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzZya8GCYlGmIy1E-t4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugxt30yf0E4kLC9Kltp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability",     "emotion": "indifference"},
  {"id": "ytc_UgzPLHzLfnfp8BFo8Vx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzLoSrixqg5Oi_3bb54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyNcoeJd3YSuEbyqYl4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "industry_self", "emotion": "resignation"}
]
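A raw response like the one above can be checked programmatically: parse the JSON array and look up the record matching a comment id to recover its coded dimensions. The sketch below assumes only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `coding_for` is illustrative, not part of the tool.

```python
import json

# Two records copied from the raw LLM response above; the real response
# contains one record per coded comment in the batch.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyHQdLBGQnbG9XrN254AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2-U8V_1q-TWjUPq94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def coding_for(comment_id: str, raw: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            # Drop the id so only the coding dimensions remain.
            return {k: v for k, v in record.items() if k != "id"}
    return None

print(coding_for("ytc_UgyHQdLBGQnbG9XrN254AaABAg", RAW_RESPONSE))
# → {'responsibility': 'none', 'reasoning': 'deontological',
#    'policy': 'none', 'emotion': 'indifference'}
```

This matches the Coding Result table for the comment shown above; an id not present in the response yields `None`, which is a simple way to spot comments the model skipped.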