Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think if you give rights to conscious robots it may increase the chance for robotic takeover. Not a conspiracy theory. If they refuse to do stuff because they consider it dangerous to themselves (seen a video about a tiny robot refusing to roll down from a table because it considered "falling off" to be dangerous to itself), they may refuse to stop this and that. Or they may refuse to do this and that. And if you give them rights they will refer to them, like "Hey, I have the rights to do this", you would't be able to do anything with them, and they might consider YOU as a threat to their presence and they may refuse to follow the laws of robotics, because they consider you harmful. Just a sidenote... ^Hex
youtube AI Moral Status 2017-03-24T20:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgxOz32Mqir4mx9Q7ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxHnrQw-5aECr2Zvt54AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugwr60lU1uhM2pDe5bp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxiDEONyjhXpZiAX214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzSJnuzoFnHlr7OVHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UghQiW0rduVSOXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugh6hVu_9ssjf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugh91X5m-k7M6XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UghfBFxixrIHDXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgjPKZV0GM_N-HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
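The coding result shown above for a single comment can be recovered from the raw LLM response by parsing the JSON array and looking up the comment's id. A minimal sketch in Python, assuming the response is a well-formed JSON array with the field names shown above (the function name `find_coding` is illustrative, not part of the pipeline):

```python
import json
from typing import Optional

def find_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse the model's JSON array and return the record for comment_id.

    Returns None if the id is not present in the response.
    """
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None

# Shortened raw response containing only the record for the comment above.
raw = ('[{"id":"ytc_UgxiDEONyjhXpZiAX214AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')

coding = find_coding(raw, "ytc_UgxiDEONyjhXpZiAX214AaABAg")
print(coding["responsibility"], coding["emotion"])  # → ai_itself fear
```

This matches the Coding Result table for the comment above: the record with id `ytc_UgxiDEONyjhXpZiAX214AaABAg` carries `responsibility=ai_itself`, `reasoning=consequentialist`, `policy=liability`, `emotion=fear`.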