Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
havent watched the video yet, but im gonna say this first: if a robot gains consiouscness and sentience etc. we should give it the right to exist like a human being,as long as it agrees to follow the same laws and rules, as long as you act like a person you can be one, but i also think robots with sentience that start as workers should continue to be workers,BUT if they along the way choose to quit their job they were made for,give them permission,let em leave. make em do what they want as long as they follow the laws and shit man. sentience makes US humans, so sentience should make them,also human. it doesnt matter if they have a built in flashlight or zooming eyes,they have sentience,so let them live their lives as they want,if you dont want that youre denying a basic human right,sure robots arent humans but again,we're only humans since we're sentient,so should a robot be different? no. they may not be humans,but theyre still people,so let them have their fucking lives. as long as they dont break the law,i feel its fair and ethical for them to live any life they want instead of being forced to do a job all day for their entire life. tl:dr: if you are sentient,if you can think,you are a person. so you have all the same rights as us but still have to follow the laws and rules. a robot should be able to live whatever life they want as long as they follow those laws and rules, if they are created for working,they should be able to leave when and if they want,or treat it like a job so they have some time working some time living.
youtube AI Moral Status 2020-08-07T19:5… ♥ 13
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        contractualist
Policy           regulate
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxWl51xd66j3p-3hs54AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy5ZPH15izuvOSYYYB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_Ugw2v4yU19slb-zXDqt4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxULQRXtDBc_CnGqNJ4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxIMp0mytvW_5Xnr-V4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxzT2gFdtbIGZVs4xF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyDLUjEeYcUG5ildLh4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxV3EVVQP7Ej0nA0Np4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxvv_uYMZbCQIdNG7h4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugys1h8J4WxOiIjihHd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
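The raw response is a JSON array with one record per coded comment, keyed by comment id. A minimal sketch of how such a response could be parsed and matched back to a single comment, assuming the model reliably returns valid JSON (the `lookup_coding` helper and the two-entry `raw_response` excerpt below are illustrative, not part of the tool):

```python
import json

# Abbreviated raw model output: two records from the batch shown above.
raw_response = '''
[
  {"id": "ytc_UgxULQRXtDBc_CnGqNJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxzT2gFdtbIGZVs4xF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
'''

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the record for one comment id."""
    records = json.loads(raw)
    by_id = {record["id"]: record for record in records}
    return by_id[comment_id]

coding = lookup_coding(raw_response, "ytc_UgxULQRXtDBc_CnGqNJ4AaABAg")
print(coding["responsibility"], coding["policy"])  # → ai_itself regulate
```

Indexing by `id` rather than by array position keeps the lookup robust if the model returns records in a different order than the comments were sent.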