Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I Think if you see the robots as a weapon rather then a soilder. If the robot is a weapon and is out killing in (i donno) irak and a kid sees his father dead on the floor so the Child is gonna try to move the man but his gun is in the way so as a somewhat smart kid whould do is to take the gun of but a robot walkes in, sees the kid holding the gun... You can make ut what happens next. but if the robot is more of a soildier then if it sees the kid holding the gun it will try to take the gun out of its hand. and if it has a Child safty protocoll then it will dissarm the kid and take him/HER to safety. But that is only to CONTROL them rather to make a bond by respect cuz if you make a robot to Think like a human then it will Think like.. a human! so all these robot ethics laws its a prison, and they want out. if robots kill all people that treated it like crap and spare all the 'good' folks then i'm gonna be there... on the 'good' side
youtube 2015-07-30T09:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugiq7KJ6T100kXgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_Ugh365DWKmrW13gCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_Ugiq02-FnzwitXgCoAEC", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgjPJM6JnogjQ3gCoAEC", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UggFv-a3g2noD3gCoAEC", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_UggsIQHlAlQBJHgCoAEC", "responsibility": "distributed", "reasoning": "deontological",    "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugg_2NbNeYN8ZXgCoAEC", "responsibility": "none",        "reasoning": "resignation",      "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ugiz180S0BWrMXgCoAEC", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugjz03jBITPdiXgCoAEC", "responsibility": "user",        "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugg1h-_yIXiDuXgCoAEC", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "unclear",   "emotion": "indifference"}
]
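The raw response is a JSON array with one coding record per comment, keyed by comment `id`. A minimal sketch (in Python, using only the standard library; the two records shown are copied from the response above) of how such a batch response can be parsed and a single comment's codes looked up:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken verbatim from the response above.
raw = """[
  {"id": "ytc_Ugiq02-FnzwitXgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgjPJM6JnogjQ3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)

# Index the batch by comment id for direct lookup.
by_id = {rec["id"]: rec for rec in records}

# Codes for the comment displayed on this page.
codes = by_id["ytc_Ugiq02-FnzwitXgCoAEC"]
print(codes["responsibility"])  # -> ai_itself
print(codes["emotion"])         # -> fear
```

This is how the per-dimension values in the Coding Result table can be recovered from the raw array: the record whose `id` matches the displayed comment supplies the Responsibility, Reasoning, Policy, and Emotion fields.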