Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The answer to this is that it all depends on the kind of autonomous robot. (non-autonomous ones controlled by humans don't count under these new rules) If a robot was designed to emulate a human for instance, it would have to follow human laws and regulations. If a robot was designed to be an appliance, then if course it would have it's consciousness put on a terabyte of always-active RAM.
YouTube · AI Moral Status · 2017-04-11T20:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugi45moqZlF_O3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg2o6yscgin2ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghB5KBq6iE063gCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjjF29aY6KgdXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggqgbgzFEkzPngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggWVXBBY5Xf6XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggOqp3ptdscTHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjD2Dxnb2KMFXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiK-S247WsWFHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjAC7DOdN8XMngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
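When inspecting raw responses like the one above, it can help to parse and sanity-check them before trusting the coded dimensions. The sketch below is a minimal example, assuming the allowed code sets are those inferred from the values observed in this batch (they are not an official codebook, and the field names are taken directly from the JSON shown); it parses a shortened two-record excerpt, flags any record with an out-of-vocabulary code, and tallies the responsibility distribution.

```python
import json
from collections import Counter

# Two records excerpted from the raw response above, for brevity.
raw = '''[
  {"id": "ytc_Ugi45moqZlF_O3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghB5KBq6iE063gCoAEC", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"}
]'''

# Allowed codes per dimension -- inferred from this batch's observed values,
# not an official codebook; adjust to your actual schema.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "approval", "mixed", "resignation"},
}

def validate(records):
    """Split records into (valid, errors): valid records use only allowed
    codes; errors are human-readable messages for the rest."""
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            errors.append(f"{rec.get('id', '?')}: invalid {', '.join(bad)}")
        else:
            valid.append(rec)
    return valid, errors

records = json.loads(raw)
valid, errors = validate(records)
# Tally one dimension across the validated records.
responsibility_dist = Counter(r["responsibility"] for r in valid)
```

A check like this catches the common failure mode where the model emits a code outside the schema (e.g. a misspelled or invented label), which would otherwise silently corrupt downstream counts.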