Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I doubt a robot would want the same rights as us, different needs would mean different interests. They won't need 8hrs of sleep, or a lunch break. They may not want an 8hr work day. Being machine they will need maintenance, and substance like oil and fuel. Would they need company or are they fine being solitary?
YouTube · AI Moral Status · 2017-02-23T15:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Uggb7uJEkfcedngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UghUK-8vmol_RXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjuEldbiFvSc3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugj8Si8O4VmbXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghSY3Ow9DaTIXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggNIDttW3ZpLngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ughv2wJAgTfMDHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi2kn_xDxodc3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugj3fdwqmxy0jHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjNjZiaLVw0g3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"unclear"}
]
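The raw response is a JSON array with one coding object per comment, each carrying the four dimensions shown in the table above. As a minimal sketch of how such a response might be parsed and checked (the actual pipeline code is not shown here; the `DIMENSIONS` set and lookup-by-id approach are assumptions), one could do:

```python
import json

# Raw model output (abridged to two of the ten objects from the response above).
raw = (
    '[{"id":"ytc_UghSY3Ow9DaTIXgCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"},'
    '{"id":"ytc_UghUK-8vmol_RXgCoAEC","responsibility":"none",'
    '"reasoning":"deontological","policy":"regulate","emotion":"indifference"}]'
)

# Assumed dimension names; every coding object should carry all four plus an id.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

codings = {c["id"]: c for c in json.loads(raw)}

# Sanity-check the schema before trusting the codings.
for c in codings.values():
    missing = (DIMENSIONS | {"id"}) - set(c)
    if missing:
        raise ValueError(f"coding {c.get('id')} is missing fields: {missing}")

# Look up the coding for a specific comment by its id.
coding = codings["ytc_UghSY3Ow9DaTIXgCoAEC"]
print(coding["reasoning"])  # consequentialist
```

Keying the parsed array by comment id makes it cheap to join a batch response back to the comments it codes, which is presumably how the per-comment "Coding Result" view above is populated.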