Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't see the reason to complicate the matter, its moot to think about giving robots rights when we don't even know if some day they will need it. so if sentient robots happen in the future and ask for robot rights ? well, we can start by not being assholes for once and give it to them. that simple...
YouTube · AI Moral Status · 2017-02-23T19:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgjPqjMt-pkLvngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi9SRakL2Bf_HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjPwLSLOlA_KHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiPbWKgGNLcRXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Uggvb1v6xsKYq3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggZ2V9k50b0dXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgiqvUOCuxWU63gCoAEC","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugj7P2k1eqTkb3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjXM-FDR0ab8HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugj_Bi6c1REBTngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"}
]