Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think if you program a robot to think it has feelings, he would not really have feelings. he would just think he has, and the act upon it. but then again humans could be the same
youtube AI Moral Status 2017-02-23T17:4…
Coding Result
Dimension       | Value
Responsibility  | developer
Reasoning       | mixed
Policy          | unclear
Emotion         | mixed
Coded at        | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UggvnE_-CErSGngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjwxPmrNXneQXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgiCh_xZkLxZJHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiAVaZPcO_y-3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggHL_iuYiVHw3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgiACXM3raSp7XgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiN8rZHH4-XJHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UghT90X0cBS7Y3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugi9Bl1heMcN7HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghWs4FWrM94KngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
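The raw response above is a JSON array in which each element pairs a comment id with four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment id, assuming that schema holds for every element (the function name `parse_codings` and the two sample records are illustrative, taken from the batch above):

```python
import json

# Two records copied from the raw LLM response above (assumed schema:
# one comment id plus four fixed coding dimensions per element).
RAW = '''[
  {"id":"ytc_UggvnE_-CErSGngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiAVaZPcO_y-3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw):
    """Index codings by comment id, keeping only the four known dimensions."""
    records = json.loads(raw)
    return {r["id"]: {d: r[d] for d in DIMENSIONS} for r in records}

codings = parse_codings(RAW)
print(codings["ytc_UgiAVaZPcO_y-3gCoAEC"]["responsibility"])  # → developer
```

Keying by id makes it straightforward to join each coding result back to its source comment, as in the "Coding Result" table shown for the comment above.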