Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
I'd argue that programming a robot to feel pain just to be more like us would be morally wrong. Better yet, if we could, in theory, give a robot programming that let it feel endless pleasure and completely ignore the fact that it's only real purpose was to serve humans, would they ever mind?
youtube AI Moral Status 2017-02-23T22:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UghhOyFAuDaTTHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggZWyJDfZsx4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg36Q7TFmanhngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgjixkGRfglkyngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgiN2NjIkn9_MXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugg-qYN95-Pf5HgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgjFwmJ0dxkyO3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggSKKvGLzax03gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ughjrlezex4FengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UggiBEYoD2pkIngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
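A batch response like the one above can be parsed back into per-comment coding records. The following sketch shows one way to do this in Python; the function name, the "unclear" fallback for missing dimensions, and the truncation to two sample entries are illustrative assumptions, not part of the original pipeline.

```python
import json

# Two entries copied from the raw LLM response above (truncated for brevity).
raw = '''[
  {"id":"ytc_UgjFwmJ0dxkyO3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghhOyFAuDaTTHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json):
    """Map each comment id to its coded dimension values.

    Dimensions the model omitted are filled with "unclear"
    (an assumed fallback, mirroring the codebook's own label).
    """
    records = json.loads(raw_json)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codings = index_codings(raw)
print(codings["ytc_UgjFwmJ0dxkyO3gCoAEC"]["reasoning"])  # deontological
```

Indexing by comment id makes it cheap to join the model's codings back to the original comments, and the "unclear" fallback keeps a partially malformed response from crashing downstream aggregation.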