Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Thing is why would you program a robot to feel pain to make them work when you could make them feel very happy when doing what you tell them to?
Source: youtube · AI Moral Status · 2017-10-20T19:5…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
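The coding result is easiest to work with as a small record type. Below is a minimal sketch of that schema in Python, assuming the dimension names shown above; the class and field names are hypothetical, and the example values are taken from this batch:

    from dataclasses import dataclass

    @dataclass
    class CodingResult:
        # Illustrative schema only; field names are not the project's actual identifiers.
        comment_id: str      # e.g. "ytc_UgxWM3z1SFDAhvfggJx4AaABAg"
        responsibility: str  # e.g. "developer", "government", "ai_itself", "none"
        reasoning: str       # e.g. "consequentialist", "deontological", "mixed", "unclear"
        policy: str          # e.g. "regulate", "industry_self", "liability", "none"
        emotion: str         # e.g. "approval", "fear", "indifference", "mixed"
        coded_at: str        # ISO timestamp, e.g. "2026-04-27T06:24:59.937377"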
Raw LLM Response
[ {"id":"ytc_UgzPEGu4HGHNUfKVL5p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyXDHgqGs3BAdW7QV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxWM3z1SFDAhvfggJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzRnz8y6arWUxgk3pV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgyRqu7OqGqzkVLvCP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwAL9THQl5YGNvKej94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"fear"}, {"id":"ytc_UgwxHLorRKIR9x98dfV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugz3SDO2ms_3YSL_DbJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"}, {"id":"ytc_UgzOkWxCifF6GgfExbt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgwzWE98yeVe5AJptm54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"} ]