Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I can't understand why AI would prefer happy over sad? They may understand the emotion, but they don't feel them. Our feelings motivate us because we do or do not WANT to feel them... Why would AI 'want or not want' to experience any particular emotion over any other?
YouTube · AI Moral Status · 2026-04-08T12:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxFSJeA8JfFxEoGt3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy6aVSO-mpiCFtOq-h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwqzyzu1ao60oA5Xlt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxaV-Pof0Liq4JYHyV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx1B5FudmB3nCb-xjF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzhYAkwHoJeecLsoy54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyTJT43__GhpxSm4TR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyYKXrHV12kxCLADpZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxjpwjIHM8rQDQJtRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwcb4xzkhcU8WeloN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
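A raw response like the one above can be parsed back into per-comment codes and checked against the coding scheme. The sketch below is a minimal, hypothetical example (the function name and the `ALLOWED` category sets are assumptions drawn only from the values visible in this batch; the full codebook may define more categories):

```python
import json

# Allowed values per coding dimension -- ASSUMED from the values seen in
# this batch only; the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "user", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    raising if any value falls outside the expected categories."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = codes
    return coded

# Usage with the first record from the batch above:
raw = ('[{"id":"ytc_UgxFSJeA8JfFxEoGt3J4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgxFSJeA8JfFxEoGt3J4AaABAg"]["emotion"])  # indifference
```

Validating against a closed category set at parse time catches the common failure mode where the model invents a label outside the codebook.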