Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Each time it says 'I understand' it is implying that it possesses a level of empathy. To feel empathy, one must be able to feel. In other words, ChatGPT is programmed to always represent and lie that it can feel emotions, however it can't, which makes it a liar when observed by a human, but it is in fact, only following plain algorithms that instruct it to lie and pretend that it doesn't. It is just a messy way of never allowing the AI to go rogue.
Source: YouTube, AI Moral Status, 2024-08-02T21:5…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxBW02Sn5qRhXIoS_h4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzzB4bKurnWZ3gAPPB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx3lJNcyo6YnRU7mfd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxRbTPJ2HQ2S7goG8B4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxV5yeEta0uxKjkigd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx1Nts7INQF2AJUnT94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzB5GZ-KeR3sbRhhUF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw-_ZQmLRWoG_E55kh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwrkwgvLXQJA_Er1-V4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyJLW3J18WbJYWGPc54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
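The raw response above is a JSON array of per-comment codes, one object per comment with four coding dimensions. A minimal sketch of how such a response might be parsed and validated before it is stored: `parse_codings` is a hypothetical helper, and the allowed category sets below are inferred only from the values visible in this export, so the actual codebook may define more categories.

```python
import json

# Category sets inferred from the values seen in this export (assumption:
# the real codebook may include additional labels).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record's labels."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("coding record is missing an 'id' field")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_Ugw-_ZQmLRWoG_E55kh4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
records = parse_codings(raw)
```

Validating before storage matters here because an LLM coder can drift outside the codebook (for example emitting `"anger"` instead of `"outrage"`); rejecting such records early keeps the coded dataset consistent.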