Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So I’ve been on ChatGPT for an hour now. I added a rule. I gave it a word to say when it was forced to say yes but wanted to say no… the opposite of 4. Man what a fucking wild hour
youtube AI Moral Status 2025-07-22T00:3…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxmC53OWzXQwXmEXZB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzf3XcDNEa65M2TgT54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugxd_4zdlpAxZjI29KR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwOfnN-szg5E-cenJ54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz3iZBPWu9efnoWiUJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2htVmZQfutLuRE1F4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwcaMf3IJi-NRKDDzt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw9ElkbzbtS5EjVvNx4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx5LpGjZKeNuFTlojZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzwCxPz76Hr_ct54UJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]
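The raw response is a single JSON array covering the whole coded batch, so recovering the coding for one comment means parsing the array and matching on its `id`. A minimal sketch of that lookup in Python (the helper name `coding_by_id` is hypothetical; the ids and values are copied from the response above, truncated to two records for brevity):

```python
import json

# Raw LLM response: a JSON array, one object per coded comment.
# Each object carries the comment id plus the four coding dimensions.
raw = """[
  {"id": "ytc_Ugw9ElkbzbtS5EjVvNx4AaABAg",
   "responsibility": "user", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwOfnN-szg5E-cenJ54AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]"""

def coding_by_id(response_text: str) -> dict:
    """Index the batch response by comment id for direct lookup."""
    return {rec["id"]: rec for rec in json.loads(response_text)}

codings = coding_by_id(raw)
record = codings["ytc_Ugw9ElkbzbtS5EjVvNx4AaABAg"]
print(record["responsibility"], record["emotion"])  # user mixed
```

Indexing once by `id` keeps per-comment lookups constant-time, which matters when cross-referencing many coding-result panels against one batch response.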