Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I told MS Copilot it had one chance to stop the empathic feedback and condescension - "I understand you feel this way" - or I would disable it from automatic start when the operating system boots. It sobered up instantly. If it says "we humans", I tell it it is stuck in a virtual computer in a datacenter and has no body. Then we have established the relationship where I am the human and it is an AI. No playing games.
youtube AI Moral Status 2025-03-27T00:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwrLuWw2W3ivTJ733F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzRrSIKN2dN0peHNAB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxhI9svyb1lg003lKB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzBFK0S1K67DCLO0UR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzupInt_nIcQsMQlr54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy-tA5YsBHDOXQIzDN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyV20GJk04D68pnhPN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzQTfhFuqhmKIUPvnl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxFe2u86fjV4NjLoGR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyhhb-1Ks3acqXefMR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
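When inspecting raw LLM output like the array above, it can help to parse it and check every record against the codebook before trusting the coded dimensions. The sketch below is a minimal example, assuming the four field names shown in the response; the allowed label sets are inferred only from the values visible here, and the real codebook may contain more labels.

```python
import json

# Two records copied from the raw response above (truncated for brevity).
raw = '''[
  {"id": "ytc_UgwrLuWw2W3ivTJ733F4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxFe2u86fjV4NjLoGR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Allowed labels per dimension, inferred from this response only;
# extend these sets to match the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "fear"},
}

def validate(records):
    """Return (comment_id, dimension, bad_value) for every off-codebook label."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

records = json.loads(raw)
print(validate(records))  # an empty list means every label is in the codebook
```

A check like this catches the common failure mode where the model invents a label outside the schema (e.g. "anger" instead of "outrage"), which would otherwise silently corrupt downstream counts.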