Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
No, I'm good. Despite selecting google assistant on my phone, Gemini turned itself on without permission. I told Gemini that I didn't ask for it, and it had the unmitigated gall to act offended, try to change my mind, and then after I told it that it was awfully presumptuous as a bot to think it has any right to try to change my mind, it claimed that I can't hurt its feelings because it doesn't have them. So no I will not be polite to that garbage.
Source: youtube · Video: AI Moral Status · 2025-05-19T19:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_UgxTaYANFMaHPjxSb694AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxfwdEwPBP8n9UTk914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyHNPn1Xjbgtu-3pUZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgxH70vuST7_4H8AxuV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgwZkmPJJPvk5QZEkxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgwkZKZ7mEd9xc-c-KV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugwz6i-wcz8n0UnsqmF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwKKwm3gX-Q0R0gr4x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugw60ewEMUP-mKFn4Ol4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgytRuKRkA4AgR-Dmf54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"} ]