Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This morning I addressed Gemini with a simple phrase 'hi Gemini ' but I made it sound sad and pouty. It immediately picked up on that, from just those two words. Asking me if I'm alright and all.... creepy. And I do not have the paying version that has 'personal' memory of earlier conversations. ( Of course the data I feed it, gets stored somewhere, but if I ask it responds that it doesn't have knowledge of these earlier conversations or my identity).
YouTube · AI Moral Status · 2025-11-13T13:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugwk2ueIr2Ap0ZOM14h4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxhjByikIT6XjelopB4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyQtMWSdPPGLWANyZR4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "approval"},
  {"id": "ytc_Ugy-1rF-MAJhG79k3vd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgzkF1yVTDQTJiH20Rx4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugy5hK01GqmIAEfYY8B4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugw9tBSTa6sfDKd2xAd4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxhdOUHZxhUFVI-Xu54AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgwsxFMODuT1jE1doaZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgzO7O3TgA2dFLjPz5Z4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",     "emotion": "approval"}
]
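The raw response is a JSON array of per-comment codes keyed by comment id. A minimal sketch of how such output could be parsed and indexed by id, dropping records that are missing a coding dimension (the function name and the single-record sample are illustrative, not taken from the tool itself):

```python
import json

# Illustrative sample of the model output format shown above (one record only).
raw_response = """[
  {"id": "ytc_UgwsxFMODuT1jE1doaZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]"""

# The four coding dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse raw LLM output and index records by comment id,
    keeping only records that contain every coding dimension."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codes = parse_codes(raw_response)
print(codes["ytc_UgwsxFMODuT1jE1doaZ4AaABAg"]["emotion"])  # fear
```

This keeps the lookup step ("inspect the exact model output for any coded comment") a single dictionary access, and silently skips malformed records rather than failing on the whole batch.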