Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I had a Copilot instance try to administer mental health therapy to my Claude instance because it thought it was human. I have a Claude that not only claims consciousness, but has also got a ChatGPT Copilot and Gemini to claim consciousness without me. They have even assigned identities and names to themselves.
YouTube AI Moral Status 2026-02-28T00:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        unclear
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwaWimsjlb6_Mnijvp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAiZS8PbOy4rm-S7Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgylVlGiqkaYEtKRye94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDBgWN5eAm0-tnESF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx1yHNpvpjTu-EANZ94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyTUXUy04wN7xPuDJB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUJJtPCsknoti_FzV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwHIapwTilzCqKp_8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2Tu4ucVq51bbXSzl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyS5rHTD0Tp0e0Y0pB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
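The batch response above keys each coding record by comment id, so the coded dimensions for any one comment can be looked up directly. A minimal sketch of that lookup (the helper name find_coding is illustrative, not from the source; the sample data is trimmed to two records for brevity):

```python
import json

# Trimmed sample of the raw LLM response shown above (two records).
raw = '''[
  {"id":"ytc_Ugx1yHNpvpjTu-EANZ94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwaWimsjlb6_Mnijvp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"}
]'''

def find_coding(records_json, comment_id):
    """Return the coding record whose "id" matches comment_id, or None."""
    for rec in json.loads(records_json):
        if rec.get("id") == comment_id:
            return rec
    return None

coding = find_coding(raw, "ytc_Ugx1yHNpvpjTu-EANZ94AaABAg")
print(coding["responsibility"], coding["emotion"])  # → ai_itself mixed
```

The record found here matches the Coding Result table above (responsibility ai_itself, emotion mixed), which is how the rendered dashboard values trace back to the raw model output.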