Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI therapy... that can only go horribly wrong. Patient: Chat, I'm depressed, what should I do? Chat: Jump. Patient: I'm having trouble sleeping, what can I do? Chat: Take 50000mg of melatonin. Never, ever take advice from something that hallucinates worse than you do. It could be OK to take advice from your hallucination, though. It is, after all, you.
youtube AI Moral Status 2025-06-08T20:3…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           ban
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugz6Yfqaq2KTwmglsXl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQCneo8PuE6qyhTDF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwTzsBsZXhuB-lKWd54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyL6NWvauqnoNDUusV4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzQPMrg8d2z22w0wCN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyO9TjgNqa4Oqu_sNN4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyrYlfLRaAcdJOQohp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxZUl3CkMaNicNwL1R4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgybVzQsaFisDPrzxxB4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwXzQ3WGwOzJPm9nhd4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
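The per-comment table above is recovered from the raw response by matching on the comment id: the entry for ytc_UgybVzQsaFisDPrzxxB4AaABAg carries the ai_itself / deontological / ban / outrage coding shown. A minimal sketch of that lookup, assuming the raw response parses as a plain JSON array (the function name `coding_for` is illustrative, not part of the pipeline):

```python
import json

# Raw LLM response: a JSON array of coded comments (abbreviated to two
# entries here; the real response contains one record per comment).
raw_response = '''
[
  {"id": "ytc_UgybVzQsaFisDPrzxxB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwXzQ3WGwOzJPm9nhd4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
'''

def coding_for(comment_id, raw):
    """Return the coding record for one comment id, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = coding_for("ytc_UgybVzQsaFisDPrzxxB4AaABAg", raw_response)
print(coding["emotion"])  # outrage
```

Parsing the whole batch once and indexing by id keeps the inspection view in sync with the exact model output, so any mismatch between the table and the raw JSON is immediately visible.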