Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
At least you know that it's ai when you use it for therapy. What's really fucked up is when crisis hotlines use ai instead of trained human beings. Because using a shitass ai when someone is on the brink of death is a great idea!
youtube AI Moral Status 2025-06-14T14:0… ♥ 2
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzlS_yPIPsEi1eXVqZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy86YkqgPITjO_Wby54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxXdMklh9twEklg9JZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5IW1qFQuzyZMeHYd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZnQxGoxZKXodjwph4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxXeu8ukme9F4hPI-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgytATL7Dvuh_R5SvQV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCpgdOeXcYixrioeR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxk2CmlZBNDHFHMmA14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzF42IIG5Ve5E5CDgV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
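The raw response is a JSON array of per-comment codings keyed by comment `id`. A minimal Python sketch of how such a response could be parsed and a coding matched back to its comment (the variable names here are illustrative, not part of any pipeline; the two entries are copied from the array above):

```python
import json

# A truncated example of a raw LLM response: a JSON array where each
# object carries a comment id plus the four coded dimensions.
raw = """[
  {"id": "ytc_UgzlS_yPIPsEi1eXVqZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwZnQxGoxZKXodjwph4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for the comment shown in this view.
row = codings["ytc_UgwZnQxGoxZKXodjwph4AaABAg"]
print(row["policy"], row["emotion"])  # liability outrage
```

Indexing by `id` also makes it easy to detect comments the model skipped: any id sent in the prompt but missing from `codings` got no coding back.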