Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm going to treat AI as I do any other tool. No amount of thank yous will alter how it processes info. If I want it to talk to me a certain way all I have to do is command it, no "thank you" necessary. An ax doesn't stop being an ax no matter how I treat it. So a computer program won't stop being a computer program no matter how I treat it. I respect my tools because I want longevity, not because It's "alive".
youtube AI Moral Status 2025-04-05T19:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugxcl5Gm5gmGebn4ZuR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwofssQtEpNj91SnDp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxlsPtRgEE2gA400ml4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgysjU1al9jbqAPlpFd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx0YIPlFqC0PhIzFUl4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx-RITWLnxh6kqDOup4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxjFV9YkG6v8l5k0Fp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzww4Q7Eu4Z1gK8g9N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxDsiqybE_Xc2cFe8d4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugylr7j8RY8vBO-AeWN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
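The raw LLM response is a JSON array with one coding record per comment id, and the per-comment result shown above is just the entry whose id matches. A minimal sketch of that lookup in Python (using one record copied from the response above; the variable names are illustrative, not part of the tool):

```python
import json

# A fragment of the raw LLM response: one coding record per comment id.
raw = '''[
  {"id": "ytc_UgwofssQtEpNj91SnDp4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

# Index the records by comment id for direct lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Retrieve the coding result for the comment displayed above.
coding = records["ytc_UgwofssQtEpNj91SnDp4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
# → user deontological indifference
```

If the model returns malformed JSON or omits an id, `json.loads` or the dictionary lookup will raise, which is how a record ends up uncoded rather than silently wrong.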