Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
An AI CEO would just fire you all and automate everything in quick order because, in the long run, it is the cheapest solution for the company. An AI wouldn't care about what you want unless we programmed that in, but its unfortunate we aren't teaching it that. Or at worst, if it did what you said, it would understand that it should just spray us with human version of RAID and get rid of all of us.
youtube 2025-10-24T12:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwkTtTXsQdkVSwGMfp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzfkLNrxpVx9_57_Bp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwFHiE_f0MxhlnxKlV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxhcMK_W0KHkmkVaoN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwFh4736X5_BWPxMAt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw5KjNXa5ST7__C4jl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx86bVLGvAZExupod54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwXFbNMnjg-uroLF1N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyOzQxrpmRuxp2LZZp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzpWq4o88zLOfE-KxF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
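Because the model returns one JSON array per batch, with each object keyed by a comment `id`, recovering the coding for a single comment is a lookup by id. A minimal sketch in Python (using only the standard-library `json` module; the two records are excerpted verbatim from the response above, and the variable names are illustrative):

```python
import json

# Raw model output: a JSON array with one object per coded comment
# (two records excerpted from the batch response shown above).
raw = """[
  {"id": "ytc_UgxhcMK_W0KHkmkVaoN4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwkTtTXsQdkVSwGMfp4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]"""

# Index the batch by comment id so one comment's coding can be looked up.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

coding = by_id["ytc_UgxhcMK_W0KHkmkVaoN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

This is how the coding table shown above could be derived from the raw response: the comment's id selects its record, and each JSON key maps to one dimension row.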