Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The “good scenario” with the human CEO with the AI assistant (which, surprise surprise, is only “good” for CEOs anyway) isn’t realistic. Because another company will come along with an AI CEO and kick their ass, because it doesn’t have the burden of the meddling human.
Source: youtube · Cross-Cultural · 2025-10-26T03:3…
Coding Result
Responsibility: company
Reasoning: consequentialist
Policy: regulate
Emotion: mixed
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxfhmyoGAkysqcZ2x14AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyXZ3CrDt7JcbbCr914AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwZxZHt_doU9eGCzUp4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyHgukDKgCaVYHWZJ94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwPtBqp7zAEg72vNG54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxE3r-zyq1mV3v9dwV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzY_GteqqcZU6fI5yt4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxwK_dpA2dT0RdGhbV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxWF5ve1Z4yJf2xMlF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxPBwIb7vjaW4__2tp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]
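When inspecting raw LLM responses like the one above, it helps to parse the batch and filter out entries whose codes fall outside the codebook. The sketch below is a minimal example of that check; the allowed values are inferred only from the codes visible in this response, so the real codebook likely contains more (an assumption), and `validate_batch` is a hypothetical helper, not part of the pipeline.

```python
import json

# Allowed codes per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only entries with in-codebook values."""
    valid = []
    for entry in json.loads(raw):
        in_codebook = all(entry.get(dim) in codes for dim, codes in ALLOWED.items())
        # Comment IDs in this export all carry the "ytc_" prefix.
        if in_codebook and entry.get("id", "").startswith("ytc_"):
            valid.append(entry)
    return valid
```

Running this over a raw response makes silent coding errors (e.g. an invented emotion label) visible before the codes reach the results table.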