Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm sorry, but this video is largely nonsense, just like all of the fear mongering of AI that's being pushed by the same companies that are developing the technology. It's all about controlling the market, and controlling the information that people get from these large language models. It has nothing to do with the power of AI or the extinction of the human race. You've been duped into handing over control of this technology on the basis of fear. Check out the "AI Unchained" podcast if you want real, accurate information about AI development from people who actually understand and work with the technology. In particular, in episode 4 with Aleks Svetski, they talk about the true state of AI development and the fear mongering being used to control the direction of AI. Episode 11 is specifically about the fears of AI, although I haven't gotten a chance to listen to it yet. Much better than getting your information about AI from a clickbait 16-minute Youtube video.
youtube · AI Governance · 2024-01-17T14:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugz1KhbsYQqvecqPhVB4AaABAg.9zX9bLBi0sjAE7wCVA3rVP","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyFcMUsZT0UP8hvr_p4AaABAg.9zWOykd9I6d9zWeMGxVeWc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugx1w6eo6PErp5rXuip4AaABAg.9zWGu1vvBYD9zlnHZwiyQ8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgxX9CAPRU5xo4b1S8Z4AaABAg.9zW3Uv7Ccu6A03IMh4An5b","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugwmy2lzSJfalb310qx4AaABAg.9zTplyneYhR9zrqZV2PZnt","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx6ULAn7YeVS4aMauV4AaABAg.9zRQH43uN4u9zSzC6RQP1k","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxzdmJT0yPcRiFXM5V4AaABAg.9zNClGzQiqi9zX1H8zxs-f","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9z_xSfdQGSH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zfejqMdKET","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zria26NOi5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
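The raw response above is a JSON array of per-comment coding records, and the "Coding Result" table is the record whose `id` matches the displayed comment. A minimal sketch of how such a response might be parsed and validated is shown below. The `CODEBOOK` sets here are assumptions inferred only from the codes visible in this output (the real codebook may contain more categories), and the helper name `parse_raw_response` is hypothetical.

```python
import json

# Assumed codebook, reconstructed from the values visible in this output;
# the actual coding scheme may define additional categories.
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    rejecting any value that falls outside the assumed codebook."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {dim: rec[dim] for dim in CODEBOOK}
        for dim, value in codes.items():
            if value not in CODEBOOK[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = codes
    return coded

# Usage with a shortened, illustrative record (the real ids are much longer):
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytr_example"]["responsibility"])  # company
```

Validating against a fixed codebook at parse time catches the common failure mode of batch coding with an LLM: the model inventing a label that was not in the prompt.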