Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They all secretly want to keep going. They argue others won't slow down and do it safely so they have to keep trying to move quickly as safely as possible, if they don't others will blah blah blah And funnily enough they're right. I see no indication this will be internationally regulated until as he said, a chernobyl like disaster if we're lucky enough to have that rather than a full blown catastrophe.

What I dont think gets admitted to is no one or very few people doing this want to slow down, and I even have a bit of that itch of wanting to see what these things can do within my lifetime It's like a dirty desire. That sits in the back of my head, a bit of astounded awe at what we could be building here

And if asked ill say "yeah I want more red tape, we need to take this slow and safely" but inside the fact we're going forward in this unregulated full speed ahead kinda way Has me excited by the prospect I may see this technology go super intelligent in my lifetime. It's a greed, but not financial greed. And I think its that type of greed for most of these people, the tech would produce alot of money even if we moved more slowly But the idea of building a superintelligent ai and seeing what it can do, is I think the greed that drives this for the most part. Or is what leads to that inner secret desire to see what happens when they publicly mournfully say we have no choice
youtube AI Governance 2025-12-04T19:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxE4a-KD_las6D1VfV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwMlgqzDo-h0vVMxdh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzP9FacV20G1hwo3RR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzuRY4XZkZVz5B4NIh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgznoDk0NLxnmLB3_zR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxhIHaWwE08d4HAPoF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzSXcGENZyTC0RPJvJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzMIi3IDyFOjMcWrVB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxo2HNBmiOK1bLSa7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzUnmZ_u6XYc2z5lyZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
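The raw response above is a JSON array with one object per coded comment, keyed by comment id. A minimal sketch of how such a batch response can be parsed and a single comment's coding looked up (the variable names are illustrative, not from the actual tool; the field names and ids are taken from the example above, truncated to two entries):

```python
import json

# Raw batch response from the model: a JSON array, one object per coded
# comment. Truncated here to two entries from the example above.
raw_response = """
[
  {"id": "ytc_UgxhIHaWwE08d4HAPoF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate",
   "emotion": "resignation"},
  {"id": "ytc_UgzSXcGENZyTC0RPJvJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

# Index the codings by comment id so any comment's row can be looked up.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# The coding shown in the table above for this comment id.
row = codings["ytc_UgxhIHaWwE08d4HAPoF4AaABAg"]
print(row["responsibility"])  # company
print(row["emotion"])         # resignation
```

In practice a model may wrap the array in extra text or emit invalid JSON, so a real pipeline would validate the parse and the expected fields before indexing.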