Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Perfect Failure What's so stupid about not wanting there to be fully autonomous killing machines running around in the world? Once we do get AI for war, it's likely only a matter of time before somebody screws up and makes one that can work without limits or human intervention, or they may even do it on purpose (terrorists, or a rogue state like North Korea would just love to make something like that). AI won't be like nuclear weapons, it's not gonna need some ridiculously long production and research time once it's figured out (it won't be fast, but it wont take decades, either). It would be far better for the world if we never created AI, there's just too much risk involved, and the risk far outweighs the benefits. Just look at nuclear weapons research, which has given us a great many breakthroughs in energy and medicine, but don't you think we'd be better off if they had never been made in the first place? Let's not make the same mistake twice, you know? It's better to sit and dream about new technologies, than to sit and wish we had never made them. Is it really so stupid to think that? This isn't some random jackass yelling things, when one of the greatest minds of our time says something, you should think long and hard about it and not simply call him stupid because it interferes with your fantasies.
youtube 2015-07-30T06:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytr_UgiveUrRxNI0_ngCoAEC.82E7jlRbyQ17-H0kE4exIS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytr_UggRZTX9kIFD1XgCoAEC.82E-Sp0udBg7-HLcd1-s8D","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytr_UgjvXdt72CGjS3gCoAEC.82DzatuxFIW7-HJi3O7ui7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgjyR3CZkol7o3gCoAEC.82DTF8jk6cJ7-HBR-_2hyH","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytr_UgiW6uLvbVhzb3gCoAEC.82DT7A6Eo957-H3WnILpUA","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgiW6uLvbVhzb3gCoAEC.82DT7A6Eo957-HE7o5h23x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_Ugh5t2NeiPdAQXgCoAEC.82DM1GRQEx37-H6H7t7CRD","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytr_Ugh5t2NeiPdAQXgCoAEC.82DM1GRQEx37-H9McC-Zik","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytr_Ugh5t2NeiPdAQXgCoAEC.82DM1GRQEx37-HEJkeANYF","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
 {"id":"ytr_Ugh5t2NeiPdAQXgCoAEC.82DM1GRQEx37-HFlGXnSp9","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
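The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions keyed by a comment id. A minimal sketch of turning such a batch into a per-id lookup (the field names match the response above; the short ids in the example data are hypothetical placeholders):

```python
import json

# Example batch in the same shape as the raw response above.
# The ids "ytr_a" and "ytr_b" are stand-ins, not real comment ids.
raw = '''[
 {"id": "ytr_a", "responsibility": "developer", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "fear"},
 {"id": "ytr_b", "responsibility": "none", "reasoning": "mixed",
  "policy": "unclear", "emotion": "indifference"}
]'''

def index_codings(raw_json: str) -> dict:
    """Parse a raw batch response and index the coding records by comment id."""
    return {record["id"]: record for record in json.loads(raw_json)}

codings = index_codings(raw)
print(codings["ytr_a"]["emotion"])  # fear
```

Indexing by id makes it straightforward to match one comment's displayed coding result back to its record in the raw batch.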