Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The likelihood of humanity being conquered by our own inventions even AI is extremely unlikely even in the far future i'd say, (we're stupid but we're not that stupid) there are simply far to many variables to consider when eradicating mankind. even if an AI gained access to every drone/satellite on earth, as long as we had control over manufacturing it would be a war of attrition that we'd win in the end. i can definitely see AI controlled war machines becoming the next frontline war fighters however. but would that cause WW3? that can still only happen if nuclear deterrence is rendered useless. which for the time being it can't. a common mistake people believe is that nuclear missiles are useful... they aren't, practically any super power can counter it easily. the real weapons are nuclear stealth bombers dropping cold bombs from extreme atmospheric altitude that cannot be heat tracked. and once it's out of the cargo bay nothing can detect or stop it. so unless humanity develops a weapons system that can track and take down a GBU or a MOAB, nuclear deterrence is still a thing. we'd bloody invent an artillery system capable of lobbing nuclear shells if we had to.
youtube 2018-04-04T01:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgyM3zIQoMSYH_X15Tx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy961k_AinFEquAaxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxxUtZNFyNeZ3z3qlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugw_MLv1pcRHLOgVJf54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugwqj1F3WZox5LNUAF14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgyTwD3sBU7G3mcat-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzL4rfeRN4o6QcQQM94AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"mixed"}, {"id":"ytc_Ugzo2JkbVgM_-Wk-5G54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"resignation"}, {"id":"ytc_Ugx7qrgN7od16Lv9fv54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugxag3HFEFXZs-RBDLd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"})
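Note that the raw response above is not valid JSON: the array closes with `)` instead of `]`, which likely explains why every dimension in the coding result was recorded as "unclear" despite the model emitting plausible labels. A minimal sketch of a tolerant parser (assuming a Python pipeline; the function name and repair heuristic are illustrative, not the tool's actual code):

```python
import json


def parse_llm_response(raw: str) -> list:
    """Parse an LLM coding response, tolerating a stray ')' array terminator."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Observed failure mode: the model closes the JSON array
        # with ')' instead of ']'. Repair and retry once.
        repaired = raw.strip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        return json.loads(repaired)  # re-raises if still malformed


# Example with the same malformation as the response above:
codes = parse_llm_response('[{"id":"ytc_x","responsibility":"ai_itself"})')
```

A pipeline that falls back to "unclear" on any parse failure would benefit from logging the raw string alongside the exception, so pages like this one can distinguish a refusal from a one-character formatting slip.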