Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The progression of AI assumes all things remain the same. Catastrophes do happen: an asteroid or meteor event would set back the timelines and abilities; an EMP event would reset electrical objects; solar activity might cause solar flares that dissolve the AI, along with energy problems and cooling problems. The safety of AI choosing human life over itself is in question; a prime directive of the sacredness of life, to do no harm, has to be addressed, instead of money choices made to extract extreme profits. AI in effect becomes a god to whom humanity would bow down instead of God. The real singularity is when God comes to settle all matters, since AI has replaced God in decisions that affect us all. Our problems are often self-inflicted: wars, struggles over resources; definitions of morality and ethics are not part of AI programming. The conclusion is that trusting AI to make all decisions, in a belief in its omniscience, instead of allowing it merely to assist humans, is the real threat. Something created by man can only attain the level of intelligence and logic of what created it. AI can never exceed God.
YouTube · AI Governance · 2025-09-20T23:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy662Xt-U5-tDjhKBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwfYDOUrDG_cpcFlo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfNV04gXT_FT6oFTV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgytXRDeTfgRIv_7XMR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2VNY9qamo1MOiXDJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy7VR1FdpwLvmiYqF14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwqV5Sn586lAAbviQR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw-6wj_L6iczL4_cs94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRE3_1OHXGiGyxxVt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDch4wa4qucZ3jNrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
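The coding-result table above is derived from this raw response by parsing the JSON array and looking up the entry whose `id` matches the displayed comment. A minimal sketch of that lookup (the `raw` string is truncated to two entries here for brevity; the structure is identical to the full response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, truncated to two
# entries for this sketch. The real response contains ten objects.
raw = """[
  {"id":"ytc_Ugy662Xt-U5-tDjhKBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyDch4wa4qucZ3jNrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# Index the codes by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for the comment displayed on this page.
code = codes["ytc_Ugy662Xt-U5-tDjhKBt4AaABAg"]
print(code["responsibility"], code["emotion"])  # ai_itself fear
```

This assumes the model returns well-formed JSON; a production pipeline would also need to handle malformed output and missing ids.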