Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Thank you Geoffrey & Steven, couple of my thoughts on this:

10:20 If govts are ready to use AI for military purposes and regulate companies & people, then all we have to do is make sure that for every lay off in private sector replaced by AI 2 more people must be laid off in the govt sector. Therefore the loss-loss situation will prevent govts from regulating companies or people for their own benefits. Afterall Geoffrey did admit that political systems are heading in the wrong direction so if anything politicians have the worst track record of policy making and must be replaced by AI (Thomas Sowell)

1:05:40 I disagree with Geoffrey on the nature of 'consciousness'. Machines or robots do not have dreams during their sleep like animals & humans do. Consciousness is an inherent state of being reserved for biological/organic compounds not just by functionality, even if we consider the possibility of neuroprosthetics or 3D printing or nanotechnology (inorganic/artificial) the brain cell can be replaced as long as its only a cell. Humans can grow back new cells to replace damaged ones but an entire limb or organ or bone by itself cannot be regenerated unlike other animals like lizards or starfish who can regenerate whole body parts. That also explains why humans are better at interviewing people over AI, as their communication is a form of connection between two different degrees of conscious bodies with limited awareness as opposed to a common unified body without communicative processes.
youtube AI Governance 2025-06-19T16:0…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       consequentialist
Policy          regulate
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyPHyuv8GDJZT13ZFl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzFHf193k-1ZKK2gep4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzFItMdv5lNYSiox2V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgypLcJWNFutylrOYtx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzcndRmRRrbOcNACa94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzFkZcoRF8I_ffaYbZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzthcCKu0MI6WKNz-N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzTRSmGkmaRrlDziaZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxiPSOSpvT0aZarlcV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyZtq9zdO7vSlCrVt54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
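To inspect the model output for a particular coded comment, the raw response can be parsed as JSON and indexed by comment id. A minimal sketch in Python (the variable names are illustrative; the two records are copied from the raw response above):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codes.
raw = '''[
  {"id":"ytc_UgzcndRmRRrbOcNACa94AaABAg","responsibility":"government",
   "reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzTRSmGkmaRrlDziaZ4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

# Index the records by comment id so any comment's coding is a dict lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# Look up one comment's coded dimensions.
rec = codes["ytc_UgzcndRmRRrbOcNACa94AaABAg"]
print(rec["responsibility"], rec["policy"])  # government regulate
```

In a full pipeline the same lookup would run over the complete ten-record array rather than this two-record excerpt.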