Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We don’t fully understand our own intelligence, our own brains. We don’t even know, percentage-wise, how much of what there is to know we actually do know. It seems the wrong moment in the timeline of human evolution to be the creators of a brain, the creators of an intelligence, when we don’t even understand our own. It is unlike any tool we’ve ever created: because we don’t understand our own intelligence, in creating artificial intelligence we don’t understand what we are creating. I suspect we’ve fumbled along our entire evolutionary timeline with our fingers crossed that all will be fine. Hopefully that blind optimism continues to be the correct approach, because it seems to be the only doctrine governing the development of AI, which is happening in the absence of regulation or any universally agreed-upon guiding principles.
YouTube · AI Governance · 2025-07-20T23:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz9H8MYyduJp8z_r714AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyLdbxEtMrkFN3K0gd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugzxu7fDMkIKqCP8qux4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzLVA3gbuHOrNeiWZx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwDiHsmn2rPJTd-Hz54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwAHtK05Ka0exQuIcd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxqmRS_EtBdQbqZV0J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhdF1ZXjSflwbdW6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwfi3YSk4VhOAO4gv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzWyX_n8I9HPLV8Ewd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
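A batch response like the one above can be checked before the codes are trusted. Below is a minimal validation sketch; the allowed label sets are inferred only from the records shown here (the real codebook may contain more values), and only two records are embedded for brevity:

```python
import json

# Two records copied from the raw response above; in practice, parse the full array.
raw = '''[
  {"id":"ytc_Ugz9H8MYyduJp8z_r714AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzWyX_n8I9HPLV8Ewd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]'''

# Label sets inferred from the records shown; assumed, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "none", "ban", "regulate"},
    "emotion": {"outrage", "resignation", "fear", "approval", "indifference"},
}

def validate(records):
    """Report records that are missing an id or use an out-of-set label."""
    errors = []
    for rec in records:
        if "id" not in rec:
            errors.append(("<no id>", "missing id"))
            continue
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec["id"], f"bad {dim}: {rec.get(dim)!r}"))
    return errors

records = json.loads(raw)
errors = validate(records)
print(f"{len(records)} records, {len(errors)} errors")
```

Validating up front catches the common failure mode of LLM coders drifting outside the label set, so bad codes surface as errors rather than silently entering the results.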