Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Another outcome I can imagine is humanity becoming something akin to crows living in large cities -- crows can make a living in such environments, they may even prefer them, but crows, clever though they are, have _no idea_ how anything around them works, how it was built, _that_ it was built, nor do they have any idea what the hell humans are doing going about like they do; they don't even know that they don't know anything about the human civilization economy, science and engineering, etc. We may end up in an absolutely baffling world that makes extremely little sense to us, that we can nevertheless get by in. And we may only occasionally be "managed" by the AI civilization when and where we are a nuisance. I guess there are worse possible outcomes.
Source: youtube · AI Governance · 2023-06-28T07:4… · ♥ 1
Coding Result
Dimension       | Value
--------------- | --------------------------
Responsibility  | ai_itself
Reasoning       | consequentialist
Policy          | unclear
Emotion         | resignation
Coded at        | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwvFU5BKq0WWt53Omp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyVKsTelwk9yglYzrV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxPb-0iY4pi7iqejet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHCTk_4zxgU8wyWEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzzypf_v-asdoe7_Nh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIX-TJyadxzvqSmI94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxfe8sR2whVY9uI4Jd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwETgxNS3mPOdvrjtV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
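The raw response is a JSON array of per-comment records, and the coding result above is just the record whose id matches this comment. A minimal sketch of that lookup (using a two-record excerpt of the array shown above; the `DIMENSIONS` tuple is an assumed list of expected fields, not something the tool itself exposes):

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw = '''[
  {"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions we expect each record to carry (assumption).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Index records by comment id, then pull out the record for this comment.
records = {r["id"]: r for r in json.loads(raw)}
coded = records["ytc_UgyKKv9bE8upTa5Sbyl4AaABAg"]

# Validate that every dimension is present before trusting the coding.
for dim in DIMENSIONS:
    assert dim in coded, f"missing dimension: {dim}"

print(coded["responsibility"], coded["emotion"])  # ai_itself resignation
```

Validating the fields before use matters here because LLM output is not guaranteed to be well-formed: a malformed array or a missing dimension should fail loudly rather than silently produce an empty coding.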