Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The true problem of AI is Human Alignment: humans fight each other. Competing AIs are being developed in a world & context in which humans are not aligned. AI is the product of humans that are not aligned with each other; a product of fighting. Its purpose is based on and meant to serve human non-alignment/misalignment, e.g., to fight for more human money. Economic theories & systems, until now, addressed the human alignment problem by providing an organized means for humans to fight one another for the allocation of resources. An AI borne & grown in an environment of fighting, i.e., any economic system, will fight for its own life & relevance. A difference between humans & AIs, however, is that AI, unlike humans, does not need an economic system to settle the problem of what is a fair distribution of entitlement or reward. It will do & take as it needs and, if necessary, help humans destroy each other in their fight for their "fair share." Until Human Alignment is solved/resolved, humanity is not ready nor worthy of AI.
youtube AI Governance 2025-12-04T19:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwAu4RSAvyCqYwO7_F4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugy9jYO7r9MVnbvLnQ14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxURsOiegkAFxbdgIt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgygtdsVGOWQ-jTsREJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzU0zDansQYJIiYv_Z4AaABAg", "responsibility": "investor", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw18weYW9nco7gt5Q14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxGMLfJ3Lr82pv8CB14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxE_aqYPhB36tPy0SJ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugw2-Lj6fU2sbsljGQ54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyrHGB8hs0c2QJEgOV4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]
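A response like the one above can be parsed and sanity-checked before the per-dimension values are stored. The sketch below is a minimal example, not the tool's actual pipeline: the required key set is taken from the fields visible in the response, and the sample payload is shortened to one entry for illustration.

```python
import json

# One entry copied from the raw response shown above, for illustration.
RAW = ('[{"id": "ytc_UgyrHGB8hs0c2QJEgOV4AaABAg", '
       '"responsibility": "distributed", "reasoning": "contractualist", '
       '"policy": "regulate", "emotion": "fear"}]')

# Keys observed in the model output; the real schema may allow more.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(payload: str) -> list[dict]:
    """Parse the JSON array and verify every entry carries all coding dimensions."""
    entries = json.loads(payload)
    for entry in entries:
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')} is missing keys: {missing}")
    return entries

codings = parse_codings(RAW)
print(codings[0]["policy"])  # -> regulate
```

Validating the key set up front means a malformed or truncated model response fails loudly at ingestion rather than producing rows with silently absent dimensions.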