Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@jingwentang6768 I appreciate your detailed reply. You bring up important points about human roles in defining context, ethics, and meaning. However, I suggest that even these areas may not stay uniquely human over time. If AI systems advance to the point where they can accurately model human values, goals, and emotions, then the processes of problem definition and ethical alignment could also become automated. This wouldn't be because AI feels emotion, but because it can effectively mimic the decision patterns that emotions create. You are correct that value exists only in relation to human experience, but that’s a philosophical truth, not an economic one. The labor market does not reward meaning; it rewards efficiency and output. Once AI can perform all productive tasks, human involvement becomes optional in economic terms, even if it remains important in a cultural or existential sense. So yes, humans may continue to create or express themselves, but these activities would resemble art or leisure more than work. The structure of society would change fundamentally...not because we fail to adjust, but because there’s nothing left that needs us.
youtube AI Governance 2025-10-12T01:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugyr-mnpIEmxS1lNH014AaABAg.AO1L3k0Lf_UAO1LWcJ5S8_","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxf3PgpfIPzVsibBjt4AaABAg.AO-B1BzFdazAO-bdbozWNg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugxf3PgpfIPzVsibBjt4AaABAg.AO-B1BzFdazAO8pECf1k9N","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugxf3PgpfIPzVsibBjt4AaABAg.AO-B1BzFdazAO9iuDLMhBj","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugwvm5KHXnOMrbaDt5B4AaABAg.ANzjD9IvH_OAOQLye8KS6g","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugwvm5KHXnOMrbaDt5B4AaABAg.ANzjD9IvH_OAOTqXxHIt99","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugzc83LmYxIU71H0qHl4AaABAg.ANomRHTCw-sANpAIaZa7aN","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgwacX0QDI0pBVfqaAp4AaABAg.ANi2ubOpxg3ANkENqw2ryY","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgwBTPaLoIbdseVxDqJ4AaABAg.ANe_livHtP-ANehRfapwqS","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx1repfljJV3utd2pd4AaABAg.ANUj-I5E5nVANUx0JBMaRZ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
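A raw response like the one above can be mapped back to per-comment coding results by indexing on the comment id. The sketch below assumes only the structure visible in the response (an array of objects with `id` plus four coding dimensions); the helper name `index_by_id` is hypothetical, not part of the coding pipeline.

```python
import json

# One record copied verbatim from the raw response above, as sample input.
raw = '''[
  {"id": "ytr_Ugxf3PgpfIPzVsibBjt4AaABAg.AO-B1BzFdazAO-bdbozWNg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "indifference"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(records):
    """Map each coded comment's id to its four coding dimensions."""
    return {r["id"]: {dim: r[dim] for dim in DIMENSIONS} for r in records}

coded = index_by_id(json.loads(raw))
result = coded["ytr_Ugxf3PgpfIPzVsibBjt4AaABAg.AO-B1BzFdazAO-bdbozWNg"]
print(result["reasoning"])  # consequentialist
```

Indexing by id lets the tool join the model's batch output back to the original comments, which is how the single coding result shown above is displayed for one comment out of the batch.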