Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The main reason is because there’s no way to end development at this point. That answer is simple and likely “best” but unrealistic. The only way to stop AI development is if collectively the entire planet agrees not to, and that’s not going to happen because there are enough people who want the money and power that might come from developing/owning the models. Not only that but there are plenty of people who believe the long term good of AI will outweigh the negatives. Thus all we can do is focus on solving the negatives rather than eliminating the tech entirely.
youtube AI Responsibility 2025-12-20T21:2…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgwUUU0FJr1qb7YP1-l4AaABAg.AR2_VjTelVKARB0v3DUPdn","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzG7dbDUeGOQZdHAJV4AaABAg.AQppNN1Uj-BATym_qs_cHa","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgxEI3fKyCOLXnd-3a14AaABAg.AQ2GcB6PKP9AQ2HdgSDy6r","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwGMvDU00_X8Tfk2794AaABAg.AOrqT_6G3_JAQyX4AIA5aF","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzLD8dm2UO2ax5PMUp4AaABAg.ALFzRlAhKL6AOBv9u3DbGX","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzLD8dm2UO2ax5PMUp4AaABAg.ALFzRlAhKL6AOCSQoU3yC8","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugyu9-hAWphi7g35oUR4AaABAg.AKFoAwqTAFQATa7uPmgfVy","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgwxULmQdzOA0lqwB9B4AaABAg.AIth_F0MizHALYmIcI-uiS","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyxYCyu1kR3N4-Hip94AaABAg.AIli_xiOogkAJ6kwRXkQ2B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxx1QxRAsLE9FI4mkt4AaABAg.AI0HEwE5S0xAIRPWpURcFA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
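A minimal sketch of how a raw response like the one above can be parsed back into per-comment codes for inspection. The `raw_response` string is truncated to two entries for illustration, and the lookup code is an assumption about how such output might be consumed, not the pipeline's actual implementation.

```python
import json

# Raw model output in the format shown above: a JSON array of objects,
# one per comment, keyed by the comment id (truncated to two entries here).
raw_response = """
[
  {"id": "ytr_UgwGMvDU00_X8Tfk2794AaABAg.AOrqT_6G3_JAQyX4AIA5aF",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzLD8dm2UO2ax5PMUp4AaABAg.ALFzRlAhKL6AOBv9u3DbGX",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "industry_self", "emotion": "approval"}
]
"""

# Index the coded rows by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for one comment, e.g. the one displayed above.
row = codes["ytr_UgwGMvDU00_X8Tfk2794AaABAg.AOrqT_6G3_JAQyX4AIA5aF"]
print(row["responsibility"], row["emotion"])  # distributed resignation
```

Keying by `id` makes it straightforward to join the model's codes back onto the original comment records when auditing individual codings.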