Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Towards the end of the video. If you put 10 people in a room, they couldn't agree on what a good or ideal future for humanity would be, hence an AGI would also fail at the task. Different cultures will have different views on the future of humans, add in individual peoples wants/desire/needs and you'd never get anything close to a consensus. I suspect humans displaced by robots and AGI and therefore without purpose would produce a world filled with war, and civil unrest. For a frame of reference, look at what happens to the youth when too many of them are unemployed, they riot. I think it's naive to think that China only has altruistic motives when it comes to AI, they may say thing, I strongly suspect they doing the opposite.
Source: YouTube · AI Governance · 2025-12-05T22:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw2gwt2wtIDpWAdYmh4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgzlNicjP9oLwrQcufV4AaABAg", "responsibility": "none",       "reasoning": "mixed",            "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgwgokjebpqTJ3LbgZB4AaABAg", "responsibility": "company",    "reasoning": "virtue",           "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_Ugyi4u9DZVz67FKtxM54AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxlQh0gxU9UQWqGFGR4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxMhR0PaTWCXwUmJUN4AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxL8c_NHmXKZnDfRSp4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",   "emotion": "resignation"},
  {"id": "ytc_UgxzIsjX4J3eI1BtgJh4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz1XAfFy6YgrjXknOV4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxofVkH1qe_IlAqEiN4AaABAg", "responsibility": "ai_itself",  "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"}
]
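To trace a coded comment back to its entry in the raw model output, one can parse the JSON array and match on the comment `id`. Below is a minimal Python sketch; the two sample records are excerpted verbatim from the raw response above, and the function name `find_coding` is illustrative, not part of any tool shown here:

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw_response = '''[
  {"id":"ytc_UgxL8c_NHmXKZnDfRSp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxMhR0PaTWCXwUmJUN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

def find_coding(raw, comment_id):
    """Return the coding record for comment_id, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

# The record matching the Coding Result table above
# (responsibility=ai_itself, emotion=resignation):
coding = find_coding(raw_response, "ytc_UgxL8c_NHmXKZnDfRSp4AaABAg")
print(coding["emotion"])  # -> resignation
```

A lookup like this also makes it easy to flag mismatches between the rendered Coding Result table and the raw response, which is the point of inspecting the exact model output.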