Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
But right now you are forcing reasoning embedding more information which translate to huge power demand. LLM by itself, cannot be AGI.
youtube AI Responsibility 2025-10-07T18:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwBJ8d5HzVCJx0mN0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx7K7WASO7TcM6rSlR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy5nh2nsfCJ_vnTldB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz5PvwcD0kXa2s67X94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyVDIg6F2mNQvgXhud4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzb6vSE0qAiwk3kE2J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxyex93TJQFLVTNqSZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPItc0V9Yv9towm0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyJKd_uyXubt-kaMY54AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugx0e5UFEoGArshWtMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
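The raw response is a JSON array of per-comment codings, so the coding for any comment can be recovered by indexing the array by `id`. A minimal sketch, assuming the field names shown in the JSON above (the `raw` excerpt below reproduces one entry; variable names are illustrative):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of codings.
raw = (
    '[{"id":"ytc_Ugy5nh2nsfCJ_vnTldB4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]'
)

# Index every coding by its comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding that produced the table above.
coding = codings["ytc_Ugy5nh2nsfCJ_vnTldB4AaABAg"]
print(coding["reasoning"], coding["emotion"])
```

This lookup reproduces the "Coding Result" table for the comment in question: the entry's `reasoning` is `consequentialist` and its `emotion` is `indifference`.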