Raw LLM Responses

Inspect the exact model output behind any coded comment.

Comment
Best case scenario is that we automate literally everything so well enough that we can support a large population of people just slacking off. How do we get there? Hope the people in control of automation aren’t greedy.
youtube · AI Harm Incident · 2024-07-29T00:2… · ♥ 3
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_Ugxm7ENojjvkF12DvLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgyZMhLpDCn4D5jSUZt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgxHPTbCD7wyErFM7JN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwUKSeAiYp0lRJ5jqh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxKlPjNG7aKu82glWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgzDtVKFLRUw7SZ4DbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugx2rbKNDHKC-hUyz_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugy7_X9JdCJW5fuRos94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"}, {"id":"ytc_UgytNdnkuR-ZnmMgq9Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgxU9dPXuKcGOp-LgB14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"} ]