Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wrong approach. You make this short clip about AI taking jobs and you have no real argument. "It's scary", "I will leave Zoom", that's not a reason for a manager or CEO to turn down "AI workers". No, this will happen and we all know it. AI, AI controlled robots… they will take many many jobs. The companies will stop at that point where savings are maximized compared to the losses due to people not being able to afford the products (because they are now unemployed or employed at a lower wage). If the companies aren't careful, they might even go beyond that point. Or they don't care at all, because the impact of that is like 5 to 6 years in the future, a lot of time for the managers to pack up their things and move to the next company with a big big bonus from the old company for all the savings and a big big signing bonus from the new company because of their previous success. This is the capitalism endgame. We all know it, most of us are just in denial about it. The right approach would have been to go with socialism or communism. Yeah, laugh all you want, but there at least the robots work for US, not the rich.
youtube · Viral AI Reaction · 2025-02-27T12:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugymsido4B5lYtBYakh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjU_Yc2dq4k8jPSzJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyafMHiodHuWeJTr894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugwugp1B6svfkdAhcuB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy--cJZJikupZYwU1x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxSuXeQnM5dR8Okeoh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw446iUmdXjvra6d614AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwHvDvAzBSu3VhGgZB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgybE1TxCCaz8ZGH4z14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyi0hvKTn5t8vUio7V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
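The per-comment coding table can be cross-checked against the raw batch response with a short script. A minimal sketch, assuming the response is valid JSON as shown; for brevity, the `raw` string here holds only the entry matching the displayed comment (id taken verbatim from the response above):

```python
import json

# One entry copied verbatim from the raw LLM batch response above.
raw = '''[
  {"id":"ytc_UgzjU_Yc2dq4k8jPSzJ4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

records = json.loads(raw)

# Look up the coding for the displayed comment by its id.
target = next(r for r in records if r["id"] == "ytc_UgzjU_Yc2dq4k8jPSzJ4AaABAg")

# These values should match the coding-result table rendered above.
print(target["responsibility"], target["reasoning"],
      target["policy"], target["emotion"])
# → company consequentialist none resignation
```

With the full ten-entry array in `raw`, the same lookup retrieves any coded comment by its id.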