Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The way that tech companies are trying to shortchange developers in the hiring process is just proof positive that all these tech bros who say how good AI will be for the future are lying through their teeth and anyone who believes a word of their crap is a fool. If AI is supposed to aid productivity and by extension lead to an increase in revenue and profits, that should mean that there should be MORE money to pay the humans who are ensuring that this revamped system that is stronger, faster and more robust. It's just more money to line the pockets of the billionaires who don't give a damn about you. To take the quote from this video and put it another way, yes AI is doing 40% or more of the work, but the human element has become EVEN MORE important and valuable because if the entity doing almost half of the heavy lifting falters of fails, you are SCREWED big time. Developers should seriously consider starting their own tech offering as a collective where they can pay themselves what they're CLEARLY worth and develop products that can blow the existing products out of the water and utilise AI as the tool it was initially sold as.
Source: youtube · Video: AI Jobs · Posted: 2026-02-09T07:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          regulate
Emotion         outrage

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugxui8cIot1qrZZk3IZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwBE1Y-gYB-UPGkwkd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy92DvptdXVXuLxRtp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxXAcfiAzm-OL7j2CR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwohvjPt5ummu6qGMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJcN9QvKGrqnDOC8d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwxnwq4Shg3_QefrEV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxcHUc4ibUFuEE6Y2V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw6NxCVkvr4_0O5hgx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzkTYd-L3Ja__cc62Z4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
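When inspecting raw LLM output like the batch above, it helps to validate each record against the coding scheme before trusting the coded values. Below is a minimal sketch of such a validator; the allowed value sets are an assumption inferred from the records shown here, not a documented schema, and `validate_records` is a hypothetical helper name.

```python
import json

# Assumed code scheme, inferred from the records above (not an official schema).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_records(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors (empty if clean)."""
    errors = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    for rec in records:
        rid = rec.get("id", "<missing id>")
        # Comment ids in this dataset appear to use a "ytc_" prefix.
        if not str(rid).startswith("ytc_"):
            errors.append(f"{rid}: unexpected id prefix")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"{rid}: {dim}={value!r} not in code scheme")
    return errors
```

A clean batch yields an empty error list; any out-of-scheme value (e.g. a hallucinated emotion label) produces one error line per offending dimension, which is easier to audit than re-reading the raw JSON by eye.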