Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This video has many issues... basically I heard a similar take 30 years ago about "the Internet". Yes there are transition issues, like learning for example what a junior developer should be. Yes, many companies reduced hiring, because the skills set they need is no longer available. Universities and training companies need to adapt curriculums fast. Upskilling existing workforce makes more sense.

With or without AI, code and technical debt depends and always depended on (human) software engineers. AI supported development can be as good as any human code or as best. "Vibe coding" is not software engineering, is code and solution exploration usable in specific situations. AI supported development workflows are not "vibe" coding. Things like spec driven development, and AI supported code review and analyses can shorten dev time and highly reduce bugs by finding more issue earlier in the process. Test coverage can be increased significantly with little investment, reducing technical debt. AI is not replacing devs, is enabling good software engineers write more good code. Quality depends just as much as before of good people.

The AI is not the main source for people losing their jobs in IT, especially not highly qualified ones, is more of a scapegoat. Head count reducing is the "only way" companies in US know to "reduce costs" and make companies look viable for IPOs, mergers and to hike the stock price. Reduce the workforce, hope is not going to explode before you get the next bonus, move on to a new company, repeat. Now management has a "reason" to reduce headcount. Sure, there are jobs, typical requiring low skill, which are and will became obsolete. This was true 10, 20 or 30 years ago too.
youtube AI Jobs 2026-02-05T18:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxsewdMCXRseYk6-I94AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzUnLrgW723InGgGlZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzc9gc_5pXsQbnV4Zt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzsy7Mskbmt8hrlmH94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxbzTW-OceVheB60fZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxre5cd-KcX0wvlsDR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugzg3X7qWoEB-AQwxGd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy4d4-XuK17NTRyEKV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3tWGSLkSJvATQ75V4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz1R2KqZKnyJXq3bUJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
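A raw response like the one above can be parsed and checked against the coding dimensions before the records are stored. The sketch below is a minimal example, not the pipeline's actual code; the allowed-value sets in `SCHEMA` are an assumption inferred only from the values visible on this page, and the full codebook may define more.

```python
import json

# Assumed allowed values per coding dimension, inferred from the raw
# responses shown on this page; the real codebook may be larger.
SCHEMA = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"approval", "indifference", "resignation", "outrage", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment ids in the raw output carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugzc9gc_5pXsQbnV4Zt4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_records(raw)))  # 1 valid record
```

Rejecting malformed records here, rather than at query time, keeps a stray hallucinated value (e.g. an emotion outside the codebook) from silently skewing the coded dataset.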