Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think you're pretty correct about it. My opinion is that the innovative areas requiring highly talented and skilled engineers will become more based around the field of AI study itself, whereas those being paid to build solutions outside of the field of AI will see specialization roles collapse into one another over time, and eventually you will be left with a product engineer who spans the spectrum. Reminiscent of the days of webmasters. One thing we can expect is that even if AI could produce an entire software solution with minimal input and effort from humans, mutations to requirements over time and the reduction of these implementations with minimal side effects will likely be an immensely difficult problem to solve with AI, even AGI. It's even difficult with highly skilled senior engineers (humans). It will likely follow a similar path to low code platforms until there is enough cohesion for AI to produce results in a fully conscientious way. In which case, most jobs would belong to AI at that point and we would be in Dystopia. You would likely see server farms being destroyed by angry mobs at that point, which is why they want to take steps to slow down the progress. Then people have time to react and adapt. It's just a pity that not everybody considers this important.
youtube · AI Jobs · 2024-01-23T18:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgzfjT2cjcILsFCg_Dt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugx00X5I6lbBbVVJzIR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugw8edV2_awfMfnMUbh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgyiXYfYSERsz2Tvk0t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgzlkG2ffP6X0S7hNv54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugw4rpnhq3R6Zcaj2sJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwLEPAJA03Rz_hkNHp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgxdgdUPWo8D_qHD5jF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgxJjYCs3HMDNkEqt1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgzhvxTO9KExVreBLUV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]