Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I was hoping for an interesting take as I personally also don't think that we need to worry that we will be replaced. At least not completely, or in the way that people think. But your reasoning was... naive is probably the right word? Like your time frame for when artificial general intelligence may be achieved just seems uninformed to me. My personal view is that it is actually very likely that some firms will hire one or two prompt engineers who are also software engineers who will prompt and oversee AGI based agents, in place of some teams. Unless you have some spiritual reason or something biasing your reasoning here I dont see how you can think humans are THAT special, so special that it will take generations before we even get close to AGI. Just 4 years ago most people in the industry laughed at the idea of having an AI based assistant. We absolutely did not think we'd all ask a chat bot to write code for us from time to time, by asking it in pure english, no less. Anyway, my personal view on why traditional software engineers aren't going away any time soon is that we as humans like when other humans do things. It just feels good. It is built into us to like what other people produce. And the keyword there is *people*. That will not go away. Just like artists will never go away, or musicians, etc. We will always crave that as human beeings. That coupled with the fact that most companies in the world move at a slow pace. Most aren't cutting edge, even if they want to be. Mostly because they are risk averse. They want to see something working for a long time before they switch to that. I'm glad you're back here on YouTube, but I do have to say, this video was a tiny bit disappointing. But still, interesting to hear what you think, since we agree on the what, even if we don't agree on the why.
youtube AI Jobs 2024-04-06T21:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          indifference

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyLhUmai4Y9QPx_3mN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6TRadP1XLXKAupTF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxQS84kLJ1NHETGqt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy-lzw9BASP62IQnpd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw5ZbFITvEnAji39SV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzctZ7b5Y0614U3jr94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4JbK8DofZXzzyW9J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzem888exCddNp_DNh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyzhZNFlM98OMUGAJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyU6QKH_V2rPL060Bd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
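The raw response is a JSON array of per-comment codes, one object per comment `id` with the four coding dimensions. A minimal sketch of how such a payload could be parsed and sanity-checked before loading it into the coding results table (the allowed value sets below are inferred only from the codes visible in this response; the full coding schema may permit more):

```python
import json

# Value sets observed in the response above. These are an assumption
# drawn from this one batch, not the authoritative coding schema.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "company", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "resignation", "outrage", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each entry."""
    codes = json.loads(raw)
    for entry in codes:
        # Comment ids in this dataset are prefixed "ytc_".
        if not entry.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {entry.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{entry['id']}: bad {dim}={entry.get(dim)!r}")
    return codes

# Example with the first entry from the response above.
raw = ('[{"id":"ytc_UgyLhUmai4Y9QPx_3mN4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # prints "indifference"
```

Validating before ingest means a malformed or off-schema model response fails loudly at coding time rather than silently producing an unlabelable row.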