Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Lmao this fuckboi is saying that we’re summoning the demon with AI and now he wa…
ytr_Ugx1RIXXY…
I don't even know one IITian who developed any Large language Model that is base…
ytc_Ugx7o7h2u…
Loving this episode! It’s just like my experience with EveningHoney-where the AI…
ytc_UgzCLm8Or…
our idea of an optimal outcome to this seems to be utilizing ai for our benefit.…
ytc_Ugyn8z-ud…
Microsoft is laying off people as well. I don’t get it - isn’t there a need in p…
ytc_Ugwb81h2A…
This story always resonates with me because I used to have an addiction to chara…
ytc_UgyEk9DtQ…
ChatGPT has closed training data. It will not respond to you trying to correct i…
ytr_Ugzn23PEZ…
The robot Purpose was to pick up vegetables in boxes but untilllllllll (in ray w…
ytc_Ugx-r1pO8…
Comment
How do you know this? To me, it seems not at all unthinkable that software engineering as a profession could become obsolete pre-AGI, i.e. not by being fully automated by something with human-level intelligence, but by being made into something that hardly anyone is needed for any more. For instance, the people who used to light gas lamps for street lighting weren't replaced by robots doing their job (and indeed, it would be quite challenging to build such a robot even today), but by the lightbulb.
In software engineering, one could imagine similar scenarios. For instance, maybe people will interact with computers via a unified language interface run by an LLM, and the LLM will pick from a large suite of tiny tools to get done exactly what the user wants in that moment? And in most cases, new tiny tools could be developed by LLM-assisted users themselves?
This would probably not make software engineers as redundant as the lightbulb made the professional gas-lighter, but it could still drastically reduce the need for humans to develop logic specifically for each use case, and thereby reduce headcount massively. For the 90 percent laid off, that wouldn't be much different from total obsolescence of the profession.
Obviously, I could be spouting total nonsense here: I'm a mathematician by profession, not a software engineer, so I have very little directly relevant experience. But I think there is more than one pathway to the extinction of a profession, and I find it baffling how certain some people are that for *their* profession, only the AGI-automates-everything pathway is viable. I wouldn't feel safe saying this about mathematics, and in mathematics, current LLMs seem weaker to me than in software engineering.
reddit
AI Jobs
1712786397.0 (Unix epoch; 2024-04-10 UTC)
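The timestamp above is stored as raw Unix epoch seconds. A minimal sketch of converting it to a readable UTC date for display (standard library only; the variable name `raw_ts` is mine, not from the tool):

```python
from datetime import datetime, timezone

# The source timestamp as stored: Unix epoch seconds, as a float.
raw_ts = 1712786397.0

# Convert to an aware UTC datetime for display.
posted_at = datetime.fromtimestamp(raw_ts, tz=timezone.utc)
print(posted_at.isoformat())  # 2024-04-10T21:59:57+00:00
```

Passing `tz=timezone.utc` avoids the pitfall of `fromtimestamp` defaulting to the machine's local timezone, which would make the displayed date depend on where the dashboard is rendered.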
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[{"id":"rdc_kyzh4eo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_kz198op","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"rdc_kz0uw23","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"rdc_kyzk0w6","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_kyzvai0","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"})
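Note that the raw response above closes the JSON array with `)` instead of `]`, so a literal `json.loads` on it would fail. A minimal sketch of turning one such response into a lookup table keyed by comment ID, with that one repair applied first (`parse_coder_response` is a hypothetical helper, not part of the tool):

```python
import json

def parse_coder_response(raw: str) -> dict[str, dict]:
    """Parse one raw coder response into {comment_id: codes}.

    The model occasionally closes the JSON array with ")" instead of
    "]" (as in the response above), so repair that before parsing.
    """
    text = raw.strip()
    if text.startswith("[") and text.endswith(")"):
        text = text[:-1] + "]"
    records = json.loads(text)
    # Key each record by its comment ID for O(1) lookup from the UI.
    return {rec["id"]: rec for rec in records}

# Minimal example: one record in the same shape as the response above,
# with the same malformed closing delimiter.
raw = ('[{"id":"rdc_kyzh4eo","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"})')
codes = parse_coder_response(raw)
print(codes["rdc_kyzh4eo"]["emotion"])  # indifference
```

A dimension missing from a record (which presumably maps to `unclear` in the table above) would surface here as a `KeyError` on access, so a real pipeline would likely read fields with `rec.get(dim, "unclear")` instead.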