So instead of speculating on whether or not the jobs disappear, I'd like to open a discussion on what the next step would be assuming they do disappear. This can either be personal or what you think is the most logical transition. Additionally, we're assuming worst case scenario here: you're not getting another standard software position (you may interpret this as you wish).
If a company can now move faster because code gets deployed faster, what are companies gonna do with the gained time? Well, develop more features, right? And they'll probably find that they now need to deliver really complex features to beat their competitors. Perhaps we'll soon find that while AI can generate code just fine, it can't yet generate truly complex systems (i.e., AI cannot generate code that doesn't at least somewhat resemble its training material… so basically you don't know what you don't know).
I think if software developer jobs are taken over by AI, software developers will still be employable doing something related to software development. Perhaps not writing “raw code” anymore, but definitely using AI to meet customer needs. If anything, I think AI will leave companies needing more developers (we’ll need to find another name for ourselves). Fifty years ago, code wasn’t what companies needed; they wanted to solve business problems… but it turns out you need code for that. AI doesn’t change that, I think: we may not write code anymore and instead use AI to solve business needs, but we still need people to operate that AI.
AGI is a different thing, though. But then I don’t think we are close to that.
assembly -> C
C -> python
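To make the abstraction ladder concrete, here is a small sketch (in Python, with the lower rung written in a deliberately C-style way) of the same computation at two abstraction levels — each step up hides more of the machinery:

```python
# Hypothetical example: summing a list of numbers.
nums = [3, 1, 4, 1, 5]

# C-style: explicit index bookkeeping and a manual accumulator.
total = 0
i = 0
while i < len(nums):
    total += nums[i]
    i += 1

# Idiomatic Python: the loop and accumulator are hidden behind sum().
assert total == sum(nums) == 14
```

The point of the analogy is that each transition kept some people working at the lower level to confirm it behaves as expected, while most moved up.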
With some people still around to confirm the lower-level parts are working as expected. I've already moved more toward describing the technical solution rather than implementing it. Hopefully this puts me in a good place.
Why will programming still be our pivot? There is no perfect software. Every program has resource constraints and unique features. High-level constraints include choosing app submodules to please users; low-level constraints exist too. Who gets to decide to turn off the garbage collector for performance? Yes, in some respects the process may become more declarative. But only you know your constraints and trade-offs, and you need to specify those constraints for a hypothetical perfect coding AI. The deeper you start to specify, the more you start programming. Maybe you'll even use "while" constructions in natural-language programming.
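As a concrete instance of the garbage-collector trade-off only you can decide on: Python lets you switch off its cyclic collector around a latency-sensitive stretch of work. A minimal sketch, assuming your allocation pattern makes the pause acceptable (`process_batch` and the doubling body are hypothetical stand-ins for real work):

```python
import gc

def process_batch(items):
    # Disable the cyclic garbage collector so collection pauses
    # don't interrupt the hot path, then re-enable it afterwards.
    gc.disable()
    try:
        return [item * 2 for item in items]  # stand-in for real work
    finally:
        gc.enable()

print(process_batch([1, 2, 3]))  # [2, 4, 6]
```

Whether this is a win depends entirely on your workload — exactly the kind of constraint you would have to spell out to a coding AI.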
If we imagine AI knowing all constraints and human needs, we enter sci-fi territory. I think we could collectively write a pretty decent sci-fi novel about this in the comments.
But in the end, the world is very complicated and this is only an opinion.
P.S. Right now I'm going to write code in a completely non-declarative style, and have fun doing it!
The code itself is an artifact of the process of theory building while solving problems using computers. As time goes on, and new wrinkles of the problem space are found, the programmer has to manage the process of revising and extending the theory to accommodate all new observations.
Knowing the keywords and APIs and typing valid code is a hard set of skills to teach an LLM; it's taken billions of dollars of research to get close. The rest is still a hill to climb.
This is the difference between "computer programming" and "software engineering". Engineering is using programming to solve real problems.
I absolutely detest these large frameworks and thus refuse to go back to fullstack development. I might, however, pivot back once AI replaces all the unoriginal framework nonsense.
In the meantime I have already pivoted to proxy and API management. I also have a part time job as a senior technology principal in government with associate director experience, which can become a full time job if I want to relocate.
Let's put working on a different level of abstraction aside (like reviewing outputs and guiding AI) - it's already happening to some extent and developer jobs are always changing with or without AI.
If AI gets good enough to largely remove the need for software development jobs, I don't think there's a pivot, at least not at scale.
1. If AI gets good enough at SWE, it will very likely also be extremely good at any kind of knowledge work, so you can't go and be an accountant or a lawyer because they are losing their jobs at scale too.
2. An individual can go and be a plumber or an electrician but it can't be done at scale. If 40% of workers are doing knowledge work, they can't all switch to manual labor, there isn't enough demand in those fields to absorb 40% of people. (This is even if we ignore other problems like the fact that not everyone has the aptitude or ability to retrain)
3. Even if you individually are okay because you're still employed, or because you're independently wealthy, you still have a huge problem: at 30%+ levels of unemployment, the economy and society as a whole begin to collapse. If you have a job, there won't be enough people to pay for your services; if you have assets (stocks, property, currency), their value won't be preserved in an unstable society with high unemployment. It wouldn't just be a problem for the people directly affected; it would be everyone's problem.
Assuming AI automates all that too: my degree was in EE, and someone has to do the low-level programming and wiring. If AI takes that, I'll just get a job connecting the AI brains to the nuclear fusion plants.
Or I'll just become a chess player. No matter how good AI plays chess, people still pay to watch humans play chess on YouTube.
That being said, in this scenario I think there will be lots of work that looks a bit more like what SRE/DevOps does today: probably involving a lot of security/monitoring and code review/iteration on patches.
If AI progress continues and does not get misused, we end up with a utopia where we don't have to work. Otherwise, you have to be prepared for the doomer scenario. How you prepare is up to you.
There won't be any in-between situation unless we hit a major roadblock somewhere - an unsolvable problem?
Doomer Situation: https://news.ycombinator.com/item?id=35364833
This is a tech-enthusiast community, so I don't expect people to take a doomer stance unless it hits them personally. A mess-up could be devastating in the future.
By that analogy, software developers will use AI to solve technical problems.
A role like “tech lead” today is often about negotiating with other stakeholders, writing and vetting a design, and chunking it up for junior devs. That work stays relevant even if the junior devs are replaced with AI.
That being said, I don't think AI will become "smarter" than humans at knowing what we don't know, which is especially important when it comes to executing ideas. So there is always gonna be "high-skilled" labor that can't be easily automated when it comes to creativity and innovation.
As someone working in FAANG seeing how incredibly low-labor the SWE work is though, it's hard not to see this still as a golden era that will be looked back on in decades.
- Set standards to follow
- Set up linters, formatters, etc.
- If inputs still come from humans, figure out how to test and ensure the edge cases around them are handled (often missed by AI)
- Impacts on cost, performance, memory usage, and other requirements
- Be able to manually intervene and correct code if needed; any automation will likely have places where it gets stuck
- Provide feedback so it improves
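The human-input edge-case point above can be sketched with a tiny validation function and the tests a reviewer would insist on. All names here (`parse_age`, the 150 cutoff) are hypothetical, chosen only to show the kind of messy-input cases AI-generated code often skips:

```python
def parse_age(raw: str) -> int:
    """Parse a human-typed age field, tolerating stray whitespace."""
    cleaned = raw.strip()
    if not cleaned:
        raise ValueError("age is required")
    if not cleaned.isdigit():  # rejects '-3', '3.5', 'abc'
        raise ValueError(f"not a whole number: {raw!r}")
    age = int(cleaned)
    if age > 150:
        raise ValueError(f"implausible age: {age}")
    return age

# Edge cases a human reviewer would add: empty, padded, negative, huge.
assert parse_age(" 42 ") == 42
for bad in ["", "   ", "-3", "3.5", "999"]:
    try:
        parse_age(bad)
    except ValueError:
        pass
    else:
        raise AssertionError(f"{bad!r} should have been rejected")
```

The happy path is the part AI gets right; the review work is deciding which of these rejections the business actually needs.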
Consider how there were farms and farmers. Manual farmers back in the day were a lot different from the farmers of today using machinery. Same with factories. It's not going to be 101% automated.