Scrolling through my X feed after the o3 launch, I saw reactions that were a real mixed bag of emotions.
"AGI is here, next step ASI"
"Pretty upset about o3's existence tbh"
"o3 dropped today…welcome to a new era"
One tweet in particular stood out to me:
"OpenAI o3 is coming and 90% of dev jobs won't survive"
As a software engineer myself, reading that tweet was not an exciting way to start my morning.
With every new model launch and each round of stronger benchmark results, I've seen the same claim again and again: we're getting closer to a future without human software engineering.
After all, these models are being built by programmers and are trained on enormous corpora of text.
At a passing glance, that seems interchangeable with the jobs of software engineers, so it's not a crazy conclusion that we're all going to lose our jobs now that these models can invert a binary tree.
Now I know what you might think I'm going to say next:
"The models still can't reason." or "They can't actually grok whole codebases." or even that they'll never understand the inherent complexities of designing reliable systems (shoutout DDIA).
I do think all of these points are valid, but I don't think they're going to hold up.
If the current pace of model improvement holds, with GPT-4o already performing at the level of a high-schooler, it's very plausible that we'll soon have models with the capabilities of real-world software engineers.
So if that's inevitable, how would software engineering survive?
In the classical sense, I think it won't.
Already, I've found it incredibly easy to get by without as expansive a knowledge set when I code.
Gone are the days when I needed to pore over documentation to figure out how to recreate something like JavaScript's async/await pattern using Go's goroutines.
Now, all I have to do is ask Cursor's chat to recreate that pattern in another language and I'm all set.
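For the curious, here's a minimal sketch of what that translation might look like: a goroutine plus a channel standing in for a JavaScript Promise, where receiving from the channel plays the role of `await` (the `asyncFetch` name and the fake lookup are just illustrative, not from any real codebase):

```go
package main

import "fmt"

// asyncFetch kicks off work in a goroutine and returns a channel that
// behaves like a Promise: receiving from it "awaits" the result.
func asyncFetch(id int) <-chan string {
	result := make(chan string, 1)
	go func() {
		// stand-in for a slow lookup (network call, DB query, etc.)
		result <- fmt.Sprintf("user-%d", id)
	}()
	return result
}

func main() {
	// "async": start the work without blocking
	future := asyncFetch(42)

	// ...do other work here while the goroutine runs...

	// "await": block until the result is ready
	user := <-future
	fmt.Println(user) // prints "user-42"
}
```

The buffered channel lets the goroutine finish even if nobody ever receives, which keeps the sketch leak-free for fire-and-forget use.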
I'll admit this will inevitably ship some buggy code in the short term. But the net effect is that the velocity of simply building software has gotten so much faster.
I finally have the confidence to build my custom notes app that doubles as a way to view my lifting progress, or an AI-powered, Michigan-themed euchre game that I can play for hours on end.
Whatever the niche or however dumb the idea may be, I no longer have the barrier of re-learning whatever tech stack or medium I want to build in.
Instead I can leverage the high level ideas that I know from my time as a software engineer, and use that to build software in new mediums at the fastest speed ever.
The way I see it, AI is not killing my job. It only makes it a lot better.
Just like Excel didn't wipe out the need for accountants but instead increased demand because their services were suddenly more affordable, I see a future where our software needs exponentially increase.
We're going to see more niche, targeted software built for literally everything under the sun.
No idea will seem too small, and in turn the bar for software products is only going to increase.
If I can make a clone of my notes app in just a few hours, the bar for all the software I use is only going to get higher.
For software engineering specifically, I could see a short-term world with a string of buggy AI-driven products and a lot of garbage software getting continually released.
However, with a lower barrier to entry, I only see a world where deeper knowledge of software engineering becomes more important in the long run.
With more widespread usage, a better understanding of how to use models, and better models themselves, it's inevitable that software engineers will have to deepen their craft more than ever before. AI is not going to replace software engineering. Instead, it's going to force us to make it a lot better.
All this to say, I'm not claiming to know for sure what's in store for the future.
With every new model launch, the exciting and scary world of AI looms closer, and exactly how it's going to change everything remains to be seen.
Still, I'd like to believe that for software specifically, it's only going to make things better. As tools improve and ideas flow faster, the core of software engineering, turning ideas into reality, only gets better.
For now though, all I can do is keep scrolling past the doomsayers and optimists alike. Whatever o3 means for the future of software engineering, one thing's for sure: it's going to keep my timeline interesting.