It depends on your definition of safe. Most of the code that gets written is pretty simple — basic CRUD web apps, WP theme customization, simple mobile games… stuff that can easily get written by the current gen of tooling. That has already cost a lot of people a lot of money or jobs outright, and most of them probably haven't reached their skill limit as developers.
As the available work increases in complexity, I reckon more will push themselves to take jobs further out of their comfort zone. Previously, the choice was to upskill for the challenge and greater earnings, or stay where you are, which is easy and reliable; the current choice is upskill or find a new career, and most people will upskill rather than switch to a career they have zero experience in. That puts pressure on the moderately higher-skill job market, which has far fewer people, and they start to upskill to outrun the implosion, which puts pressure on them to move upward, and so on. With even modest productivity gains across the whole industry, it's not hard for me to envision a world where general software development just isn't a particularly valuable skill anymore.
Everything in tech is cyclical. AI will be no different. Everyone outsourced, realized the pain and suffering, and corrected. AI isn't immune to the same trajectory or mistakes. And once corporations realize that nobody has a clue how their apps or infra run, you're one breach away from putting a relatively large organization under.
The final kicker in this simple story is that there are many, many narcissistic folks in the C-suite. Do you really think Sam Altman and Co are going to take blame for Billy's shitty vibe coded breach? Yeah right. Welcome to the real world of the enterprise where you still need an actual throat to choke to show your leadership skills.
I absolutely don’t think vibe coding or barely supervised agents will replace coders the way outsourcing claimed to — and in some cases did, and still does. And outsourcing absolutely affected the job market. If the whole thing does improve and doesn’t turn out to be too wildly unprofitable to survive, what it will do is allow good quality coders — people who understand what can and can’t go without being heavily scrutinized — to do a lot more work. That is a totally different force than outsourcing, which to some extent assumed software developers were all basically fungible code monkeys at some level.
There's a lot to unpack here. I agree - outsourcing did affect the job market. You're just seeing the negative (US) side. If anything, outsourcing was hugely beneficial to the Indian market, where most of those contracts landed. My point was that it was sold as a solution that didn't deliver the value proposition it claimed. And that is why I've said AI is not immune to being cyclical, just like outsourcing. AI is being sold as worker replacement. It's not even close, and if it were, then OpenAI, Anthropic, and Google would have already replaced a lot of people and wouldn't be letting you and me use their tools for $20/month. When it does get that good, we will no longer be able to afford these "enterprise" tools.
With respect to profitability - there's none in sight. When JP Morgan [0] is saying that $650B in annual revenue is needed to make a paltry 10% return on investment, there is no way any sane financial institution should pump more money into that sunk cost. Yet here we are, building billions of dollars in datacenters for what... mediocre chat bots? Again, these things don't think. They don't reason. They're massive word graphs being used in clever ways with cute, humanizing descriptions. Are they useful for helping a human parse way more information than we can reason about at once? For sure! But that's not worth trillions in investment and won't yield multiples of the input. In fact, I'd argue the AI landscape would be much better off if the dollars stopped flowing, because that would force real research to be done in a much more efficient and effective manner. Instead we're paying individual people hundreds of millions of dollars who — and good for them — have no clue or care about what actually happens with AI, because: money in the bank. No, AI in its current form is not profitable, and it's not going to be if we continue down this path. We've literally spent world-changing sums of money on models that are used to create art that will displace the original creators well before they solve any level of useful world problems.
Finally, to your last point: "...good quality coders...". How long do you think that will be a thing, given how this is all unfolding? Am I writing better code (I'm not a programmer by day) with LLMs? Yes and no. Yes when I need to build a visually appealing UI for something, and yes when it comes to working within a framework. But what I've found is that if I don't put all of the right pieces in the right places before I start, I end up with an untenable mess within the first couple thousand lines of code. So if people stop becoming "good quality programmers," then what? These models only get better with better training data, and the web will continue to turn insular against these IP-stealing efforts. The data isn't free; it never has been. And this is why we're now hearing the trope of "world models" — a way to ask for trillions more to provide millionths of a penny on the invested dollar.