I've been watching the AI narrative closely. Building with it. Learning in public. Talking to developers, founders, and regular people trying to figure out what's real and what's noise.
And I keep running into the same story, told the same way, by the same people:
"AI is going to replace developers. No one will have a job. AGI is around the corner."
And every single time, the person saying it is trying to raise money.
Let's talk about what's actually happening
OpenAI, Anthropic, and the other big labs are in an arms race. Not just for talent or compute. For capital. We're talking about rounds measured in billions. And to justify those valuations, they need a story that's big enough.
So what's the biggest story you can tell?
"Our product replaces high-value white collar workers."
That's the pitch. Not to you. To investors.
Here's how the math works in every pitch deck you'll never see: "Our AI replaces ten people making $150K each. That's $1.5M in value per customer. There are 500,000 companies that fit our ICP. That's a $750B TAM."
Cue the standing ovation from Sand Hill Road.
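That back-of-the-napkin math is worth spelling out, because every term in it hides an assumption. Here is a minimal sketch using the hypothetical numbers from the imagined deck above (they are illustrative, not real market data):

```python
# The pitch-deck TAM math, spelled out.
# All numbers are the hypothetical ones from the deck above, not real data.
workers_replaced = 10
salary_per_worker = 150_000                                 # $150K each
value_per_customer = workers_replaced * salary_per_worker   # "value" = payroll eliminated
companies_in_icp = 500_000                                  # companies matching the ICP
tam = value_per_customer * companies_in_icp                 # total addressable market

print(f"Value per customer: ${value_per_customer:,}")       # $1,500,000
print(f"TAM: ${tam:,}")                                     # $750,000,000,000
```

Notice that the first line is the whole argument: `workers_replaced = 10` only holds if those ten people actually disappear. Change that one assumption and the $750B evaporates.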
But here's the thing. That math only works if you believe the people disappear.
And I don't. (Spoiler: neither does 200 years of economic history.)
Enter Jevons Paradox
In 1865, an economist named William Stanley Jevons noticed something weird. England had just made steam engines way more efficient at burning coal. Everyone assumed coal usage would go down.
It went up. Way up.
Because when something gets cheaper and more efficient, people don't just do the same amount of it. They do more. Way more. New use cases emerge. New industries form. Demand explodes.
This isn't some obscure footnote. This is one of the most well-documented patterns in economic history. And it applies directly to what's happening with AI right now.
This has happened before. Every single time.
Let me give you a few examples that should feel familiar.
ATMs were supposed to kill bank tellers.
When ATMs rolled out in the 1970s and 80s, everyone assumed bank tellers were done. A machine that dispenses cash? Pack it up, Karen from the third window.
What actually happened: the number of bank tellers went up. ATMs made it cheaper to open bank branches, so banks opened more of them. And those branches needed people. The role shifted from counting cash to advising customers and selling financial products. The job didn't disappear. It evolved and expanded.
Spreadsheets were supposed to kill accountants.
VisiCalc and then Excel automated calculations that used to take teams of people days to complete. The fear was real. Why hire an accountant when a spreadsheet does it faster? (Turns out, because someone still needs to explain to the CEO why the spreadsheet says they're broke.)
What actually happened: the number of accountants exploded. Suddenly every small business could afford to do serious financial analysis. The demand for people who could interpret, strategize, and advise around those numbers grew far beyond what existed before. The tool didn't replace the person. It created a bigger market for the person.
Cloud computing was supposed to kill ops engineers.
"You don't need a server room anymore. You don't need sysadmins. Just put it in the cloud." That was the pitch. Somewhere, a sysadmin reading this just felt a chill.
What actually happened: DevOps became one of the fastest growing roles in tech. The infrastructure got more complex, not less. Someone still needs to architect it, secure it, optimize it, and keep it running at 3am when the pager goes off. The tools got better. The demand for people who understand them got bigger.
The internet was supposed to kill retail jobs.
E-commerce was going to make stores irrelevant. No more cashiers. No more salespeople.
What actually happened: the internet created an entirely new category of retail jobs. Fulfillment centers, logistics, customer experience, digital marketing, content creation, social media management. The U.S. has more retail-adjacent jobs now than before Amazon existed.
The pattern is always the same. The technology makes something cheaper. Cheaper means more people use it. More usage means more demand. More demand means more jobs. Different jobs, sometimes. But more of them.
Every. Single. Time.
So why does the "jobs are going away" narrative persist?
Because it's useful. Not to you. To the people raising money.
If you're an AI lab trying to justify a $100B+ valuation, the story has to be enormous. "We help people be a bit more productive" doesn't exactly make a venture capitalist reach for their checkbook. "We replace entire categories of workers" does.
It's not even that they're lying exactly. It's that the framing is self-serving. When the CEO of an AI company talks about pricing their product based on "the cost of the worker it replaces," that's not an economic insight. That's a sales pitch wearing a lab coat.
And look, I get it. VCs need big narratives to deploy big checks. Founders need those checks to build. It's how the game works. I'm not mad at it.
But we don't have to internalize their fundraising deck as our worldview. You wouldn't take career advice from a company whose business model depends on you not having a career.
The real opportunity is expansion, not replacement
Here's what I think is actually happening, and it's way more exciting than the doom narrative:
AI is about to make millions of people capable of things they couldn't do before.
Not because it replaces their skills. Because it augments them.
A marketer who couldn't write code can now build internal tools. A small business owner who couldn't afford a legal review can now get a solid first pass. A student who couldn't afford a tutor can now get one-on-one help at 2am. A solo founder who couldn't afford a team of ten can now ship like they have one.
That's not replacement. That's expansion. That's Jevons Paradox playing out in real time.
And when you expand what's possible, you don't get fewer jobs. You get new ones. Ones that don't have names yet. Ones we can't predict because they'll be created by the very people we're currently telling to be afraid.
The self-fulfilling prophecy problem
Here's what actually scares me. Not AI. The narrative around AI.
Because narratives shape behavior. If every developer believes their job is going away, they stop investing in their craft. Companies freeze hiring because "AI will handle it." Students pivot away from computer science. Organizations delay projects because they're "waiting for AI to get better."
And then what happens? A slowdown. Not because the technology demanded it, but because we collectively talked ourselves into it. Congratulations: we just created a recession with vibes.
That's the real danger. Not that AI takes our jobs. That we give them away because we believed someone's Series C deck.
Techno-optimism isn't naive. Defeatism is.
I know "techno-optimism" gets a bad rap sometimes. People think it means ignoring problems or being blindly cheerful about technology.
That's not what I'm talking about.
I'm talking about looking at 200 years of economic history and recognizing a pattern. Every major technology wave has created more prosperity, more jobs, and more opportunity than it displaced. Not without pain. Not without transition. But the net effect has always been expansion.
The printing press didn't kill scribes and create nothing. It created an entire publishing industry, literacy movement, and eventually the modern knowledge economy. (Sorry, scribes. But also, you're welcome, everyone who can read.)
The automobile didn't just kill horse-related jobs. It created suburbs, supply chains, tourism, and an entire middle class built around manufacturing and infrastructure.
The internet didn't just kill some jobs. It created millions more. Including "influencer," which honestly no one saw coming.
AI will be the same. If we let it.
The key phrase being: if we let it.
We create the world we choose to see
This is the part I feel most strongly about.
Right now, we're at a crossroads. The technology is powerful. The potential is enormous. But the direction it goes depends on the story we tell ourselves about it.
If we collectively decide that AI is a tool for replacement, that's what it'll become. Companies will use it to cut headcount. Workers will be treated as costs to eliminate. And we'll build a smaller, meaner version of the future.
But if we collectively decide that AI is a tool for expansion, the math changes completely.
More people building. More problems being solved. More small businesses competing with big ones. More individuals with capabilities that used to require entire teams. More creativity, more experimentation, more shots on goal.
That's not wishful thinking. That's what happens every single time we make a powerful capability cheaper and more accessible. The demand curve does what it always does. It goes up.
My ask to developers
If you're reading this on dev.to, you're probably someone who builds things. Someone who has influence over how technology gets used and talked about.
So here's my ask:
Stop repeating the AI doom talking points as if they're settled science. They're not. They're marketing.
When someone at your company says "should we even hire for this role, won't AI handle it?" push back. The answer is almost always that AI will make that person more productive, not unnecessary.
When you see a headline about AGI replacing all developers, ask yourself: who benefits from me believing this? Follow the money. It usually leads to someone with a cap table, a pitch deck, and a very specific number they need you to be scared of.
And when you're building with AI, build for expansion. Build tools that make more people capable. Build products that create new possibilities instead of just automating old ones.
Because the builders who define this era won't be the ones who used AI to cut costs. They'll be the ones who used it to create things that didn't exist before.
The jobs aren't going away. They're going to multiply in ways we can't yet imagine. But only if we choose to believe that and build accordingly.
What do you think? Am I being too optimistic, or is the doom narrative really just a fundraising strategy that we've all accidentally internalized? I'd love to hear from people who are actually building with AI every day.
Top comments (16)
I think the fundraising narrative point is underrated. A lot of the loudest “AI will replace everyone” messaging sounds less like sober analysis and more like a market-sizing story told to justify huge valuations.
From what I’ve seen, AI does not remove the need for people who understand systems. It mostly changes who can start, how fast they can build, and where the bottlenecks move.
I’ve already seen non-technical people build small internal tools, prototypes, and workflows they could not have built before. That part is real. But I’ve also seen the other side: once something touches real users, real data, messy business rules, integrations, security, tracking, and actual consequences, the need for judgment goes up, not down.
That is why I think “replacement” is too simple as a frame.
What seems more likely is:
AI lowers the barrier to creation, which means more people build.
More people building means more products, more experiments, more noise, and more competition.
And that usually increases the value of people who can connect technology to business reality, not just produce code.
So I’m much closer to your “expansion” view than the doom view.
The part I worry about most is actually the narrative effect too. If developers, founders, and companies internalize the idea that human capability no longer matters, they start making worse decisions before the technology has even forced anything.
Everyone becomes the CEO of their role, with a team under them… a team of agents or models.
Completely agree - history DOES repeat itself, and it shows that earlier technological breakthroughs didn't cause mass unemployment, although they did cause temporary disruption.
Good point as well about WHY these doom-and-gloom stories keep cropping up :-)
The "AI is replacing jobs" narrative is just the most recent excuse for layoffs and for not hiring juniors. But there were waves of layoffs before AI, and it wasn't AI that started a war that caused a global economic crisis. Thanks for adding the historical perspective.
This framing is spot-on. I run an automation consulting business and the pattern I see daily is: AI doesn't replace the developer, it changes what 'developer work' looks like. The tasks that get automated are the repetitive, well-defined ones — the stuff nobody enjoyed doing anyway. What remains is the harder, more valuable work: understanding the business problem, designing systems, handling edge cases the model has never seen. The Jevons Paradox angle is the key insight. Every automation I build for clients ends up creating more engineering work, not less — because once something is cheap and fast, people want more of it.
I think the real shift isn’t that jobs are going away — it’s that the leverage per person is changing fast.
A dev with good tools (and knowing how to use them) can now do what used to take a small team.
The narrative is overhyped, but the underlying change is very real.
What about non-devs?
Marketers keep being told they're cooked, yet they're some of the heaviest users of AI. Thoughts?
Marketers aren’t cooked — average execution is.
AI makes it trivial to produce content, so the advantage shifts to people who know what to say and why.
And importantly — AI is trained on patterns that marketers created in the first place.
The next edge will come from people who figure out how to use those systems in new ways, not from the tools themselves.
The incentive analysis here is sharp. The loudest "AI replaces X" voices are almost always selling tools that require the fear to be validated. Less talked about: the loudest "AI replaces nothing" voices often work at places whose revenue model breaks if AI replaces something. Both sides have skin in the game. The honest answer is that AI changes what "doing the job" means faster than it eliminates jobs — and that's a much harder pitch deck to build.
The part of this that holds up: the narrative is doing real work for fundraising and for platform positioning. The part that's messier: the actual displacement risk isn't uniform across roles or tasks. What I've seen in practice is that AI collapses the time cost of certain execution tasks to near zero — boilerplate, scaffolding, format conversion — while doing almost nothing for the evaluation and judgment layer. The jobs most at risk are the ones that were mostly execution with thin judgment. The jobs least at risk are inverted. That's not 'your job is safe,' it's 'the composition of your job is changing faster than the job title is,' which is a harder thing to reason about but probably more accurate.
A reminder that not all “job replacement” narratives are objective—some are driven by hype cycles and incentives around funding and attention.
yeah, the fundraise-dependent doom loop is real. what IS changing is the work mix - more review and integration, less greenfield code. not replacement, just a different ratio.