I've been watching the AI narrative closely. Building with it. Learning in public. Talking to developers, founders, and regular people trying to fi...
I think the fundraising narrative point is underrated. A lot of the loudest “AI will replace everyone” messaging sounds less like sober analysis and more like a market-sizing story told to justify huge valuations.
From what I’ve seen, AI does not remove the need for people who understand systems. It mostly changes who can start, how fast they can build, and where the bottlenecks move.
I’ve already seen non-technical people build small internal tools, prototypes, and workflows they could not have built before. That part is real. But I’ve also seen the other side: once something touches real users, real data, messy business rules, integrations, security, tracking, and actual consequences, the need for judgment goes up, not down.
That is why I think “replacement” is too simple as a frame.
What seems more likely is:
AI lowers the barrier to creation, which means more people build.
More people building means more products, more experiments, more noise, and more competition.
And that usually increases the value of people who can connect technology to business reality, not just produce code.
So I’m much closer to your “expansion” view than the doom view.
The part I worry about most is actually the narrative effect too. If developers, founders, and companies internalize the idea that human capability no longer matters, they start making worse decisions before the technology has even forced anything.
Everyone becomes the CEO of their own role, with a team under them: a team of agents or models.
Completely agree: history DOES repeat itself, and it shows that earlier technological breakthroughs and revolutions didn't cause mass unemployment, although they did cause disruption, temporarily...
Good point as well about the reason WHY these doom & gloom stories keep cropping up :-)
The "AI replacing jobs" narrative is just the most recent excuse for layoffs and for not hiring juniors. But there were waves of layoffs before AI. And it wasn't AI that started a war causing a global economic crisis. Thanks for adding the historical perspective.
This framing is spot-on. I run an automation consulting business and the pattern I see daily is: AI doesn't replace the developer, it changes what 'developer work' looks like. The tasks that get automated are the repetitive, well-defined ones — the stuff nobody enjoyed doing anyway. What remains is the harder, more valuable work: understanding the business problem, designing systems, handling edge cases the model has never seen. The Jevons Paradox angle is the key insight. Every automation I build for clients ends up creating more engineering work, not less — because once something is cheap and fast, people want more of it.
I think the real shift isn’t that jobs are going away — it’s that the leverage per person is changing fast.
A dev with good tools (and knowing how to use them) can now do what used to take a small team.
The narrative is overhyped, but the underlying change is very real.
What about non-devs?
We're being told that marketers are cooked, yet they're some of the heaviest users of AI. Thoughts?
Marketers aren’t cooked — average execution is.
AI makes it trivial to produce content, so the advantage shifts to people who know what to say and why.
And importantly — AI is trained on patterns that marketers created in the first place.
The next edge will come from people who figure out how to use those systems in new ways, not from the tools themselves.
The incentive analysis here is sharp. The loudest "AI replaces X" voices are almost always selling tools that require the fear to be validated. Less talked about: the loudest "AI replaces nothing" voices often work at places whose revenue model breaks if AI replaces something. Both sides have skin in the game. The honest answer is that AI changes what "doing the job" means faster than it eliminates jobs — and that's a much harder pitch deck to build.
The part of this that holds up: the narrative is doing real work for fundraising and for platform positioning. The part that's messier: the actual displacement risk isn't uniform across roles or tasks. What I've seen in practice is that AI collapses the time cost of certain execution tasks to near zero — boilerplate, scaffolding, format conversion — while doing almost nothing for the evaluation and judgment layer. The jobs most at risk are the ones that were mostly execution with thin judgment. The jobs least at risk are inverted. That's not 'your job is safe,' it's 'the composition of your job is changing faster than the job title is,' which is a harder thing to reason about but probably more accurate.
A reminder that not all “job replacement” narratives are objective—some are driven by hype cycles and incentives around funding and attention.
yeah, the fundraise-dependent doom loop is real. what IS changing is the work mix - more review and integration, less greenfield code. not replacement, just a different ratio.