Who Needs an Engineer?
AI is rewriting the job description for engineers and web developers. But I don’t think it’s a completely dire story of unemployment and replacement.
In “The New Craft Gap,” I wrote about how AI is eliminating the junior production work that trains designers. The same dynamic is hitting plenty of other occupations, but when it comes to building for the web, things get more interesting. Design can at least point to taste and feeling as irreducibly human. Engineering has always marketed itself on outputs: if the site ships and it works, does anyone care who, or what, built it? Yet web development still requires real expertise to produce good work, and a new kind of expertise is emerging that rewards both those seasoned by years of dedicated work in code and those just now jumping in. What it means to build for the web will never be the same.
The Engineering Craft Gap
Let’s look at some statistics.
A Stanford Digital Economy Lab study analyzed payroll records from ADP, the largest payroll processor in the United States, tracking millions of workers across tens of thousands of companies. Employment for software developers aged 22 to 25 has fallen nearly 20% from its peak in late 2022. Over the same period, employment for developers over 30 held steady or grew. The researchers titled the paper “Canaries in the Coal Mine.”
Junior hiring at the fifteen largest tech companies dropped 25% from 2023 to 2024. New graduates now represent only 7% of Big Tech hires, down from 32% in 2019. Computer engineering graduates face a 7.5% unemployment rate, higher than fine arts majors. That last number deserves a moment to land.
The pattern is similar to what I described in design: the industry demands senior engineers while eliminating the junior work that produces them. Every experienced web developer I know learned their craft by building a lot of broken things: wiring up CRUD apps, wrestling with responsive layouts, debugging API integrations that failed silently in production. That pipeline of unglamorous work and incremental skill-building is exactly what is now easiest to automate.
An IEEE Spectrum piece put it plainly: with AI tools performing more of the grunt work that served as a training ground for early-career workers, expectations for recent graduates are high, and the pathway to meeting those expectations is narrower than ever. One Harvard CS professor they interviewed noted that juniors now need to slot in at a higher level almost from day one.
If you don’t hire junior developers, you’ll eventually have no senior developers. I keep hearing this line, but I don’t entirely buy that junior developers won’t have a place. I think the work they’re trained on is going to change, but there will still be things for them to do. More on that later.
The Klarna Lesson
What happened at Klarna, the fintech company, is an instructive recent case study in AI replacement. It comes from outside engineering proper, but it lands squarely in the middle of this conversation.
Klarna made a very public bet on replacing human workers with AI. The CEO declared that AI could do all human jobs. The company stopped hiring, shrank from roughly 5,500 employees to 3,400 through attrition, and trumpeted the cost savings. Customer interactions were routed through bots. Resolution times improved on paper. Labor costs dropped.
...And then the quality collapsed. Customer satisfaction scores fell, repeat contact rates climbed, and the AI handled volume but not complexity. By mid-2025, Klarna was scrambling to rehire. Software engineers and marketers were reassigned to answer customer service calls while the company rebuilt what it had dismantled. The CEO admitted they had focused too much on efficiency and cost, and that the result was lower quality.
An IBM survey of 2,000 CEOs found that only one in four AI projects delivers the return on investment it promised. Fifty-five percent of companies that executed AI-driven layoffs now regret it. These aren’t cautionary anecdotes. They’re the emerging baseline.
The lesson certainly isn’t that AI is useless. It’s that treating AI as a direct replacement for human judgment, rather than a tool that human judgment directs, tends to produce the same result: fast output, declining quality, and expensive reversal.
But there is a middle path forward, and I think it’s super interesting and exciting.
Harness Engineering
The term “harness engineering” started circulating in late 2025 and has become the dominant framework for how engineering teams are reorganizing around AI agents. The core idea is simple enough to state: the engineer’s primary job is no longer writing code. It’s designing the environment in which AI agents write code reliably.
OpenAI built a production application with over a million lines of code in five months. No human wrote any of it directly. A small team of three engineers guided Codex agents through pull requests and continuous integration workflows, averaging 3.5 merged PRs per engineer per day. The engineers described their role as “designing environments, specifying intent, and building feedback loops.” The product shipped and is used daily by hundreds of internal users.
That’s a staggering result. But the interesting part is not the volume. It’s what the team found was necessary to make it all work together.
They needed strict architectural boundaries enforced by automated tests. They needed a structured documentation system where every piece of institutional knowledge was written into the repository itself, because agents can’t read hallway conversations. They needed background processes that scanned for stale documentation and opened cleanup PRs automatically. Knowledge that lived anywhere other than the codebase was invisible to the system and therefore functionally nonexistent.
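The stale-documentation scanning described above can be sketched as a simple freshness check: flag any doc whose covered source files changed after the doc was last updated. Everything here is illustrative; the file names and the doc-to-source mapping are hypothetical, and a real background job would pull timestamps from git history before opening a cleanup PR.

```python
# Sketch of a stale-doc check: a doc is stale if any source file it
# covers was modified after the doc itself was last updated.
from datetime import datetime

def stale_docs(doc_updated: dict, src_updated: dict, doc_to_src: dict) -> list:
    """Return docs whose covered source files changed after the doc did."""
    stale = []
    for doc, sources in doc_to_src.items():
        doc_time = doc_updated[doc]
        if any(src_updated[s] > doc_time for s in sources):
            stale.append(doc)
    return sorted(stale)

# Hypothetical example: billing.md is stale because invoice.py
# changed after the doc's last update; auth.md is still fresh.
doc_updated = {"billing.md": datetime(2025, 1, 10), "auth.md": datetime(2025, 3, 1)}
src_updated = {"invoice.py": datetime(2025, 2, 1), "login.py": datetime(2025, 2, 15)}
doc_to_src = {"billing.md": ["invoice.py"], "auth.md": ["login.py"]}
print(stale_docs(doc_updated, src_updated, doc_to_src))  # -> ['billing.md']
```

The point isn’t the timestamp comparison; it’s that knowledge freshness becomes something a background process can enforce rather than something a human has to remember.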
Anyone who has maintained a web project of any real complexity will recognize these problems. Dependency rot, undocumented build steps, CSS that only one person understands, API contracts that drift from reality. These are the things that make web codebases fragile over time, and they’re exactly the things that break agent-driven workflows. The harness engineering discipline is, in many ways, the web development best practices we always said we’d get around to, now enforced because the agents literally can’t function without them.
Mitchell Hashimoto, the creator of Terraform, has described the core principle this way: anytime you find that an agent makes a mistake, you take the time to engineer a solution so that the agent never makes that mistake again. That discipline, applied consistently, compounds. The harness improves, the agents get more reliable, and the work gets faster.
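One concrete way to apply that principle: suppose an agent once imported the database layer from frontend code. Rather than just reverting the PR, you encode the boundary as an automated check so the mistake can never merge again. A minimal sketch, assuming a hypothetical `app.db` layer that UI modules must not touch:

```python
# Encode an architectural boundary as a check an agent can't slip past:
# frontend source may never import the database layer.
import ast

FORBIDDEN = {"app.db"}  # hypothetical layer UI code must not touch

def boundary_violations(source: str) -> list:
    """Return forbidden modules imported by the given UI source code."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            hits += [a.name for a in node.names if a.name in FORBIDDEN]
        elif isinstance(node, ast.ImportFrom) and node.module in FORBIDDEN:
            hits.append(node.module)
    return hits

good = "from app.api import client\n"
bad = "from app.db import session\n"
print(boundary_violations(good))  # -> []
print(boundary_violations(bad))   # -> ['app.db']
```

Run in CI, a check like this turns a one-time correction into a permanent constraint, which is the compounding Hashimoto is describing.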
This is where the expertise gap differs from design. A senior developer who understands systems, architecture, failure modes, and how web codebases drift over time is exactly the person who can build these environments. But a junior engineer can be trained in these new paradigms and disciplines alongside writing lines of code directly. Having spent my career straddling both worlds, I think design is more subjective, more nuanced, and requires a longer accumulation of micro-decisions to attain mastery. Engineering, no easier or harder of course, has a cleaner, more optimistic path forward for newcomers.
Agents Watching Agents
One of the more fascinating developments in this space is the growing practice of using models to verify the work of other models. The logic is the same as classic code review: a second set of eyes catches things the first set missed, especially when the second set has a different perspective.
In practice, this takes several forms. Teams run one model to generate code and a different model to review it. They use structural tests and custom linters as automated feedback loops that catch architectural violations before they reach the main branch. Some teams dispatch work to sub-agents for specific tasks like research or implementation, keeping the primary agent’s context window clean and focused. Chroma’s research on context degradation supports this approach: models perform worse at longer context lengths, so breaking work into discrete, well-scoped tasks and delegating them produces better results than feeding everything into one gigantic conversation.
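The generate-then-review pattern can be sketched as a small loop: one model drafts, a second critiques, and the loop repeats until the reviewer signs off. The two model functions below are stand-ins for illustration only; a real setup would call two different LLM APIs.

```python
# Illustrative generate-then-review loop. Both "models" are stand-ins:
# a real pipeline would prompt two different LLMs here.
def generator_model(task: str, feedback: str = "") -> str:
    # Stand-in generator: drafts code, applies reviewer feedback if any.
    code = f"def handler():\n    return '{task}'"
    if "add docstring" in feedback:
        code = code.replace("():", '():\n    """Handle the task."""', 1)
    return code

def reviewer_model(code: str) -> str:
    # Stand-in reviewer: a different model critiques; "" means approved.
    return "" if '"""' in code else "add docstring"

def generate_with_review(task: str, max_rounds: int = 3) -> str:
    feedback = ""
    for _ in range(max_rounds):
        code = generator_model(task, feedback)
        feedback = reviewer_model(code)
        if not feedback:  # reviewer approved; stop iterating
            return code
    raise RuntimeError("review loop did not converge")

result = generate_with_review("greet")
print('"""' in result)  # -> True: the reviewed draft gained a docstring
```

The shape is what matters: the engineer owns the loop, its stopping condition, and the standard the reviewer enforces, not the individual lines either model emits.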
The broader pattern is that engineering is becoming less about writing and more about orchestration, and orchestration can be learned at the junior level. The engineer defines the constraints. The agents operate within them. When something breaks, the engineer doesn’t fix the code. The engineer fixes the system that produces the code.
LangChain demonstrated this concretely when their coding agent jumped from 52.8% to 66.5% on a benchmark by changing nothing about the underlying model. They changed only the harness: the context management, the feedback loops, the architectural constraints. Same engine, better track. The track is now the engineering.
What Gets Built, and By Whom
In artistic fields, the threat from AI is that it produces something aesthetically passable but devoid of heart, and the masses have historically been fine with passable. Web development has a different relationship with quality. Code that looks fine but breaks across browsers, falls apart on mobile, chokes under traffic, or exposes user data doesn’t get the same shrug that a bland logo does. The consequences are more immediate and more expensive, which means the incentive to maintain quality is structural, not just ideological.
For junior engineers, the entry points are indeed narrowing. But those who focus on learning how to build harnesses, design feedback loops, and structure work for agents will be in enormous demand. That skill set didn’t exist two years ago, so it’s open to everyone.
The question is whether the industry will create pathways for new engineers to develop those skills, or whether it will simply demand senior people and wonder, five years from now, why there aren’t enough of them. I don’t think the outcome is predetermined. But there is a new kind of engineering emerging, with genuinely different skills and genuine demand. It’s not a guarantee, but it’s a door.
Engineers who want to thrive need to start thinking of themselves as people who design systems that produce code. That shift in identity can be uncomfortable. It means letting go of the thing that drew most of us to development in the first place: building something with your own hands, line by line, watching it come to life. But there is real enjoyment to be found in tinkering at the systems layer too.
This is the fourth installment of Dispatches on AI, a series on how generative AI is reshaping design, development, and the integrity of the web:
Exploring AI’s Impact on Digital Creation (series intro)
Who Needs an Engineer?
Nobody Can Keep Up
Subscribe to get each post in your inbox.
Sources and Further Reading
Stanford Digital Economy Lab, “Canaries in the Coal Mine”: Stanford Digital Economy Lab | TIME | CNBC
Junior hiring decline: IEEE Spectrum | Stack Overflow
Klarna AI reversal: Fortune | Fast Company | Futurism
Harness engineering: OpenAI | Martin Fowler / Thoughtworks | HumanLayer
LangChain benchmark improvement: HumanLayer
Context degradation research: Chroma
IBM AI ROI survey: Fortune / IBM