My bet on where software engineering is headed. Not an analysis. A direction.
Last updated: March 2026

I've spent weeks trying to write this post. I started it three times and deleted it each time because it sounded like something I'd already said.
Which makes sense. I've been writing for months about how AI is changing software engineering. About the opaque code already running in production. About the heuristics seniors can't explain. About why verification is the new core work. About why that verification needs infrastructure, not discipline. But all of that was loose pieces. Diagnoses. Concrete solutions to concrete problems.
What I hadn't written was the thing that connects all of it. The underlying question: where is this profession headed? Not tomorrow, not next week. In three, five, ten years.
This is as close as I have to an answer. It's not an analysis. It's a personal manifesto. My bet.
I mentioned it briefly in the piece on verification as infrastructure, but I want to expand on it because that was the moment something clicked.
A non-technical person in my network used AI to build a working tool. Not a prototype. Not a demo. Something with tests, with a clean interface, that did what it was supposed to do. My first reaction was awe. My second was a chill: “so what now?”
Because that software worked. But I had no way of knowing whether it should reach production without applying a judgment that person had no reason to have. Is it consistent with the project's conventions? Does that dependency behave the same under real load? Does it fit into the broader system?
No-code had been promising this for years. It didn't deliver because it gave you limited blocks. Generative AI is delivering because it gives you a collaborator that interprets what you're trying to do. And that changes everything.
I'm not going to argue about whether this will happen or not. I already did that. It's going to happen. It's happening. The question that interests me now is different: if we accept that more and more people are going to create software without deep technical training, what do we build so that software is real software?
When you raise this in a conversation among engineers, there are two immediate reactions.
The first: “that will never work, non-technical people don't understand systems, they're going to create a disaster.” And they're partly right. Software generated without context can be a disaster. But the conclusion, “therefore, don't let them do it,” is the same one people reached about frameworks, cloud platforms, and high-level languages. Every time the barrier to entry dropped, someone said it was a bad idea. And every time, the barrier dropped anyway.
The second: “perfect, so engineers are the quality gatekeepers, the ones who review and verify.” And this is where I see the trap. Because verification is necessary (I wrote a whole series on it), but it's not enough as a vision for the future. Verification is reactive. Someone generates, you check. You're behind, not ahead.
I don't want to be behind.
For a long time I used the marathon metaphor to describe this profession. You can never stop learning, updating, running, because if you stop you fall behind. And that's true. But it's a reactive metaphor. It's survival. It's responding to what comes at you.
What I'm proposing is a different stance: don't run the marathon, build it. Design the next stretch of road so others can travel it. Engineers, non-engineers, agents, whoever. And once that stretch stabilizes, move up a level and build the next one.
What does that mean in practice? It means the work of engineering shifts from writing software to building the infrastructure that lets anyone create real software. Not just verifying (that's one piece). Building the systems, the platforms, the guardrails, the contracts that make software generated by anyone operable, maintainable, and coherent with its context.
And someone will say: “that's platform engineering, people have been talking about that for years.” Yes and no. Platform engineering solves the technical infrastructure piece: pipelines, deployments, environments. That's fine, but it's only one dimension. The one that obsesses me is different.
In my recent posts I've talked about three dimensions of what makes software real software: functional (does what it's supposed to), craft (is well built), and contextual (fits this team, this company, this moment).
The first two have clear paths. The functional dimension gets validated with tests. The craft dimension gets enforced with linters, templates, good platforms. Not trivial, but the problem is defined.
The contextual dimension is the frontier. And I believe that's where the future of this profession lives.
Context is: this service has a dependency that degrades under load and doesn't show up in any test. Context is: we're migrating from this architecture to that one, don't invest here. Context is: the team that will maintain this has these capabilities and not others. Context is: this component is touched by another team and there's tension over who owns it.
Today, all of that lives in the heads of people like me. In hallway conversations. In the intuition of someone who's been on the project for years. It's tacit knowledge — the kind seniors can't explain.
My bet is that the future engineer's job is to turn that tacit knowledge into consumable infrastructure. Not just for other engineers (which is what we already do with RFCs and design principles), but for agents and non-technical profiles who will generate software without that context in their heads.
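To make "consumable infrastructure" concrete, here's a minimal sketch of what it could look like: the tacit knowledge from the examples above encoded as data, plus a check that surfaces the relevant notes to whoever (human or agent) touches the affected code. All names, paths, and rules here are invented for illustration; the point is the shape, not the specifics.

```python
# Hypothetical sketch: tacit project context encoded as data instead of
# living in someone's head. Paths and notes are invented for illustration.

CONTEXT_RULES = [
    {
        "paths": ["services/billing/"],
        "note": "Depends on an API that degrades under real load and isn't "
                "covered by any test. Load-test before shipping changes here.",
    },
    {
        "paths": ["services/reports/"],
        "note": "Being migrated to the new architecture; don't invest in "
                "new features here.",
    },
]

def context_for_change(changed_path: str) -> list[str]:
    """Return every contextual note that applies to a changed file.

    This is the kind of lookup a CI job, a code-review bot, or an
    agent's pre-edit hook could run, so the context reaches whoever
    makes the change, not just whoever happens to remember it.
    """
    return [
        rule["note"]
        for rule in CONTEXT_RULES
        if any(changed_path.startswith(prefix) for prefix in rule["paths"])
    ]

if __name__ == "__main__":
    for note in context_for_change("services/billing/invoice.py"):
        print("CONTEXT:", note)
```

It's deliberately trivial: the hard part isn't the lookup, it's getting the knowledge out of hallway conversations and into the rules list in the first place.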
Is it easy? No. Can it be done completely? I'll be honest: I don't know.
And this is where my position diverges from most of what I read.
The natural reaction to automation is to find the gap the machine can't fill. “AI will never understand organizational context.” “AI will never have judgment.” It's reassuring. It's also a trap.
Not because I'm convinced AI can do all of that tomorrow. But because looking for the irreplaceable is a defensive stance. It's finding your trench and staying in it. And trenches in technology have expiration dates.
My proposal is the opposite: instead of looking for what AI can't do, be the one who builds the infrastructure that lets it do it. Every time you manage to codify a layer of judgment that was previously tacit — a design principle, a verification contract, a contextual guardrail — that layer stops depending on someone remembering it. It becomes a system. And you move up a level and attack the next layer of judgment you don't yet know how to codify.
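What "codifying a layer of judgment" can mean in practice: take a design principle that today depends on a reviewer remembering it, and turn it into a check. A hedged sketch, assuming a principle like "no new code may import the deprecated `legacy_db` module" (the module name is invented for illustration):

```python
# Hypothetical sketch: a design principle ("don't import the deprecated
# legacy_db module") turned from tribal knowledge into an automated check.
import ast

DEPRECATED_MODULES = {"legacy_db"}  # invented name for illustration

def deprecated_imports(source: str) -> list[str]:
    """Return the deprecated modules a piece of Python source imports."""
    found = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            found += [a.name for a in node.names if a.name in DEPRECATED_MODULES]
        elif isinstance(node, ast.ImportFrom):
            if node.module in DEPRECATED_MODULES:
                found.append(node.module)
    return found

if __name__ == "__main__":
    sample = "import legacy_db\nimport os\n"
    print(deprecated_imports(sample))
```

Once a rule like this runs in CI, nobody has to remember it, and you're free to move up and attack the next layer of judgment that isn't codified yet.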
Plumbline was my first attempt to do this with verification. But verification is only the beginning. Architectural context, strategic context, organizational context — all of it needs the same treatment: stop living in heads, start living in infrastructure.
Does that have a ceiling? Will there come a point where we've abstracted so much that an engineer is no longer needed to build the next layer? Maybe. Maybe that's the end of the profession as we know it. It would be dishonest not to acknowledge that.
But that doesn't change what needs to be done today.
Carry tacit context into explicit infrastructure. Codify the decisions that today live in hallway conversations into systems that anyone can consume. Build the guardrails that allow more people (and more agents) to create software that works, is well built, and fits its context.
Not as a threat. As the natural evolution of a profession that has always been about the same thing: making accessible what was previously impossible.
What changes is who you're building for. Before, for other engineers. Now, for anyone.
That's my bet. That's the direction where I'm going to invest the next few years. I don't know how far it goes. But I know it's the right direction.
And of that I have no doubt.
Are you building the road or running it?
That's not a rhetorical question. I genuinely want to know how you're seeing this from your team, your company, your position. Because I think the answer will look very different depending on each person's context — and that's exactly what makes this so interesting.