The Software Industry at Scale (A Few Observations From Someone Getting Old)

I want to be careful here about falling into the "things were better in my day" trap, because they mostly weren't — the tools are better, the community knowledge is better, the accessibility of information is dramatically better. A self-taught developer starting out today has resources I genuinely could not have imagined in 1985.

But forty years is enough time to watch some patterns repeat, and I find those patterns more interesting than either the nostalgia or the hype. This is an attempt to describe a few of them without drawing the easy conclusions.


The Complexity Ratchet

Every few years, a new layer of abstraction arrives that makes a genuinely hard problem significantly easier. Networks got easier to manage. Deployment got easier (several times, in different ways). Frontend development got more powerful. Mobile development became accessible to small teams.

But each layer adds complexity of its own. The things a layer makes easier often require understanding the layer below to debug, and that layer requires understanding the layer below it. A modern web application involves, at minimum: a JavaScript runtime, a bundler, a framework, component lifecycle management, API communication, authentication, a deployment pipeline, CDN configuration, monitoring, and whatever database abstraction is in fashion this year.

Each of these pieces is individually well-documented and supported by a community. The problem is that you need to understand all of them, and the surface area keeps growing. I used to joke that the total amount of knowledge required to ship a web application has roughly doubled every five years. I've stopped joking about it because it seems closer to true than funny.

The people who handle this best — developers I've watched and learned from — have internalized a rule: add a layer only when it solves a problem you actually have. Not when it's the new thing. Not when everyone else is using it. When you have the specific problem it's designed to solve, and you've confirmed that the cost of the abstraction is less than the cost of the problem.

That's a discipline that the industry as a whole is not good at. The default is to add layers because they're available and well-supported and the tutorials make them look simple. The complexity ratchet clicks forward. It rarely clicks back.


The Framework Treadmill

I've watched the JavaScript frontend ecosystem cycle through something like five major framework eras in the time I've been paying close attention to it. Each new entrant was genuinely better than what came before — I'm not disputing that React was an improvement over the soup that came before it, or that the subsequent evolution has addressed real problems. The technical progress is real.

The treadmill isn't technical. It's economic. The skills premium on knowing the latest framework creates pressure to always be learning the latest framework, which creates demand for new frameworks, which creates pressure to learn them. A developer who mastered React in 2017 and hasn't kept up is now "behind" in a job market sense, even if React (or its successors) is still perfectly capable of solving the problems they're being hired to solve.

This benefits a lot of people: framework authors, tutorial writers, bootcamps, conference organizers, technical recruiters. It doesn't obviously benefit the developers who have to stay on the treadmill to remain employable, or the businesses that absorb the cost of constant technology churn, or the users who are on the receiving end of applications that were rewritten in the new framework before the old one was finished.

I don't have a solution to this. It's probably structural — the incentives that drive it are too distributed to be easily changed. What I've done is opt out, partially, by staying in spaces where I control the technology decisions and can make them on the basis of requirements rather than fashion. The cost of that is being out of date on some things. The benefit is that the things I build tend to work for a long time without major rewrites.


The Abstraction Leak Accumulation Problem

Joel Spolsky's law of leaky abstractions — all non-trivial abstractions, to some degree, are leaky — is more than twenty years old and still the most useful single observation about software development I've ever read. The abstractions we work with every day are leaky, and when they leak, we need to understand what's underneath.

What I've noticed in the years since is a corollary: the further you are from the underlying layer, the more catastrophic the leak when it happens. A developer who understands SQL can debug a misbehaving ORM query in minutes. A developer who has only ever worked with the ORM and considers SQL an implementation detail might spend hours on the same problem. The developer who doesn't know that there's a database under the ORM is in a genuinely difficult position.
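The classic instance of this leak is the N+1 query problem: an ORM-style access pattern that looks like an innocent loop over objects but quietly issues one query per row. Here's a minimal sketch of the shape of it, using Python's built-in sqlite3 rather than any particular ORM — the table names and the tiny `run` helper are illustrative, not from any real codebase:

```python
import sqlite3

# A toy schema: authors and their posts, in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO posts VALUES (1, 1, 'On Engines'), (2, 1, 'Notes'), (3, 2, 'Compilers');
""")

queries = []  # count queries the way a database profiler would

def run(sql, args=()):
    queries.append(sql)
    return conn.execute(sql, args).fetchall()

# ORM-ish access pattern: one query for the authors, then one more
# query per author inside the loop -- 1 + N queries total.
authors = run("SELECT id, name FROM authors")
for author_id, name in authors:
    posts = run("SELECT title FROM posts WHERE author_id = ?", (author_id,))

n_plus_1 = len(queries)
queries.clear()

# The SQL-aware version: the same data in a single JOIN.
rows = run("""
    SELECT a.name, p.title FROM authors a
    JOIN posts p ON p.author_id = a.id
""")

print(n_plus_1, len(queries))  # → 3 1
```

The developer who knows SQL recognizes the loop as 1 + N round trips and reaches for the JOIN (or the ORM's eager-loading feature); the developer for whom the database is an implementation detail sees only an inexplicably slow page.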

Over the decades, the industry has made a consistent bet on going up the abstraction stack — higher-level languages, more powerful frameworks, more managed services. This has made a lot of things faster to build. It has also produced a generation of developers (and I'm being careful here — not worse developers, but differently situated developers) who have less facility with the layers below their working level.

Whether this matters depends on the work. A lot of work doesn't require ever going below the working layer. The work that does — debugging mysterious performance problems, recovering from infrastructure failures, chasing the weird race condition that only happens under load — is work where understanding the layers below is what separates a fast resolution from a slow one.

I'm not arguing that everyone needs to know how memory allocation works or how network packets are structured. I'm arguing that the layers directly below your working layer are worth understanding, and that the trend toward higher abstraction has made people less likely to explore them.


What Stays the Same

The thing that consistently surprises me, watching the industry with forty years of perspective, is how constant the fundamental problems are.

We still have trouble writing software that's correct. Not for lack of better tools — the tools for testing, type-checking, formal verification, and static analysis are dramatically better than they were in 1985. But the software shipped by large companies is still buggy, in ways that feel embarrassingly basic.

We still have trouble estimating software projects. Every new planning methodology has addressed this and none of them have solved it. The reasons are structural: software projects involve large numbers of unknown unknowns, and we are systematically overconfident in our ability to identify them before we start.

We still have trouble communicating about software. Between developers, between developers and product managers, between product managers and the people who will use the thing. The best technical communication I've seen was from people who could hold the technical precision and the human stakes in their heads at the same time and speak to both. There's no tool that teaches this.

These problems have been with the industry for as long as there's been an industry. The ones that have improved over forty years are mostly the mechanical ones — the things that could be tooled or automated or formalized. The ones that remain stubborn are the human ones. Which is, I suppose, predictable.


The Part Where I'm Supposed to Offer a Takeaway

I don't have a sharp one. The observations above are observations, not a thesis. The industry is producing remarkable things and wasting enormous amounts of effort simultaneously. The tools are better than they've ever been and the average quality of what gets shipped is hard to assess against a consistent baseline.

If I had to say something useful from forty years of watching: be skeptical of any claim that a new tool or methodology has fundamentally changed the nature of the problem. It probably hasn't. The things that have genuinely moved the needle have been almost uniformly boring from a marketing standpoint — version control, automated testing, continuous deployment, code review — incremental improvements to process and tooling that compound over time.

The exciting announcements are usually less exciting on delivery. The boring improvements are usually better than anyone gave them credit for.

That's been true for forty years. I expect it to continue.