Worse Is Better
One of the coolest aspects of software development is the way it makes really big theoretical questions around system design totally accessible. Every teenager building a web app is deploying an ad hoc bureaucracy of computer processes. As you get more experience, you start to think critically about how it all fits together.
I remember reading “The Rise of Worse is Better” a decade ago. The first time I read it, it felt like a call to action. It’s the story of how less elegant designs have come to dominate the world of computing, outcompeting and rendering irrelevant so many other, more brilliant systems. I felt compelled to investigate these alternate threads of history. I wondered what the world would look like if The Right Thing had won at each step along the way to our modern tech landscape.
Toward the end of high school I started taking math at The Ohio State University. This meant that, while still a high schooler, I now had access to the university libraries. I’d often come home with a stack of books on things like lambda calculus, Lisp, and the history of computing. When I started undergrad, now at Harvard not OSU, I took lots of classes in the Computer Science and Math departments, but also Government and History. Ultimately I got my degree in History & Science, and my interest in the history of technology grew from hobby to full-time focus (for a couple years).
What I came to realize, as I explored these ideas in the context of a longer run of history, is that Worse Is Better is no strange quirk of computing. Taken at its broadest, Worse Is Better is the case for liberalism over authoritarianism. In both cases—software development and government—the system that wins isn’t necessarily more competent than its alternative, but it is usually the one more capable of embracing change. In the case of liberalism, it’s worth tolerating a lot of idiocy to preserve dynamism.
Returning to software development, the tradeoffs are basically the same as in government. Much has been written on the topic of software development in a modern context, but what I found as I dug into the past is that some of the most interesting theory on the subject appears under the banner of cybernetics. Although it’s not a word you see much these days, cybernetics was the original name for system design in the context of man-machine collaboration. So software development is, essentially, an endeavor in cybernetics.
Believe it or not, the central comparison of this essay is immanent in the etymology of these two subjects: both “government” and “cybernetics” come from the Greek kybernan, “to steer.” Gregory Bateson, one of the many scientists involved in the cybernetic project back in the 1950s, was disappointed by how the field evolved as computers went mainstream:
Computer science is input-output. You’ve got a box, and you’ve got this line enclosing the box, and the science is the science of these boxes. Now, the essence of Wiener’s cybernetics was that the science is the science of the whole circuit … And you’re not really concerned with an input-output, but with the events within the bigger circuit, and you are part of the bigger circuit.
“For God’s Sake, Margaret” in CoEvolution Quarterly, June 1976
As an undergrad reading Bateson’s papers in the archives at UC Santa Cruz, I once again found myself seduced by an alternate thread of history, now at a higher level of abstraction. Bateson is basically saying that True Cybernetics was not embraced by practitioners. In the move from the seminal ideas of cybernetics into Computer Science, it seems once again that the worse thing won. It’s easier to reason about a software system as if you’re producing a static artifact, but of course, it’s more accurate to conceive of a software system and the process of its creation as one larger system. This larger system is the true undertaking of developers.
I now believe that a synthesis is possible, a resolution for idealistic system designers like the ones who fought for Lisp in a world of Unix. It’s not really that worse systems win. Rather, most of the visionary designs from the early days of software had ambitions that were too narrow. The Unix design was The Right Thing when viewed from a wider perspective. And it’s possible to design systems with the big picture in mind. You just need to make sure that you and your colleagues and the larger community are all included in that picture.
When the founding fathers set out to design a democratic system, they had no delusions that they were designing a static artifact. In fact, their focus was not on a specific telos at all. Instead, they focused on the role they were playing (“When in the Course of human events…”), how our nation’s constitution could be evolved over time, and how this new system could be implemented without arrogating too much power to those managing it, only to fall into the same failure modes of the old world.
There’s a lesson in there. The most important problem in system design is making sure it will evolve effectively, and it’s as much a social problem as a technical one. This has to be the focus when designing The Right Thing, because, as always, the only constant in life is change.