Premature Optimization is the Root of All Evil

This famous quote, usually attributed to Tony Hoare, was popularized by Donald Knuth's paper Structured Programming with go to Statements, published in 1974. That was 49 years ago.

In dog years, that's about 200 human years. In software-industry years, it's more like a millennium. Computers have changed drastically since then.

Does this rule still apply?

Back then, Knuth might have been working with a computer such as the PDP-10 KA10, a machine that offered a maximum of 1,152 KB of memory, ran at about 1 MHz, weighed about 870 kg and cost over half a million dollars.

In that context it makes sense that programmers would try to optimize everything by default, since every processor cycle and every byte of memory counted.

Fast-forward half a century: hardware is ridiculously fast and dirt cheap, and the industry is dominated by CRUD applications (what we frontend and backend devs build 99% of the time), which are I/O-bound rather than CPU-bound.
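To make that concrete, here's a minimal TypeScript sketch of the shape of a typical CRUD endpoint. The fakeDbQuery helper and the 30 ms latency are invented for illustration; the point is that nearly all of the wall-clock time goes to waiting on the database, not to computing.

```typescript
// A toy sketch of a typical CRUD handler's time budget.
// fakeDbQuery and the 30 ms figure are invented for illustration.

// Simulates a database round trip (network latency + query time).
const fakeDbQuery = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms));

async function handleRequest(): Promise<void> {
  const start = Date.now();

  await fakeDbQuery(30); // the handler spends most of its time waiting here

  const cpuStart = Date.now();
  // The actual "business logic": shuffling a small object around.
  const body = JSON.stringify({ id: 1, name: "Ada", roles: ["admin"] });
  const cpuMs = Date.now() - cpuStart;

  console.log(`response: ${body}`);
  console.log(`total: ${Date.now() - start} ms, of which CPU: ~${cpuMs} ms`);
}

handleRequest();
```

Shaving microseconds off the JSON.stringify call here buys you nothing; the 30 ms round trip dominates everything.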

Donald Knuth c. 1958

Furthermore, the few CPU optimizations we do need, such as React's virtual DOM diffing, are hidden away from everyday developers inside libraries and frameworks built by a few specialized ones.

CPU optimizations are largely unnecessary now, and thus ignored by default by today's programmers.

And yet this lesson still comes back to haunt us. How can that be?

It's simple: any premature investment of effort and time is generally a bad idea. This very much applies to premature architecture, for example, built for pieces of software that may get discarded a week later; or to overly enthusiastic early design discussions, which can easily spiral out of control and devolve into bike-shedding.

PDP-10 CPU, model KI-10. Has anyone ported Doom to this yet?

Finally, here's an excerpt from Structured Programming with go to Statements, the two paragraphs surrounding the famous quote:

There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail.
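That last point, measure before you optimize, is still the practical takeaway. Here's a minimal TypeScript sketch of it. sumLoop and sumReduce are stand-ins for any two candidate implementations, and a real investigation should reach for a proper profiler (Node's built-in profiler or the browser devtools) rather than a toy micro-benchmark like this one.

```typescript
// A toy "measure first" micro-benchmark.
// sumLoop and sumReduce stand in for any two candidate implementations.

const data = Array.from({ length: 1_000_000 }, (_, i) => i);

function sumLoop(xs: number[]): number {
  let total = 0;
  for (const x of xs) total += x;
  return total;
}

function sumReduce(xs: number[]): number {
  return xs.reduce((acc, x) => acc + x, 0);
}

function time(label: string, fn: () => number): void {
  const start = Date.now();
  const result = fn();
  console.log(`${label}: ${Date.now() - start} ms (result: ${result})`);
}

// Only after measuring do we know whether this code is critical at all.
time("for-of loop", () => sumLoop(data));
time("reduce     ", () => sumReduce(data));
```

Whatever the numbers turn out to be on your machine, that's the procedure Knuth is describing: identify the critical code by measurement first, optimize second.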


PDP-10 image: source, license

A newsletter for programmers

Yo! This is Taro. I've been writing JavaScript and TypeScript for years. I have experience with many programming languages, libraries and frameworks, on both the backend and the frontend, and in a few different company roles.

I learned a few things over the years. Some took more effort than I wish they had. My goal with this blog and newsletter is to help frontend and backend developers by sharing what I learned in a friendlier, more accessible and thorough manner.

I write about cool and new JavaScript, TypeScript and CSS features, architecture, the human side of working in IT, my experience and software-related things I enjoy in general.

Subscribe to my newsletter to receive notifications when I publish new articles, as well as some newsletter-exclusive content.

No spam. Unsubscribe at any time. I'll never share your details with anyone. 1 email a week at most.
