Moore’s Not Enough: 4 New Laws of Computing

I teach technology and information-systems courses at Northeastern University, in Boston. The two most popular laws that we teach there—and, one presumes, in most other academic departments that offer these subjects—are Moore’s Law and Metcalfe’s Law. Moore’s Law, as everyone by now knows, predicts that the number of transistors on a chip will double every two years. One of the practical values of Intel cofounder Gordon Moore’s legendary law is that it enables managers and professionals to determine how long they should keep their computers. It also helps software developers to anticipate, broadly speaking, how much bigger their software releases should be.
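Moore’s doubling-every-two-years rule can be sketched as a simple exponential projection. This is a toy illustration, not a calculation from the article; the starting figure of roughly 2,300 transistors for the Intel 4004 in 1971 is a well-known historical number used here only as a seed.

```python
def transistors(start_count, start_year, year):
    """Project a transistor count forward, assuming Moore's Law:
    the count doubles every two years."""
    return start_count * 2 ** ((year - start_year) / 2)

# Ten years of doubling every two years means five doublings, i.e. 32x:
print(round(transistors(2300, 1971, 1981)))  # 73600
```

The same formula, run in reverse, is what lets a manager estimate how quickly a machine bought today will fall behind the state of the art.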

Metcalfe’s Law is similar to Moore’s Law in that it, too, enables one to predict the direction of growth for a phenomenon. Robert Metcalfe, co-inventor of Ethernet and a pioneering innovator in the early days of the Internet, postulated that the value of a network grows in proportion to the square of the number of its users. One limitation of this law is that a network’s value is difficult to quantify. Furthermore, it is not clear that the value of every network actually grows quadratically with its size. Nevertheless, this law, like Moore’s Law, remains a centerpiece of both the IT industry and academic computer-science research. Both laws provide tremendous power to explain and predict the behavior of seemingly incomprehensible systems and phenomena in the sometimes inscrutable world of information technology.
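Metcalfe’s quadratic relationship is easy to state in code. The sketch below is a minimal illustration of the law’s form; the proportionality constant and the unit of “value” are arbitrary assumptions, which is precisely the quantification problem noted above.

```python
def metcalfe_value(users, k=1.0):
    """Metcalfe's Law: network value proportional to users squared.
    The constant k (and the unit of value) is an arbitrary assumption."""
    return k * users ** 2

# The law's signature behavior: doubling the user base quadruples the value.
print(metcalfe_value(200) / metcalfe_value(100))  # 4.0
```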

I contend, moreover, that there are still other regularities in the field of computing that could also be formulated in a fashion similar to that of Moore’s and Metcalfe’s relationships. I would like to propose four such laws.

Law 1. Yule’s Law of Complementarity

I named this law after George Udny Yule, the statistician who in 1912 proposed the seminal equation for describing the relationship between two attributes. I formulate this law as follows:

If two attributes or products are complements, the value/demand of one of the complements will be inversely related to the price of the other complement.

In other words, if the price of one complement is reduced, the demand for the other will increase. There are a few historical examples of this law. One of the famous ones is the marketing of razor blades. The legendary King Camp Gillette gained market domination by applying this rule. He reduced the price of the razors, and the demand for razor blades increased. The history of IT contains numerous examples of this phenomenon, too.
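One hedged way to picture the razor-and-blades relationship is a toy linear demand model with a negative cross-price term, which is how economists describe complements. Every coefficient below is invented purely for illustration; nothing here comes from Gillette’s actual pricing.

```python
def blade_demand(razor_price, base=1000.0, cross_sensitivity=50.0):
    """Toy demand curve for blades as a function of the *razor's* price.
    Complements have a negative cross-price relationship, so a cheaper
    razor means more blade demand. All coefficients are hypothetical."""
    return max(0.0, base - cross_sensitivity * razor_price)

# Cutting the razor's price from $10 to $2 lifts demand for blades:
print(blade_demand(2.0), blade_demand(10.0))  # 900.0 500.0
```

The strategic question for a firm, then, is which side of such a pair to discount and which side to harvest.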

The case of Atari 2600 is one notable example. Atari video games consisted of the console system hardware and the read-only memory cartridges that contained a game’s software. When the product was released, Atari Inc. marketed three products, namely the Atari Video Computer System (VCS) hardware and the two games that it had created, the arcade shooter game Jet Fighter and Tank, a heavy-artillery combat title involving, not surprisingly, tanks.

Crucially, Atari engineers decided that they would use an off-the-shelf microprocessor for the VCS instead of a custom chip. They also made sure that any programmer hoping to create a new game for the VCS would be able to access and use all the inner workings of the system’s hardware. And that was exactly what happened. In other words, the designers reduced the barriers and the cost necessary for other players to develop VCS game cartridges. More than 200 such games have since been developed for the VCS—helping to spawn the sprawling US $170 billion global video game industry today.

A similar law of complementarity exists with computer printers. The more affordable the price of a printer is kept, the higher the demand for that printer’s ink cartridges. Managing complementary components well was also crucial to Apple’s winning the MP3 player wars of the early 2000s, with its now-iconic iPod.

From a strategic point of view, technology firms ultimately need to know which complementary element of their product to sell at a low price—and which complement to sell at a higher price. And, as the economist Bharat Anand points out in his celebrated 2016 book The Content Trap, proprietary complements tend to be more profitable than nonproprietary ones.

Law 2. Hoff’s Law of Scalability

This law is named after Marcian Edward (Ted) Hoff Jr.—the engineer who convinced the CEO of Intel to apply the law of scalability to the design and development of processors. The phenomenon of scalability was, of course, well known in the automobile industry before it made a significant impact on computing, and Henry Ford’s company was perhaps the first to apply it on a grand scale. At the core of Ford’s achievement was the design of an automobile made for mass production: the Model T, the first mass-produced car. Ford’s engineers broke the Model T’s assembly process down into 84 discrete steps, standardized all the tasks, and assigned each worker to do just one task, thus standardizing the work each worker performed as well. Ford further built machines that could stamp out parts automatically. Together with Ford’s innovative development of the first moving assembly line, this production system cut the time to build a car from 12 hours to about 1.5 hours. The Model T is probably the paradigmatic example of how standardization enables designing processes for scalability.