The Competitiveness of Nations in a Global Knowledge-Based Economy


Paul Romer

Beyond the Knowledge Worker

Worldlink, January/February 1995.



The growing demand for knowledge

Creating a new set of instructions

The best managers foster software innovation


On a spring night in 1983, a chemist employed by the Cetus Corporation had a wonderful insight. While working on a narrow problem, Kary Mullis stumbled onto the principles of the polymerase chain reaction (PCR). The story behind this discovery is exciting because the stakes involved turned out to be so high. Mullis won a Nobel Prize. Molecular biologists were given a powerful research tool. Doctors got new diagnostic tests. Police investigators could begin to lift DNA “fingerprints” from small and badly degraded biological samples. Discoveries this dramatic are rare, but the underlying elements in the story have broad relevance. They illustrate an important aspect of economic activity that often goes unnoticed in discussions of what it is that managers do.

The activities of a business are typically understood by using the factory as a metaphor. The three basic inputs are assumed to be capital equipment, workers and raw materials. Workers are further subdivided into production and non-production workers. In this image, production workers do the actual work in the factory, aided by equipment. The instructions they follow are taken as given. One imagines that at some time in the past, an engineer trained in the principles of time-and-motion studies was called in to divide tasks between workers and specify actions. This implies that non-production workers such as managers are there merely to see that the production workers follow their instructions. In academic discussions, this image of management is formalised in terms of a “principal-agent” problem. The manager is the principal who must supervise the activities of hired agents who might shirk. Popularly, this image is summarised by the view that what supervisors do is “kick butt and take names.”

But the factory never gave a complete picture of economic activity. Its treatment of non-production workers as regrettable overhead misrepresents production within manufacturing and is even more misleading in other sectors of the economy. Non-production workers do other tasks that are more important than seeing that production workers stay on the job, and these tasks are becoming ever more important in economic activity. In all sectors — even in manufacturing — the ratio of non-production to production workers has been increasing. Both the number of non-production workers and their wages have been growing over time, so some underlying factor must be increasing the demand for whatever it is that non-production workers like Kary Mullis do. The key to understanding the changes that are taking place in the economy is to develop an abstract image of what that is.


The growing demand for knowledge

Years ago, Peter Drucker pointed to a distinction that is more useful than the familiar one of production versus non-production workers or workers versus supervisors. Some people, including most production workers, work with physical objects. Others work with intangibles. He called them knowledge workers. This is a start, for Mullis and people like him certainly produce knowledge. It remains for us to understand where the growing demand for knowledge is coming from.

According to recent debate, the increase in demand is linked to the imminent arrival of the so-called information superhighway. In this analysis, production workers produce objects that get delivered to consumers on trucks; knowledge workers produce information that gets delivered to its consumers on a wire or through the air. The idea that knowledge workers are really just high-tech inputs into the entertainment industry fits with this industry’s exaggerated sense of self-importance. However, the total output from all of the industries that could conceivably send their products over a wire still accounts for a small fraction of economic output. Relatively few knowledge workers produce information that a consumer will enjoy watching, hearing or reading. Mullis certainly was not one of them.

Some people have made the more useful observation that the knowledge workers employed by the entertainment industry produce the software that runs on hardware systems such as home video players or cable networks. This points to a different and more fruitful way of understanding economic activity, one based on desktop computing.

The computing metaphor replaces the traditional categories of inputs (capital, raw materials, production and non-production workers) with three broad classes of inputs: hardware, software and wetware. Hardware includes all the physical objects used in production — capital equipment, computers, structures, raw materials, infrastructure and so on. Wetware captures what economists call human capital and what philosophers and cognitive scientists sometimes call tacit knowledge. It includes all the things stored in the “wet” computer of the human brain.

Software includes all the knowledge that has been codified and can be transmitted to others: literal computer code, blueprints, mechanical drawings, operating instructions for machines, scientific principles, folk wisdom, films, books, musical recordings, the routines followed in a firm, the literal and figurative recipes we use, even the language we speak. It can be stored as text or drawings on paper, as images on film, or as a string of bits on a computer disk or a laser disk. Once the first copy of a piece of software has been produced, it can be reproduced, communicated and used simultaneously by an arbitrarily large number of people.

In the most general sense, what knowledge workers such as Mullis do is produce software. As the entertainment enthusiasts have noted, software can be used with hardware and wetware to please a consumer. Watching a video at home requires a video cassette recorder (the hardware), the software stored on the tape, and at least a bit of wetware — the knowledge of how to operate the VCR or at least where to find the operating instructions for it. A skier uses ski equipment (hardware), an innate sense of balance and learned physical responses (wetware) and instructions spoken to her by her instructor (software).

The problem with most discussions of the information superhighway is that they neglect the important role that software plays as an intermediate input in production. In the desktop environment from which the computer metaphor springs, a writer uses her skills (wetware), a word processing package (software) and a personal computer (hardware) to produce a document. This document could be a book that someone reads for pleasure, but it could also be an analysis of sales in the last quarter or a memo describing new repair procedures for service personnel.

Software has always contributed to production, even in the days before digital electronics brought it dramatically to the foreground. For example, in the textile factories of the last century, software for guiding the actions of a power loom was stored on wooden cards with holes punched in them. More broadly, workers followed explicit instructions that they learned from managers, teachers and colleagues.

Software is just as important in production that takes place outside the factory. In the last 100 years, for example, some improvements in agricultural productivity can be traced to new inputs such as tractors and chemical fertilisers produced in factories according to better software instructions. Other improvements can be traced to improved instructions about how to manage a farm. Over this period, these instructions have been developed through agricultural research and disseminated by agricultural extension services. But ever since the neolithic revolution, people have been accumulating software about how to grow crops and manage domesticated animals. It accumulated through trial and error and spread through face-to-face contact.

Because of software’s unique capacity for simultaneous use by an arbitrarily large number of people, an innovation in software can have an impact that is felt on a massive scale. Some of the most important transitions in human history arose from the discovery of new methods for copying, storing and transmitting software. Examples include the introduction of written language, printing with moveable type, telecommunications and digital information processing. Many other revolutions were triggered by the discovery of new instructions for working with raw materials. Discoveries of this type include cereal cultivation methods; recipes for making iron, bronze, gunpowder and steel; techniques for making complicated mechanical goods from interchangeable parts; and methods for converting mechanical energy into electrical energy and back.


Creating a new set of instructions

The polymerase chain reaction that Mullis discovered is a set of instructions for working with biological materials. The instructions themselves are remarkably simple: put a small quantity of DNA, perhaps even a single molecule, into a test tube. Add some reagents. Repeatedly heat and cool the tube. The result is astonishing. With each cycle of heating and cooling, the number of identical copies of the DNA molecule will double.

One of the remarkable facts about the discovery of this technique is that all of the steps needed to make it work had existed for 15 years before Mullis put them together in just the right sequence. For years, scientists had been using the basic techniques to make single copies of sections of DNA. No one realised that this process could be repeated over and over, or if they did, they failed to understand the implications of repeated doubling. After the first cycle one molecule will be copied, leaving two identical molecules. After successive cycles there will be four, eight, 16, 32 and so on.

Even after Mullis had done his first experiment and shown that his idea would work, many scientists did not appreciate its significance. In the first few cycles of the reaction, the increase in the quantity of DNA does not seem very impressive but it soon begins to grow dramatically. After 10 cycles, the quantity of DNA has increased by a factor of 2 times itself 10 times, which is roughly equal to 1,000. After 20 cycles it has grown by 2 times itself 20 times, which is about 1 million. After 30 cycles the factor is 1 billion.
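These expansion factors are easy to verify with a short calculation. A minimal sketch in Python (the function name is illustrative):

```python
# Number of identical DNA copies after n PCR cycles, starting from a
# single molecule: each heating-and-cooling cycle doubles the count.
def copies_after(cycles):
    return 2 ** cycles

print(copies_after(10))  # 1024, roughly a thousand
print(copies_after(20))  # 1048576, roughly a million
print(copies_after(30))  # 1073741824, roughly a billion
```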

Humans have difficulty understanding how quickly the numbers grow in a sequence based on repeated multiplication — including Mullis himself, who first conceived of PCR while driving his car. Once he saw its implications, he tried to calculate these expansion factors in his head. Because they seemed too big to be believed, he pulled over to the side of the road to check his multiplication.

While driving that night, Mullis created something very valuable — a list of instructions that he could write down on paper which others could follow. Because it could be codified and transmitted to others, this software was soon being used by thousands of biologists all over the world. But the kind of production in which he was engaged was not the kind suggested by the factory metaphor. The company he worked for was a pharmaceutical firm, not a software firm, and Mullis was clearly not a production worker. He did not tend a vat that was brewing a batch of a drug or feed a machine that stamped out pills. He was not even one of the people who were discovering and testing the software (the chemical formulae for new drugs) that production workers on a production line would use. That is, he did not synthesise test quantities of a new drug or supervise clinical testing of its effects.

What Mullis did was write software that other people could use when they tried to create software that production workers would ultimately use. At a software company like Microsoft, the analogous activity would be to write software-based programming tools that the people who write the code for software applications could use. In the end, production workers would then use this code to make the good that Microsoft actually sells. They would repeatedly copy it onto floppy disks and put the disks in a box together with a manual.

Most of the goods that consumers value are physical objects, not intangibles. The most important use for software is therefore as an input in the production of valuable objects. Software is something that managers must manage, just as they manage the wetware of their workers and the hardware of their factories. Sometimes they buy it. Sometimes they coordinate the internal production of it. Sometimes they create it themselves.

At the deepest level, the most misleading aspect of the factory model of economic activity is the suggestion that all of the instructions — all of the software — in any production activity can be discovered and perfected in the beginning. The same power of multiplication that makes PCR valuable means that there will always be room for refinement in our software.

To see just how much scope there is for the discovery of new software, think first of literal software stored on a computer disk. Each position or bit in a long string of digits can take on two values: 0 or 1. If the disk has room for just two bits, it can hold 2 x 2 = 4 different computer programs: {0,0}, {0,1}, {1,0} and {1,1}. If it has room for 10 bits, it can store 2 times itself 10 times, or 1,024 different programs. The possibilities here grow just as the DNA grows in PCR. Twenty bits can store about 1 million different sequences of zeros and ones, 30 bits about 1 billion and so on.
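This counting argument can be checked directly by enumerating the bit strings. A small sketch in Python:

```python
from itertools import product

# Every "program" that fits in 2 bits: 2 x 2 = 4 distinct bit strings.
two_bit_programs = list(product([0, 1], repeat=2))
print(two_bit_programs)  # [(0, 0), (0, 1), (1, 0), (1, 1)]

# With n bits the count is 2**n: 10 bits allow 1,024 distinct programs.
print(len(list(product([0, 1], repeat=10))))  # 1024
```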

A typical computer hard disk has room for millions of bits, so the number of possible computer programs (that is, different bit strings) that could be installed on a desktop computer is too large to comprehend. For example, the number of distinct software programs (sequences of zeros and ones) that can be put on a one-gigabyte hard disk is roughly one followed by 2.4 billion zeros. This number of possible bit strings is very large relative to the physical quantities in our universe. A rough guess is that the total number of seconds that have elapsed since the big bang created the universe is about 1 followed by 17 zeros. Another rough guess is that the total number of atoms in the universe is equal to about 1 followed by 100 zeros.
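The order of magnitude of the hard-disk figure can be reproduced without ever forming the gigantic number itself, using logarithms. A back-of-the-envelope sketch in Python, assuming a gigabyte of 10^9 bytes (8 x 10^9 bits):

```python
import math

# The number of bit strings on n bits is 2**n; its decimal length is
# floor(n * log10(2)) + 1 digits, so the huge number is never needed.
def decimal_digits(n_bits):
    return math.floor(n_bits * math.log10(2)) + 1

bits_on_disk = 8 * 10**9  # one gigabyte, taken as 10**9 bytes
print(decimal_digits(bits_on_disk))  # about 2.4 billion digits
```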

The number of possibilities can be very large, even for very simple hardware systems. Suppose that a worker must assemble 20 numbered parts to make a final good. The software for this system is merely a description of the sequence the worker should use for assembling the parts. The worker could start with part number one, go on to connect part number two, then part number three and so on, proceeding in sequence. Or she could start with part number 13, connect part number 11, then part number 17 and so on. The number of different assembly sequences is equal to about 1 followed by 18 zeros, more than the number of seconds since the Big Bang.
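The count of assembly sequences is simply the number of orderings of 20 numbered parts — 20 factorial. A quick check in Python:

```python
import math

# Number of distinct orders in which 20 numbered parts can be assembled:
# 20 choices for the first part, 19 for the second, and so on.
sequences = math.factorial(20)
print(sequences)  # 2432902008176640000, about 2.4 * 10**18
```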

In actual assembly operations, which can involve thousands of parts, there are many assembly sequences that have never been tried. Many of them will be much worse than the sequence currently being used. But almost surely there are others that would generate important efficiency gains. The best managers understand this potential for improvement and encourage workers on the assembly line to consider alternative assembly sequences, to experiment and to communicate their successes to others. In effect, they have made their production workers into knowledge workers as well.

Computers illustrate the potential and the peril that lies in the search for better software. The overwhelming majority of randomly generated bit strings that could be stored on a disk will generate useless garbage when the computer first starts up. Very similar pieces of software can have very different values. A carefully crafted piece of software can do wonderful things, but if even one bit in the program is changed its functionality can be destroyed.

The complexity of separating good from bad in unexplored software territory tends to encourage many managers to leave well enough alone. They do not encourage experimentation and they sometimes fail to pursue promising paths if they lead into unfamiliar terrain. But the big gains for society and big profits for firms will come from innovations in software. This is the lesson that has emerged so clearly from the experience of the computer industry. Firms like Microsoft and Intel earn the bulk of profits in their industries because they control the best software.


The best managers foster software innovation

Contrast their experience with that of photographic film giant Kodak, which tried to enter the business of making and selling floppy disks. No matter how talented its marketers, strategic planners and production workers, there was no way for Kodak to make money in this rapidly growing market because it did not control any proprietary software that it could use over and over again. The situation had been very different in its core area of photography, where Kodak’s software for making colour film had been protected by patents.

The point is equally true in other areas. For example, Wal-Mart is the most successful retail chain in the world because it has the best software for running a discount retail store, and it uses that software over and over again in stores throughout North America and perhaps eventually in the rest of the world.

So the best managers at the best companies foster software innovation at all levels, from the assembly line to the research lab. They understand the problems and develop testing systems that keep the downside risks under control. And when they see software that is new and better, they make sure that it gets used over and over again and that their firm gets a small share of the benefits each time it is used.

Soon after Mullis made his discovery, PCR was recognised as a research tool, a diagnostic tool and a forensic tool. But it was not a pharmaceutical product and Cetus was in the pharmaceutical business. As in most start-up firms, financial and scientific resources were scarce and senior managers did not want to divert them to an area that was outside Cetus’s core business. Fortunately for the shareholders of Cetus, there was one inspired manager who saw the potential and protected Mullis and his discovery. He created joint ventures with outside partners to develop PCR-based products. Ultimately, his efforts led to the sale of Cetus’s rights in PCR to Hoffmann-La Roche for $300 million. Had it not been for this manager, PCR might have been just another story of how a major technological breakthrough was neglected by the company that sponsored the original research and ended up being commercialised by some other firm.

The firms controlled by managers who understand what is at stake are being transformed to take advantage of the potential for discovering valuable kinds of software. At these firms, a relatively small fraction of the workforce is engaged in pure production, copying floppy disks or loading boxes of pills onto trucks. Most workers are knowledge workers engaged in discovering, testing and refining software. These are the activities that will lead to the biggest gains for business and for society as a whole. But the firms controlled by managers who do not understand the deeper issues will end up like Kodak, living off of slowly eroding profits from an earlier generation of software, wondering why the distinctive black and yellow packaging that seemed to work so well in the film business did not lead to profits in the computer industry.