
Nathan Rosenberg

Exploring the Black Box: Technology, Economics, and History

Chapter 13: Scientific instrumentation and university research

Cambridge University Press, Cambridge, UK, 1994, pp. 250-263

Contents

Introduction

The importance of scientific instruments

The diffusion and impact of scientific instrumentation

Diffusion across disciplines

Diffusion from the academic laboratory to industry

Diffusion from industry to the wider research community

The role of instrumentation in shaping science and technology

Conclusions

Introduction

The purpose of this chapter is to examine certain roles played by American research universities in the development of an important category of technology: scientific instruments.  In the years since the Second World War the research universities performed much more complex functions than can be summarized in the statement that they served as the main centers for the performance of basic research, although that is obviously fundamental. [1]  In addition, within the university context, and in connection with the performance of basic research, there took place a complex interplay between scientific and technological forces that led to other potentially significant outcomes.  Obviously, the immediate increments to knowledge resulting from basic research itself are sometimes of the greatest economic significance.  However, I will suggest that there have been paths of influence and causation that have not yet been systematically identified or examined, much less measured.  I will further suggest that the emergence and diffusion of new technologies of instrumentation (as well as new research methodologies) are central and neglected consequences of university basic research.  As a result, the eventual economic impact of basic research, taking place in a particular academic discipline, has commonly expressed itself through the medium of new instrumentation technologies and the subsequent life histories of these new technologies.  This chapter attempts to provide some preliminary mapping of such lines of influence.

This chapter is reprinted, with small changes, from Research Policy, 21(1992), pp. 381-390. I acknowledge the great benefit derived from conversations with Harvey Brooks.  Valuable comments were also received from Marvin Chodorow, Sir Aaron Klug, W.E. Steinmueller, and an anonymous referee of Research Policy.  The financial support for the research on which this chapter is based was provided by the Technology and Economic Growth Program of the Stanford University Center for Economic Policy Research.

1. For example, although it will not be discussed, the research universities also performed a great deal of applied research across the whole range of engineering disciplines, as well as in metallurgy and materials science, in medicine and pharmacy, and in agriculture.  It may be added that the overemphasis upon the contributions of American universities should be attributed, not to chauvinism, but to the author’s comparative ignorance of developments elsewhere.



What follows, then, is obviously exploratory and not definitive.  Nevertheless, if its central conclusion is correct, this chapter points to the importance of a more thorough examination of the role played by university research as the source of a highly influential category of modern technology: instruments of observation and measurement.  Moreover, if this role is eventually judged to be highly significant, it would appear that the economic benefits of university research are being substantially underestimated.


The importance of scientific instruments

Scientific instruments may be usefully regarded as the capital goods of the research industry.  That is to say, the conduct of scientific research generally requires some antecedent investment in specific equipment for purposes of enhancing the ability to observe and measure specific categories of natural phenomena.  Moreover, much of the scientific instrumentation that is now in existence had its historical origins in the conduct of basic research - specifically, in the attempt to advance the frontier of scientific knowledge through an expansion in observational or experimental capabilities.  In this sense, a central part of the “output” of the university research enterprise has been much more than just new theories explaining some aspects of the structure of the universe, or additional data confirming or modifying existing theories.  A further output (or by-product) has been more powerful and versatile techniques of instrumentation including, in many cases, the ability to observe or measure phenomena that were previously not observable or measurable at all.  New instrumentation has thus often been an unintentional and, to a surprising extent, even unacknowledged product of university research.

A common denominator among a wide range of scientific instruments is that they were initially designed in response to some very specific, narrowly defined requirement of research in a particular discipline.  However, after an instrument's successful development, it frequently became apparent that it had useful applications in some other scientific realm - whether basic or applied - although these often required substantial modification or redesign.  The analogy with more conventional capital goods should be apparent here.  Machine tools originally designed to meet the specific requirements of textile or locomotive or musket manufacturers were later transferred to manufacturers of sewing machines, bicycles, typewriters, and automobiles.  Such transfers have been numerous and diverse. [2]  Similarly, scientific instruments designed to improve technical capability or to solve one set of research problems have often turned out to have applications in disciplines and technology sectors far from those where they originated.

The most spectacular of such transfers has involved the computer.  Computers are, of course, the scientific instrument par excellence; their origins can be traced to research conducted in several countries, although the research context from which they originally sprang is now largely forgotten.  In the past thirty years, computers have become indispensable wherever extensive calculations are made - which is to say everywhere in the scientific world.  The demand for greater calculating capability turned out to be enormous when the cost of computing was reduced by many orders of magnitude.  The computer has made possible many kinds of research activities that would have been simply impossible if computational costs and capabilities had remained frozen at the levels which prevailed at the outbreak of the Second World War.  Moreover, much of the progress in research capability in the past couple of decades has occurred by linking other new scientific instruments to the computer.  This includes computer control of a wide range of experiments that could hardly have been undertaken in its absence.  In addition, the availability of powerful computers has opened up the possibility of large-scale simulation of physical and biological processes.

At the same time, the computer has spread into uses in business, government, medical care, and private households which are extremely remote from its scientific points of origin, and certainly very far from the specific purposes that dominated the thinking of the pioneers of computing.  A quick stroll, for example, through the intensive care unit of any major hospital will disclose a number of essential technologies that are directly dependent upon the computer for the continuous monitoring of vital signs: blood pressure, respiratory rate, pulse rate, and cardiac rhythm.

A common denominator among many of the pioneers in developing the computer - Howard Aiken at Harvard, John Atanasoff at Iowa State University, Konrad Zuse in the German aircraft industry, and John P. Eckert, Jr. and John W. Mauchly at the University of Pennsylvania - is that their contributions grew out of the extremely tedious and time-consuming computational requirements they confronted in their research work, typically involving solutions to large systems of differential equations. [3]  Interest in useful applications of this capability outside the sphere of research (including military R&D during the Second World War) was, for a long time, limited or non-existent. [4]

2. See Nathan Rosenberg, Technological Change in the Machine Tool Industry, 1840-1910, Journal of Economic History (December 1963).  Reprinted as chapter 1 in Nathan Rosenberg, Perspectives on Technology, Cambridge University Press, Cambridge, 1976.

3. See David Ritchie, The Computer Pioneers, Simon & Schuster, New York, 1986.

4. For a further discussion of the inability to foresee the economic consequences of the computer see chapter 11.  See also Paul Ceruzzi, An Unforeseen Revolution: Computers and Expectations, 1935-1985, in Joseph J. Corn (ed.), Imagining Tomorrow, MIT Press, Cambridge (MA), 1986, pp. 188-201.


The diffusion and impact of scientific instrumentation

The computer has been, of course, strictly sui generis.  No other scientific instrument has had anything like its immensely diverse range of applications.  Nevertheless, a detailed history of the development of instrumentation in the twentieth century would probably reveal an inventive process similarly dominated by the requirements of academic research.  The subsequent diffusion paths of this instrumentation have been highly complex, but there are three aspects that need to be stressed.

 

Diffusion across disciplines

Instrumentation and techniques have moved from one scientific discipline to another in ways that have been very consequential for the progress of science.  In fact, it can be argued that a serious understanding of the progress of individual disciplines is often unattainable in the absence of an examination of how different areas of science have influenced one another.  Moreover, this understanding is frequently tied closely to the development, the timing, and the mode of transfer of scientific instruments among disciplines.  The flow appears to have been particularly heavy from physics to chemistry, as well as from both physics and chemistry to biology, to clinical medicine, and ultimately, to health-care delivery. [5]  There has also been a significant flow from chemistry to physics, and in recent years from applied physics and electrical engineering to health care.  The transistor revolution was a direct outgrowth of the expansion of solid-state physics, but the success of that revolution was in turn heavily dependent upon further developments in chemistry and metallurgy which made available materials of a sufficiently high degree of purity and crystallinity.  It would be most interesting to understand better than we do at present why the traffic is so heavy in some disciplinary directions but so light in others. [6]

5. The National Research Council Physics Survey Committee noted that “Many physical techniques have become so fully integrated into biological research that their origin in physics is forgotten until some underlying physical advance in the method provides a reminder; recent examples include various spectroscopies, electron microscopy, X-ray crystallography, and nuclear resonance.”  Scientific Interfaces and Technological Applications, Physics Through the 1990s, National Academy Press, Washington (DC), 1986, pp. 27-28.

One relevant point, however, is clear.  The availability of new or improved instrumentation or experimental techniques in one academic discipline has been a frequent cause of interdisciplinary collaboration.  In some cases, it has involved the migration of scientists from one field to another, such as those physicists from the Cavendish Laboratory in Cambridge who played a major role in the emergence of molecular biology.  This amounted to interdisciplinary research in the special sense that individuals trained in one discipline crossed traditional scientific boundary lines and brought the intellectual tools, concepts, and experimental methods of their field to the assistance of another.  There have been a number of other instances where the availability of novel instrumentation has been crucial to the establishment of new disciplines, as in the cases of geophysics, computational physics, and artificial intelligence.

The story of the migration of scientific instruments from their points of origin to their utilization in other disciplines is an under-researched topic meriting several monographs, at the very least.  It is interesting to note that much of the transfer from physics to other disciplines, as already suggested, has involved the migration of labor as well as capital: PhDs in physics, for example, have changed or transferred fields in greater numbers than PhDs in other disciplines.  This point was emphasized by the United States National Research Council’s Physics Survey Committee, which reported in 1986:

Much of the outward mobility of physics PhD’s has been into engineering and interdisciplinary areas such as geophysics, materials research, and biophysics; but PhD physicists also work in areas ranging from chemistry to the biosciences.  Some of this mobility occurred within academe where physicists teach and conduct basic research in related science and engineering departments.  Most of it, however, occurred in the industrial sphere where applications of physics research move easily across disciplinary barriers. [7]

6. Ibid., p. 54.

7. An Overview, Physics Through the 1990s, National Academy Press, Washington (DC), 1986, p. 99.

The transfer of scientific instrumentation from one field to another has been an intrinsic part of the history of scientific research for several decades.  The electron microscope, for which a Nobel Prize in Physics was awarded several years ago, was rapidly adopted throughout the entire range of the biological as well as the physical sciences.  Particle accelerators, which were originally devised to examine the structure of the atomic nucleus, eventually exercised a major impact on medical research and treatment through their role in producing radioisotopes.  Isotope tracer techniques have been of fundamental importance in both medical diagnostics and biological research.  Nuclear magnetic resonance (NMR) is a classic instance of a tool of pure science developed by physicists at Harvard and Stanford Universities in order to measure the magnetic moments of atomic nuclei - an innovation for which, again, the creators received the Nobel Prize in Physics.  The technique quickly became a fundamental tool in analytical chemistry.  More recently, the technology has been transferred to the biological sciences and the realm of medicine, where magnetic resonance imaging has become invaluable in clinical diagnosis:

Whole-body scanning by NMR provides sectional images of the human body of remarkable clarity and with none of the potential hazards of X-ray scanning.  There is now discussion that NMR may one day allow doctors to observe human metabolism without surgical procedure, moving us one step closer to the possibility of knifeless biopsy. [8]

NMR is far from unique as a technique that originated purely as a scientific research tool and was subsequently introduced into medical diagnostics.  Computerized X-ray transmission tomography (CT), which was developed in the 1970s (primarily in the United Kingdom), is widely regarded as the most significant single step forward in medical imaging during the twentieth century.  Together with ultrasonics, which is widely used by cardiologists and obstetricians, it forms part of an impressive array of relatively non-invasive diagnostic technologies (high-frequency ultrasound is also acquiring an important therapeutic application in the fragmentation of kidney stones through the use of lithotripters).  The National Research Council Physics Survey Committee was thus able to conclude that:

The record clearly shows that most innovation in medical instrumentation since the turn of the century, even that of the past few decades, has come from the universities and medical schools and not from the medical-device industry. [9]

8. Scientific Interfaces, Physics Through the 1990s, p. 86.

9. An Overview, Physics Through the 1990s, p. 256.

Diffusion from the academic laboratory to industry

The transfer of instrumentation from one field of science to another, or from basic to applied problems, is only a part of the story of the eventual impact of instruments originating in university laboratories.  Instrumentation developed by academic scientists has, in the post-Second World War years, also moved in massive amounts into many areas of industrial technology.  Indeed, much of the equipment, perhaps most, that one sees today in an up-to-date electronics manufacturing plant had its origin in the university research laboratory.  In this sense, scientific instruments are now effectively indistinguishable from industrial capital goods.  Consider the following:

a. Ion implantation originated as a technique of basic scientific research in the field of high-energy particle physics.  Its origin lay in the early work in particle physics which flowed from the recognition that magnetic and electric fields could be used to impart energy to particles.  Methods of charging, accelerating, and directing these ion beams were developed in order to elucidate theories of physics.  As the frontier of very large-scale integration created a need for controlling the deposition of impurities on semiconductor devices with ever-higher degrees of precision, ion-implantation techniques were transferred to the semiconductor industry.  It now constitutes the preferred technique of deposition in integrated circuit technology. [10]

b. It is conceivable that the transfer of ion-implantation techniques from the research laboratory to the semiconductor industry may be partially duplicated with the use of synchrotron radiation sources, which already offer several potentially useful techniques for improving the manufacture of integrated circuits.  In the late 1970s, synchrotron radiation moved from merely being an annoying side effect in experimental high-energy physics to assume a more positive role in condensed-matter physics and biology.  As the current methods of X-ray lithography approach their limits in the realm of submicron lithography, the instrumentation of synchrotron radiation is becoming directly applicable to the manufacturing requirements of integrated circuits.  This could have significant consequences for international competition in electronics.  As matters now stand, Japanese firms, organized in consortia, have already moved vigorously into this new technology, with more than ten synchrotron storage rings under development for use in manufacturing integrated circuits.  On the other hand, although the United States has many such rings for research purposes, IBM is the only American firm that is currently building a synchrotron storage ring for commercial use.  This venture into X-ray radiation sources represents a high-risk activity.  Not only is IBM’s emerging technology extremely complex and expensive, but alternative and cheaper circuit-etching technologies may be available by the mid-1990s, when IBM’s new method is expected to become sufficiently mature to enter production.

c. The most important recent advance in semiconductor processing is phase-shifted lithography.  This technique is an application of interferometry that achieves higher resolution by causing two beams of monochromatic light to interfere, producing precise patterns on a chip.  Although this particular application is new, interferometers have been important scientific instruments since the early part of the twentieth century.  In fact, the interferometer had been invented in the 1880s by A.A. Michelson, America’s first Nobel Prize winner, in order to test the classical Newtonian concept of absolute motion.

d. The scanning electron microscope, a scientific research tool of great sophistication, has migrated from its origins as a research tool at Cambridge University to the world of manufacturing technology.  It has become an indispensable measurement tool in microelectronics fabrication, where the elements of memory chips are now at a scale too small to be resolved with optical microscopes.

The semiconductor industry is hardly unique in its experience of transferring research instrumentation, as opposed to transferring knowledge derived from research, from the university laboratory to the factory floor.  Similar statements could be made about the advanced technology of industrial process control, robotic sensing, and a variety of other more specialized instrumentation applications.  Also in a similar category are: the diffusion of techniques for the production, measurement, and maintenance of high vacuums in larger and larger volumes; the transfer of cryogenic techniques from laboratory to large-scale industrial use (as in booster rockets); and industrial-scale superconducting magnets, which had their origins in experimental physics.  The common denominator running through and connecting all these experiences is that instrumentation developed in the pursuit of scientific knowledge eventually had direct applications within the manufacturing process.  Consequently, such instruments constitute benefits of basic research activity which are separate and distinct from those flowing from pure scientific knowledge and the eventual applications of that knowledge.

10. Scientific Interfaces, Physics Through the 1990s, chapter 8.


Diffusion from industry to the wider research community

There is a further dimension to the connection between laboratory instrumentation and commercialization which deserves recognition.  Many instrumentation technologies originating in university laboratories have eventually been taken up and exploited by profit-making firms.  This has typically resulted in standardized off-the-shelf equipment which provides improved performance and versatility at a much-reduced cost.  The result is that the instrumentation diffuses rapidly throughout both industry and the wider university research community.  This process has vastly expanded the size of the industrial and research populations to which the instrumentation was accessible.

One essential aspect of this expansion in use has been the modification of design so that instruments can be employed by people with lower levels of training.  Often, in fact, it has proven worthwhile to redesign to lower performance ceilings in order to permit the substitution of automatic control for control by a highly trained operator. [11]  Thus, the ultimate benefits have flowed not only to the industrial world, but in some considerable measure back to a much larger scientific research community whose members have been provided with greater access to necessary instrumentation.

In this respect, an important and insufficiently appreciated aspect of the high level of performance of American science has been the emergence of a strong scientific instruments industry in the United States.  The entrepreneurial efforts of this industry, including the fruitfulness of its interactions with university researchers, who were frequently both the designers and the users of the innovation, have been of immense value to the scientific community. [12]  Firms have had a strong incentive to find new markets for existing instruments and thereby to expand the population of users, who often turned out to be other scientists.

The benefits resulting from the successful commercialization of new scientific technologies are more than a matter of individual instruments.  Rather, the migration of scientific instruments to industry has been matched by a reverse flow of fabrication and design skills that have vastly expanded the capacity of university scientists to conduct research.  This is perhaps most apparent in the ways in which micro-fabrication technologies have made possible the conduct of new fundamental research in fields such as condensed-matter physics:

The ability to produce structures on a nanometer scale has facilitated recent investigations into such areas as conduction electron localization, non-equilibrium superconductivity, and ballistic electron motions.  Microscience is becoming an area of increasing activity in solid-state research laboratories.  Indeed, one of the major reasons for which the National Science Foundation established a National Submicron Facility in the late 1970s was to help to make this impressive microfabrication technology available to scientists for fundamental research. [13]

In short, the interplay between universities and private industry in the development of new and improved techniques of instrumentation has clearly been and will probably continue to be a symbiotic one.

11. A high-resolution electron microscope of Japanese design was installed at Cambridge University in 1990.  The earlier high-resolution electron microscope, built by the Cambridge engineering and physics faculty in the 1970s, remained capable of attaining higher levels of performance than its automatically controlled successor, but only in the hands of a skilled faculty member.  (I am grateful to Dr. W.C. Nixon of Peterhouse College, Cambridge, for his guided tour of this facility and his patient explanation.)

12. Eric von Hippel has paid particular attention to the dominant role played by users in the scientific instruments industry.  See Eric von Hippel, “Users as Innovators,” chapter 2, The Sources of Innovation, MIT Press, Cambridge (MA), 1987.

13. Scientific Interfaces, Physics Through the 1990s, p. 141.



The role of instrumentation in shaping science and technology

So far, the discussion has focused on ways in which novel instrumentation has been initiated and generated by the requirements of basic research.  The main avenues along which the influence of new instruments has been diffused have also been identified.  It is now appropriate to call attention to another set of influences that run from technology “upstream” to basic science, and which similarly have been badly neglected.

Two key points need to be made.  First, a new instrument, once available, usually requires further development, including sometimes basic research, in order to improve its performance.  Second, these new lines of research, triggered initially by the needs of instrumentation, often subsequently acquire a dynamic and significance of their own.

Examples here are: the computer in its 1946 form, as the ENIAC at the University of Pennsylvania; the first transistor in its 1947 form at Bell Labs; and the first ruby laser in 1960.  In each case, the new technology had very poor performance characteristics.  The ENIAC was a gigantic and clumsy apparatus, more than 100 feet long, with approximately 18,000 vacuum tubes that consumed over 100 kW of electricity.  The transistor, which, among other things, eventually transformed the computer by eliminating its dependence upon vacuum tubes, was itself initially unreliable and sometimes behaved in unpredictable ways.  The laser was, until quite late, regarded more as a scientific curiosity than as a technological innovation.  The patent attorneys at Bell Labs were at first reluctant even to apply for a patent on the grounds that there was no apparent application in the communications industry.

A common feature of new instruments, then, is that their initial performance levels are poor and/or unpredictable.  They may also require components or materials which possess characteristics not presently available, or available only at very high cost.  Sometimes, their apparent potential can be realized only if a particular scientific or technical bottleneck is overcome.  In most cases, the availability of a new instrument (or technique) has therefore given rise to intense research activity stimulated by the need to improve its performance, to develop some ancillary technology, or to identify a cheaper or more reliable material base.  The transistor, coupled with the evident potential of semiconductors, resulted in an explosion of research in solid-state physics and the physics of surface phenomena in the late 1940s and 1950s.  The number of basic publications in semiconductor physics rose from fewer than twenty-five per annum before 1948 to over 600 per annum by the mid-1950s. [14]  In the early 1950s, as the transistor experienced a widening range of applications, serious reliability problems emerged.  The defects were eventually traced to surface phenomena and, consequently, a great deal of basic research needed to be undertaken.  In the end, the effort to solve these reliability problems in the performance of transistor components led to much fundamental new knowledge in the area of surface physics.

14. C. Herring, “The Significance of the Transistor Discovery for Physics,” Bell Telephone Laboratories, unpublished manuscript, no date.

The development of the laser suggested, among other things, the possibility of using optical fibres for transmission purposes.  This resulted in a burgeoning of research in the field of optics, a scientific subdiscipline which had been a relatively quiet intellectual backwater until that time.  The growth of activity in the discipline was thus generated, not by forces internal to the field of optics, but by a radically altered assessment of the potential opportunities for laser-based technologies.  Moreover, different kinds of lasers gave rise to different categories of fundamental research.  As Brooks has noted, “While the solid-state laser gave a new lease on life to the study of insulators and of the optical properties of solids, the gas laser resuscitated the moribund subject of atomic spectroscopy and gas-discharge physics.” [15]

The conclusion which can be drawn is that, in the years since the Second World War, a succession of new technological capabilities in instrumentation has played a major role in shaping the agenda of research in universities and elsewhere.  These connections have not been well recognized, in part for reasons that are inherent in the nature of scientific research.  Questions which are initially raised by some particular observation or performance anomaly in a special context have a way of raising new questions of much greater generality.  Further questions or implications are eventually raised as a result of the findings of further research and, consequently, still further questions of a more fundamental nature are posed.  In a very serious sense, the new questions take on a life of their own as they are pursued far beyond the requirements of the technologies that initially gave rise to them.  Thus, the need for highly perfect crystals in semiconductor technology produced an immense stimulus to classical crystal physics and chemistry.  Although Shockley had been very interested in dislocations in the late 1940s, the great expansion in such interest and the emergence of a science of imperfections in crystals in the 1950s owed very much to the growing needs of semiconductors. [16]  Moreover, the working materials in the early experiments tended to be silicon and germanium simply because industrial requirements had already led to methods of crystal growth and purification for these materials that were far more advanced than for other substances.  The semiconductors also turned out to be excellent materials for observing individual dislocations and their electronic effects.  Ultimately, scientific study which had been powerfully stimulated by the attempt to improve the performance of transistors in a variety of electronic devices led to a new approach to the subject of dislocations, emerging eventually as a theory of great power and generality, in no way restricted to the concern with transistor effects or the class of semiconductor materials that gave rise to the research in the first place.

15. Harvey Brooks, Physics and the Polity, Science (26 April 1968), p. 399.

16. Shockley was one of the editors of the volume, Imperfections in Nearly Perfect Crystals, financed by the Office of Naval Research.  This book was a landmark in the emergence of the new discipline of imperfections.  Although published in 1952, its contents were based on a symposium conducted in October 1950.  See W.B. Shockley, J.H. Holloman, R. Maurer, and F. Seitz (eds.), Imperfections in Nearly Perfect Crystals, John Wiley, New York, 1952.

The instrumentation requirements of university research have thus had consequences far beyond those suggested by thinking of instruments simply as an expanding class of devices that are useful for observation and measurement.  Furthermore, these instruments have played more pervasive, if less visible, roles, which include making a direct impact upon industrial capabilities, on the one hand, and stimulating more fundamental research, on the other.  This even includes a role of great importance in redefining and expanding the agenda of fundamental university research in both scientific and technical disciplines.

It is possible to go a step further.  It follows from what has been said that the rate of progress, and the timing of progress, in individual scientific disciplines may be shaped, to a considerable degree, by the transfer of instruments, experimental techniques, and concepts from one scientific discipline to another.  But the timing of these transfers, and the circumstances that are conducive to them, have not yet been studied, as far as I know, in a very systematic way, and are not, as a result, very well understood.  It is therefore possible that more research along these lines may powerfully illuminate the course of scientific progress in the twentieth century. And, needless to say, the scope of such research must be international.


Conclusions

It seems natural at this point to pose the question: what were the consequences of the role played by the research university in the development of scientific instruments, as it has been characterized here, upon the operation of the economy?  Such a question necessarily poses the counterfactual: how would the performance of the economy have differed in the absence of the university’s research capability?

One possible response is to conclude that all the instrumentation technologies would eventually have been developed anyway, but that they would have taken longer to develop.  The economic contribution of the research community is therefore to be measured by how much sooner those capabilities were acquired as a result of university research, and what the economic value was to society of having each capability X years sooner.
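In stylized terms, this first response can be given a simple formulation (a minimal sketch, not drawn from the original text; the symbols are purely illustrative).  Suppose a capability yields a stream of social benefits B(t) from the date it becomes available, that university research makes it available X years earlier than it otherwise would have been, and that society discounts future benefits at rate r.  Then the value of the acceleration is the difference between the two discounted benefit streams:

\[
\Delta V \;=\; \int_{0}^{\infty} B(t)\,e^{-rt}\,dt \;-\; e^{-rX}\int_{0}^{\infty} B(t)\,e^{-rt}\,dt \;=\; \bigl(1 - e^{-rX}\bigr)\,V_{0},
\]

where \(V_{0}\) is the present value of the benefit stream at its date of availability.  Even this simple formulation shows what the first response takes for granted: it treats the benefit stream B(t) itself as fixed, whereas the argument of this chapter is that university research also shaped what that stream contained.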

An alternative and less facile response would be that the presence of the university research capability shaped not only the rate of technological change but also its direction, and therefore its qualitative outcome as well.  I lean strongly toward this latter response.  I have already indicated some of the ways in which a powerful university research community has altered the shapes of these instruments and influenced the ways in which they were utilized.  America’s distinctive leadership in the experimental, as compared to the purely theoretical, sciences in the post-Second World War years was surely closely connected to the country’s outstanding instrumentation capabilities.  In addition, however, the presence of this community has meant that new instruments have not merely improved the effectiveness of existing research at basic and applied levels.  Rather, they have also been responsible for formulating new questions at the level of fundamental and applied research that would otherwise not have been posed or explored.  In the field of medicine, where it is frequently observed that diagnostic capabilities have outrun the possibilities of therapeutic intervention, it is almost certainly true that improved diagnostic capabilities have exercised a powerful influence upon the search for more effective therapies and have also posed further research questions of a fundamental nature. [17]

It is far from obvious how one should go about dealing with the counterfactual world that these observations imply.  The university context in which much scientific instrumentation originated also provided a high degree of resonance and amplification for these innovations.  Had they originated or experienced their development in a purely commercial context, it is doubtful that the environment would have provided the great stimulus to further research, and to the opening up of entirely new research fields, which actually occurred.  But, since so much new instrumentation arose precisely because university researchers were allowed to pursue fundamental questions that offered no apparent prospects of financial payoffs, it is difficult to take seriously a counterfactual that suggests that the same instrumentation would eventually have been developed in a purely commercial context.

17. “While the modern imaging modalities afforded by advances in physics have contributed significantly to diagnostic accuracy and to the monitoring of the condition and comfort of patients during the diagnostic phase in a cost-effective manner, there is a question concerning the effect of these advanced-technology diagnostic methods on outcome.  Diagnostic capabilities in the areas of cancers, cardiovascular disease, and metabolic diseases appear to have outstripped therapeutic capabilities.  However, the same sophisticated new diagnostic tools afford the means to follow and evaluate therapeutic modalities.  Thus, the rapid advances in noninvasive diagnostic methods of the past decade are showing signs of bringing advances in therapy in the next.”  Scientific Interfaces, Physics Through the 1990s, pp. 254-255.  For further discussion of the subtleties of the interactions between diagnostic capability and therapeutic intervention, as well as the more general question of the nature of the interactions between basic and applied research, see J.H. Comroe, Jr. and R.D. Dripps, Scientific Basis for the Support of Biomedical Science, Science (9 April 1976).

Thus, the deeper counterfactual is not how much later these same instruments would have emerged had they been developed entirely by private industry.  The deeper counterfactual is how the university origin influenced the features that were given prominence and those that were suppressed.  Ultimately, one has to ask the question whether certain instrumentation would have been developed at all.

But there are two final and different counterfactuals that one might pose as well: how much would the basic research thrust of the university science community have been impoverished if it had been deprived, not just of the scientific instruments that have been referred to in this chapter, but of the stimulus to further research that was provided by the attempt to improve the performance of these instruments, once they appeared in their earliest, primitive forms?  And finally, in view of the various impacts, upon the larger economy, of instrumentation that originated in the university context, what has been the social rate of return to society’s investment in such instrumentation?  Although there have been readily identifiable forces that have powerfully influenced the demand for scientific instruments - for example, the requirements of the military and the needs of the health-care system - another highly influential component of demand has been the requirements of scientific research, as conducted within the university community.  Moreover, it is also suggested that this scientific research community undertook radical innovative initiatives that led, in many cases, to the eventual supplying of its own internal demand and, in the process, provided large external benefits as well.
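The last of these questions can at least be stated in conventional terms (again a minimal, purely illustrative sketch, with notation that is not the chapter's own).  If C(t) denotes society’s outlays on instrumentation-originating research and B(t) the resulting stream of benefits, internal and external alike, the social rate of return is the discount rate \(\rho\) that equates the two:

\[
\int_{0}^{\infty} \bigl[\,B(t) - C(t)\,\bigr]\,e^{-\rho t}\,dt \;=\; 0 .
\]

The difficulty, on the argument developed here, is that any estimate of B(t) that counted only the direct knowledge outputs of basic research, and omitted the instrumentation by-products and their subsequent industrial careers, would understate the true return.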


