Speech by Richard C. Levin. THE AMERICAN UNIVERSITY AS AN ENGINE OF ECONOMIC GROWTH
I want to thank Ronnie Chan for inviting me to speak to you today. I am honored to be here and pleased to contribute in whatever way I can to strengthening the many ties between Hong Kong and the educational institutions of the United States. Yale is especially proud of its longstanding, historic involvement in China, as well as its many, more recent connections to Hong Kong’s universities.
The Asian financial crisis is only now beginning to affect the United States. In August, our stock exchanges experienced the largest monthly decline since October 1987. And last week, as the Federal Reserve lowered its discount rate, Chairman Alan Greenspan admitted: “it is just not credible that the United States can remain an oasis of prosperity unaffected by a world that is experiencing greatly increased stress.” During the first half of this year, America’s export of goods and services fell at an annual rate of 5% in inflation-adjusted terms, the nation’s worst performance since 1982. The prospect of declining economic growth or recession is increasingly real. J.P. Morgan predicts a 1999 growth rate of just 1.5%, while Goldman Sachs estimates a 40% chance of recession.
Despite today’s fears of global meltdown, we must not be short-sighted and fail to encourage the forces that create and sustain economic growth in the long term, both in the United States and around the world. Here in Asia, this means reforming banking practices and corporate governance to prevent a recurrence of last year’s financial crisis. In the United States, it means, at least in part, understanding what has fueled the success of the American economy in the post-World War II era, and, more specifically, what has laid the foundation for sixteen years of uninterrupted economic growth. This brings me to the topic I’d like to discuss with you today – one of the fundamental sources of American competitive success and global economic growth, an engine of technological advance and innovation, the training ground of tomorrow’s leaders – the American university.
Today many in America question the value of our universities’ contribution to the broader society. They complain, for example, that the cost of a college education is too high, that professors spend too little time teaching, that colleges fail to prepare graduates to do anything useful, and that universities should be doing more for the communities that surround them. Though there are some grounds for concern, what is frequently overlooked is the fundamental contribution that American universities make to the robustness of the national, indeed the global, economy. Our universities are an essential source of America’s competitiveness and, ultimately, a wellspring of worldwide growth and prosperity.
My colleague, Gerhard Casper, the President of Stanford University, recently claimed that the two central characteristics of the American research university are the search for knowledge and the spirit of critical inquiry.1 It is no surprise that we agree on this point: we both hold graduate degrees from Yale. But more fundamentally, these qualities – the search for knowledge and the spirit of critical inquiry – motivate the university in its two principal activities, research and teaching. Both are essential components of the university’s unique and valuable contributions to economic growth.
Thus, I would like to develop a two-fold argument. First, I want to illustrate that the way research is funded and organized in the United States makes our universities the principal worldwide source of new scientific discovery, and hence, ultimately, the source of technological advance and economic innovation. Second, I will claim that the spirit of critical inquiry and the pedagogical methods that prevail in leading American universities and colleges are also powerful engines of creative leadership, in industry and commerce as well as in science and technology.
II. The Contribution of University Research to Economic Growth
A dozen years ago, when U.S. trade deficits had first reached the level of $100 billion annually and many were questioning the long-term competitive viability of the nation’s industries, I offered a seminar for Yale College seniors entitled “The International Competitiveness of U.S. Manufacturing.” I asked each student to choose a particular industry and make a report to the class on all the available indicators of the competitive status of U.S. firms in world markets: sales, employment, productivity growth, market share, exports, imports, and patents obtained, among others. The students were required to collect data for the United States, Germany, and Japan over a time span extending from 1960 to the mid-1980s.
The results were very revealing. The data the students collected indicated that the alleged decline in U.S. global competitiveness was largely concentrated in a handful of industrial sectors. In essence, we had suffered an enormous absolute and comparative decline in the performance of two industries that were the nation’s largest employers in the 1960s, automobiles and steel. But in most other sectors of manufacturing, we were holding our own, and in those sectors with technologies most closely linked to recent advances in scientific knowledge – pharmaceuticals, specialty chemicals, and segments of the electronics industry – America led the world.
Competitive advantage based on the innovative application of new scientific knowledge – this has been the key to American economic success for at least the past quarter century. And the pattern is no different today. America remains the world’s leader in the industries where science-based technologies are changing rapidly – software, communications equipment, and biotechnology. As technologies mature, labor cost, quality control, and other factors become more important in determining competitive success, and the United States tends to lose its comparative advantage. The dynamic sectors of the American economy – where new jobs are created and productivity growth is most rapid – remain those that create innovative new products based on the application of recent scientific knowledge.
As the nation’s principal locus of basic scientific research, our universities play a key role in this pattern of economic competitiveness and growth. Basic research, by definition, is motivated purely by curiosity and the quest for knowledge, without a clear, practical, commercial objective. Yet basic research is the source from which all commercially-oriented applied research and development ultimately flows. I say ultimately because it often takes decades before the commercial implications of an important scientific discovery are fully realized. The commercial potential of a particular discovery is often unanticipated, and often extends to many, economically-unrelated industries and applications. In other words, the development of innovative, commercial products that occurs today depends on advances in basic research achieved ten, twenty, or fifty years ago – most often without any idea of the eventual consequences.
In 1997, U.S. academic institutions spent an estimated $23.8 billion on research and development, 12% of all R&D expenditures nationally. Two-thirds of that $23.8 billion was spent on basic research, while 25% went to applied research and only 8% to development. Over the past three decades, academic research has accounted for approximately half of the total basic research undertaken in the United States. To the extent that universities engage in applied research, their biggest contribution has been in long-term applied research, particularly in the health sciences, where the line between basic and applied research has become increasingly blurry.
The universities’ role as America’s primary basic research machine did not come about by accident. A half-century ago, in the aftermath of World War II and as the Cold War was beginning, the U.S. government clearly and self-consciously established an unprecedented and heavily subsidized system of support for scientific research, in the process transforming the nature and scope of the American university. First articulated by Vannevar Bush, President Harry Truman’s science advisor, in a deservedly famous 1945 report entitled Science: The Endless Frontier, this system has three central features, all of which remain largely intact today. First, the federal government shoulders the principal responsibility for the financial support of basic scientific research. Second, universities – rather than government laboratories, non-teaching research institutes, or private industry – are the primary institutions in which this government-funded research is undertaken. And third, although the Federal budgetary process determines the total amount available to support research in the various fields of science, most funds are allocated, not according to commercial or political considerations, but through an intensely competitive process of review conducted by independent scientific experts who judge the quality of proposals according to their scientific merit alone.
At the height of the Cold War, especially in the 1950s and 60s, the government also played a large role in funding much of the nation’s applied research and development, which was then directed to national defense and space exploration. Since investment for these purposes has declined in real terms and now represents a dramatically reduced percentage of national economic activity, the private sector has come to be the primary funding source for applied R&D. Private industry is also the sector in which the predominant share of the nation’s applied research and development is performed, as it was throughout the era of heavy investment in national defense and space exploration.
This system of organizing science has been, on its own terms and from an international comparative perspective, an extraordinary success. There is little doubt that the United States is the world’s leader in basic research. Over the past two decades, the U.S. has been the source of about 35% of all scientific publications worldwide. Since 1975, more than 60% of the world’s Nobel prizes have been awarded to Americans or to foreign nationals working in American universities. It is also clear that publicly funded basic science has been critical to scientific and technological innovation. A recent study prepared for the National Science Foundation found that 73% of the main science papers cited in industrial patents granted in the U.S. were based on research financed by government or nonprofit agencies and carried out in large part in university laboratories.
It is unlikely that this success could be duplicated by industry. The private sector has little incentive to invest in basic research because the returns from the creation of new generic knowledge are difficult to appropriate for private benefit. In contrast, it is much easier to reap the returns from investment in applied research directed toward a specific commercial end, especially if the legal framework governing intellectual property provides effective protection against the imitation of one’s products by rivals.
Moreover, the time lags between the initiation of basic (or even long-term applied) research and commercial application are long, far longer than an impatient private sector could tolerate. Scientists cannot schedule fundamental breakthroughs, and the eventual applications that arise from them may be surprises, both in form and in timing. Ordinarily, the ultimate commercial applications are entirely unforeseen when the initial, enabling discoveries are made in university laboratories. It has been forty-five years since Watson and Crick discovered the double helix, and the enormous practical benefits of this discovery are only now beginning to be realized through new medical treatments and a whole new technology for developing pharmaceuticals. Today, scientists at Yale, Harvard, Stanford, and elsewhere – building on curiosity-driven research conducted a decade or two ago – are getting closer and closer to finding cures for cancer, which may still be a decade or two away. Universities, in their unending, unadulterated search to know, are uniquely situated to undertake such long-term research without worrying about its commercial application and payoff – a luxury that profit-seeking private industrial firms cannot afford.
Examples of how university-based research has yielded enormous and unanticipated benefits are abundant. Take the laser. In the 1950s, Professor William Bennett began working on the phenomenon of coherent light. After he came to Yale in 1961, he continued his work on lasers with the support of grants from the U.S. Department of Defense. For many years, the laser was what Professor Bennett calls “a solution looking for a problem.” Today there are so many uses for lasers that it would be impossible to describe them all in the time that remains. Lasers are used to cut cloth, to lay out the foundations of a house, to make microchips, to pinpoint and treat brain tumors without surgery. In fact, when Professor Bennett suffered from a detached retina three years ago, the treatment he received was accomplished using precisely the same argon-ion laser that he had developed at Yale in 1964.
Although the eventual consequences of research on lasers were unforeseen, in some cases, the trajectory of a research program may be predictable, but the time lags simply too long to warrant private sector investment in the early stages. The development of a vaccine for Lyme Disease is such an example. Twenty-three years ago, researchers at Yale’s School of Medicine first identified Lyme Disease after mothers in nearby Lyme, Connecticut, insisted that an infectious agent was responsible for the arthritic swelling of their children’s joints. Ever since, Yale has been the world center for Lyme Disease research. By first discovering the mechanism by which the disease infects human cells, Yale researchers eventually, after thirteen years of study, discovered a potential vaccine to prevent the debilitating tick-borne disease. This summer, after ten more years of research and testing, a Food and Drug Administration advisory panel approved the vaccine, called LYMErix, which will soon be marketed by SmithKline Beecham.
Today, in our research universities, there are hundreds if not thousands of currently active long-term projects with great economic potential. Let me cite just a few examples. Syracuse University researchers have spent twenty years working on a “smart” salt-marsh protein (bacteriorhodopsin) with photo-active properties that can be used in optical three-dimensional computer memories. Based on this research, the scientists envision a day when a small rectangular solid with a volume of three cubic centimeters will be able to store up to eight gigabytes of memory. A research team at the University of Chicago Medical Center has identified a small region on the surface of nerve cells that may be essential for the actions of inhaled anesthetics, opening a door to new pain medications. The same site may be responsible for some of the depressive effects of alcohol and could provide crucial insights into the genetics of alcohol addiction. Finally, Cornell University engineers have demonstrated a “universal substrate” on which a crystal of any material can be grown, a technique that could revolutionize the design and production of microelectronic devices. Practical advances based on any of the examples I have cited may be many years away, and, indeed, each one of these might fail. But others will undoubtedly succeed, and it is a virtual certainty that the gains produced by the entire portfolio will more than justify past and present public investment in university-based R&D.
Despite the bang-for-the-buck obtained by academic research, federal funding for most fields of academic research has decreased in real terms, although funding for the health sciences is a notable exception. I am concerned about this trend, but I am hopeful that recent efforts in Congress to increase federal investment in basic research will be successful. I am also concerned with intermittent pressures, from Congress and sometimes from the executive branch, to reallocate federal research funding toward “downstream” applied research projects, especially toward collaborative efforts between universities and the private sector. Such programs are often ill-conceived, for two reasons: First, we have a strong track record of success with federal funding of long-term basic research based on peer review, but we have been almost entirely unsuccessful in selecting non-military applied R&D projects for public support. Second, universities and the private sector have already learned how to collaborate successfully on applied R&D. An increasing share of research performed in universities is privately funded, and government intervention to select projects relevant to “national needs” is more likely to complicate than improve the quality of our industrial partnerships.
Aside from scientific discoveries and advances, a final, and equally significant, benefit of locating most fundamental scientific research in universities rather than in government laboratories or private research institutes is that the next generation of scientists receives its education and training from the nation’s best scientists, who are required to teach as they pursue their own research. Of course, some of these well-trained graduate students become professors after they complete their degrees and post-doctoral study, thus ensuring that the academic research engine is continually replenished with new, skilled scientists. But the many who enter industrial employment after graduation take with them invaluable assets – state-of-the-art knowledge obtained by working at the frontiers of science and experience with the most advanced research tools and equipment. They also take with them a particular way of thinking, a topic to which I turn next.
III. The Contribution of Liberal Education to Economic Growth
The knowledge created by the enterprise of academic science is by no means the only contribution of American universities to economic growth. By engaging students in intellectual inquiry, making them active participants in the search to know, and fostering their problem-solving abilities, universities and colleges contribute to economic growth through their teaching as well as their research. And it is not only the education of industrial scientists and engineers that has an impact on economic performance, it is the education of all those engaged in the business sector – executives, entrepreneurs, financiers, and consultants alike.
The world we live in is fast-paced and constantly changing. Many successful companies produce products or services based on technology or marketing strategies that didn’t exist a decade or two ago. In such a world, knowledge of a given body of information is not enough to survive, much less thrive; business leaders must have the ability to think critically and creatively, and to draw upon and adapt ideas to new environments.
The methods of undergraduate, as well as much professional, education used by America’s most selective and distinguished universities and liberal arts colleges are particularly well suited to prepare students for a changing world. Unlike British universities, which require students to specialize early, America’s finest research universities and liberal arts colleges are committed to the “liberal education” of undergraduates. In The Idea of a University, Cardinal Newman defined liberal education as an end in itself, independent of practical consequences, directed to no specific purpose other than the free exercise of the mind.2 Liberal education cultivates the intellect and expands the capacity to reason and to empathize. Its object is not to convey any particular content, but to develop certain qualities of mind: the ability to think critically and independently, to be creative and innovative, to liberate oneself from prejudice and superstition, to sift through information, to extract what is useful and, to use Newman’s words, “to discard what is irrelevant.” Just as the largest social benefits derive from scientific research that is undertaken without any focus on a commercially salient objective, so, I would argue, the largest social benefits derive from a pedagogy that seeks to enlarge the power of students to reason and think creatively without focus on mastering a particular body of knowledge.
What does this mean in practical terms? It means that, at America’s best universities and colleges, education is not a one-way street. Information is no longer simply conveyed from faculty to students and regurgitated back on examinations. Consider the following description of Woodrow Wilson’s teaching, from the days when he was a professor at Princeton University. It provides a good example of a style of pedagogy that is now shunned at the best American institutions.
Professor Wilson habitually stood during his lectures. Speaking from a mere skeleton of notes, he hammered in his teachings with an up-and-down, full-armed gesture. Thus he was a perpendicular lecturer, his talking nose and his oscillating Adam’s Apple moving up and down with speech, along with his pump-handle gestures. He gestured as if operating the handle of a spray pump. He was there to spray students with a shower of knowledge, his superior mind acting downward upon the mass – a Scotch Covenanter bent upon describing how man acts politically, hammering information into reluctant minds.3
Even as recently as the 1930s and 40s, in many college classes, professors spewed forth information in lectures, students copiously took notes, memorized them, and then “recited” them back to the professor when called upon in class. Today, students cannot rely on a good memory to succeed in college. Lectures, although still used in many courses, are no longer the predominant method of pedagogy, and students are no longer encouraged to recite back what they hear in class or read in a textbook. Instead, students are encouraged to think for themselves – to offer their own opinions and interpretations in participatory seminars, writing assignments, and examinations.
The participatory seminar is now a fundamental part of most undergraduate and graduate programs at America’s top universities and liberal arts colleges. These seminars are small, usually consisting of a dozen or so students, and they emphasize discussion, facilitated by the professor, over lecture. The purpose of these seminars is to challenge students to articulate their views and defend them in the face of classmates and the professor, who may disagree. The format forces them to reason through issues and to think critically for themselves, not just repeat what a professor has told them or what they have read. Often, these seminars are accompanied by in-depth research and writing assignments, in which students are required to engage in independent study and write a paper articulating and defending their own conclusions.
Even most lecture classes for undergraduates have some form of discussion section attached to them, to give students the opportunity to discuss for themselves the materials being presented in lecture. Like the participatory seminar, these discussion sections consist of relatively small numbers of students and, especially in the humanities and social sciences, emphasize exchanging views and developing analytical skills, not memorization and recitation.
Certain graduate programs have adopted teaching methods suited to their field of study that emphasize critical thinking over the simple transmission of information. U.S. law schools are renowned for their use of the Socratic method, where professors conduct entire classes by posing increasingly intricate and subtle questions to students. Another example, undoubtedly well known to this audience, is the Harvard Business School’s use of the “case method.” In this pedagogical approach, students read “cases” describing real-life business problems in their surrounding factual context. Instead of lecturing, professors use a case as the basis for class discussion, probing the students to explain how and why they would react to the situation described in the reading.
Today, professors also encourage critical thinking by the form of writing assignments they require and by the kind of examination questions they ask. In the mid-1980s, while he was the President of Harvard, Derek Bok studied the examinations given there in various subjects since 1900. He found that, at the beginning of the century, nearly all exam questions “merely sought to have students repeat particular facts, describe the opinions of others, or relate fixed sequences of events…. The emphasis was chiefly on memory, and students were generally spared the task of unraveling complex problems, let alone exploring questions that had no determinate answers.”4 As the century progressed, the nature of the exams changed in a way that increasingly “emphasized analysis rather than memory or description.” By 1960, according to Bok, “half of the questions in the humanities and social sciences called upon students to discuss complex problems from more than one perspective. In contrast with earlier exams, moreover, far fewer questions presupposed a single set of correct answers.”5

As Bok’s survey shows, students are expected to take from their courses not just facts, figures, and widely accepted theories, but a way of thinking – the ability to use facts and figures to support an argument and to confront one theory with another through critical analysis. The distinctive emphasis on critical thinking produces graduates who are intellectually flexible and open to new ideas, graduates equipped with curiosity and the capacity to adapt to ever-changing work environments, graduates who can convert recently discovered knowledge into innovative new products and services. By producing thinking and engaged leaders capable of thriving in the new age of information technology, American higher education prepares the nation for challenges that we cannot even imagine today, challenges upon which continued growth and prosperity depend.
I hope that I have persuaded you that the organization of scientific research and the pedagogical strategies used in our finest universities and colleges contribute mightily to America’s technological leadership and ultimately to its economic growth. There is doubtless some irony in all of this. Our best universities and liberal arts colleges take tremendous pride in the fact that their pursuit of knowledge is free from political or economic interference. For the most part, universities conduct scientific research without a concern for potential commercial application, and liberal education seeks not to train business and professional men and women, but to produce inquisitive, thinking, creative citizens. Still, the research and teaching done in American universities have a profound and hugely positive effect on practical affairs.
This recognition should not permit us to become complacent. To the contrary, we must keep extending the frontiers of science and improving the efficacy of our pedagogy. We do not take these responsibilities lightly. We know that in no small measure the fate of our students, the nation, and the global economy depends on us.
1 Gerhard Casper, “The Advantage of the Research-Intensive University: The University of the 21st Century,” Remarks at the Peking University Centennial, Beijing, People’s Republic of China, May 3, 1998.
2 John Henry Cardinal Newman, The Idea of a University, ed. Martin J. Svaglic (Notre Dame: University of Notre Dame Press, 1960), Discourse V.
3 Alfred Pearce Dennis, “Princeton Schoolmaster,” in Houston Peterson, ed., Great Teachers, Portrayed by Those Who Studied under Them (New Brunswick, NJ: Rutgers Univ. Press, 1946), 134.
4 Derek Bok, Higher Learning (Cambridge, Mass.: Harvard Univ. Press, 1986), 48-49.
5 Id. at 49.