In American innovation through the ages, Jamie Powell wrote:

who hasn’t finished a non-fiction book and thought “Gee, that could have been half the length and just as informative. If that.” Yet every now and then you read something that provokes the exact opposite feeling. Where all you can do after reading a tweet, or an article, is type the subject into Google and hope there’s more material out there waiting to be read. So it was with Alphaville this Tuesday afternoon reading a research paper from last year entitled The changing structure of American innovation: Some cautionary remarks for economic growth by Arora, Belenzon, Patacconi and Suh (h/t to KPMG’s Ben Southwood, who highlighted it on Twitter). The exhaustive work of the Duke University and UEA academics traces the roots of American academia through the golden age of corporate-driven research, which roughly encompasses the postwar period up to Ronald Reagan’s presidency, before its steady decline up to the present day.

Arora et al argue that a cause of the decline in productivity is that:

The past three decades have been marked by a growing division of labor between universities focusing on research and large corporations focusing on development. Knowledge produced by universities is not often in a form that can be readily digested and turned into new goods and services. Small firms and university technology transfer offices cannot fully substitute for corporate research, which had integrated multiple disciplines at the scale required to solve significant technical problems.

As someone with many friends who worked at the legendary corporate research labs of the past, including Bell Labs and Xerox PARC, and who myself worked at Sun Microsystems' research lab, this is personal. Below the fold I add my 2c-worth to Arora et al's extraordinarily interesting article.

The authors provide a must-read, detailed history of the rise and fall of corporate research labs. I lived through their golden age; a year before I was born the transistor was invented at Bell Labs:

The first working device to be built was a point-contact transistor invented in 1947 by American physicists John Bardeen and Walter Brattain while working under William Shockley at Bell Labs. They shared the 1956 Nobel Prize in Physics for their achievement.[2] The most widely used transistor is the MOSFET (metal–oxide–semiconductor field-effect transistor), also known as the MOS transistor, which was invented by Egyptian engineer Mohamed Atalla with Korean engineer Dawon Kahng at Bell Labs in 1959.[3][4][5] The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[6]


Before I was 50, Bell Labs had been euthanized as part of the general massacre of the labs:

Bell Labs had been separated from its parent company AT&T and placed under Lucent in 1996; Xerox PARC had also been spun off into a separate company in 2002. Others had been downsized: IBM under Louis Gerstner re-directed research toward more commercial applications in the mid-90s ... A more recent example is DuPont’s closing of its Central Research & Development Lab in 2016. Established in 1903, DuPont research rivaled that of top academic chemistry departments. In the 1960s, DuPont’s central R&D unit published more articles in the Journal of the American Chemical Society than MIT and Caltech combined. However, in the 1990s, DuPont’s attitude toward research changed and after a gradual decline in scientific publications, the company’s management closed its Central Research and Development Lab in 2016.

Arora et al point out that the rise and fall of the labs coincided with the rise and fall of anti-trust enforcement:

Historically, many large labs were set up partly because antitrust pressures constrained large firms’ ability to grow through mergers and acquisitions. In the 1930s, if a leading firm wanted to grow, it needed to develop new markets. With growth through mergers and acquisitions constrained by anti-trust pressures, and with little on offer from universities and independent inventors, it often had no choice but to invest in internal R&D. The more relaxed antitrust environment in the 1980s, however, changed this status quo. Growth through acquisitions became a more viable alternative to internal research, and hence the need to invest in internal research was reduced.

A lack of anti-trust enforcement, pervasive short-termism driven by Wall Street's focus on quarterly results, and management's focus on manipulating the stock price to maximize the value of their options together killed the labs:

Large corporate labs, however, are unlikely to regain the importance they once enjoyed. Research in corporations is difficult to manage profitably. Research projects have long horizons and few intermediate milestones that are meaningful to non-experts. As a result, research inside companies can only survive if insulated from the short-term performance requirements of business divisions. However, insulating research from business also has perils. Managers, haunted by the spectre of Xerox PARC and DuPont’s “Purity Hall”, fear creating research organizations disconnected from the main business of the company. Walking this tightrope has been extremely difficult. Greater product market competition, shorter technology life cycles, and more demanding investors have added to this challenge. Companies have increasingly concluded that they can do better by sourcing knowledge from outside, rather than betting on making game-changing discoveries in-house.

They describe the successor to the labs as:

a new division of innovative labor, with universities focusing on research, large firms focusing on development and commercialization, and spinoffs, startups, and university technology licensing offices responsible for connecting the two.

An unintended consequence of abandoning anti-trust enforcement was thus a slowing of productivity growth, because this new division of labor wasn't as effective as the labs:

The translation of scientific knowledge generated in universities to productivity enhancing technical progress has proved to be more difficult to accomplish in practice than expected. Spinoffs, startups, and university licensing offices have not fully filled the gap left by the decline of the corporate lab. Corporate research has a number of characteristics that make it very valuable for science-based innovation and growth. Large corporations have access to significant resources, can more easily integrate multiple knowledge streams, and direct their research toward solving specific practical problems, which makes it more likely for them to produce commercial applications. University research has tended to be curiosity-driven rather than mission-focused. It has favored insight rather than solutions to specific problems, and partly as a consequence, university research has required additional integration and transformation to become economically useful.

In Sections 5.1.1 through 5.1.4, Arora et al discuss in detail four reasons why the corporate labs drove faster productivity growth:

  1. Corporate labs work on general purpose technologies. Because the labs were hosted by the leading companies in their market, they believed that technologies that benefited their product space would benefit them the most. My experience of Open Source supports this: Sun was the leading player in the workstation market and was happy to publish and open source infrastructure technologies such as NFS that would buttress that position. On the desktop it was not a dominant player, which (sadly) led to NeWS being closed-source.

    Claude Shannon’s work on information theory, for instance, was supported by Bell Labs because AT&T stood to benefit the most from a more efficient communication network ... IBM supported milestones in nanoscience by developing the scanning electron microscope, and furthering investigations into electron localization, non-equilibrium superconductivity, and ballistic electron motions because it saw an opportunity to pre-empt the next revolutionary chip design in its industry ... Finally, a recent surge in corporate publications in Machine Learning suggests that larger firms such as Google and Facebook that possess complementary assets (user data) for commercialization publish more of their research and software packages to the academic community, as they stand to benefit most from advances in the sector in general.