How did the government transition from funding science for the war effort to this long-term commitment to university research?
McCray: The year before President Roosevelt died in 1945, he tasked his science adviser, a man named Vannevar Bush, with looking to the future. Bush, who had been at MIT before taking over the management of America’s vast wartime science infrastructure, eventually oversaw the production of the famous report called “Science, the Endless Frontier.” It laid out a blueprint for what would become U.S. science policy in the years and decades following the Second World War.
Did Bush and his successors articulate any specific goals for these policies?
Carson: You might think that the federal government is most interested in applied research that immediately leads to new weapons or new products. But federal leaders realized that they were actually not just investing in the products of research. They were investing in the people.
McCray: They recognized that we needed a cadre of trained scientists and engineers, and that we needed to keep them fed and paid until the next conflict eventually broke out. Scientists were seen as a resource to be stockpiled, like steel or oil, something the nation could turn to in a time of emergency.
By the 1960s, the federal government was spending about 2% of U.S. GDP on research and development. How have elected officials made the case for these investments to U.S. taxpayers?
McCray: Bush would say, “We need to water the tree of basic research.” The idea was that the tree would grow nice little fruits we could come along and pluck, and those would benefit our health, economy and security.
Those three things — health, economy and national security — were part of the social contract that emerged between scientists and the federal government after the Second World War. The idea was that in some way, the research that the government is funding would contribute to the larger benefit of the nation.
What are some examples of those fruits of basic research?
McCray: I tell my students about Tom Brock, a microbial ecologist in the 1960s who was really interested in the microbes in the hot springs at Yellowstone National Park. The heat-tolerant bacteria he discovered became a key component of a biological technique developed in the 1980s called the polymerase chain reaction, which allows you to amplify sequences of DNA. PCR was a huge step in the creation of the whole biotech industry, and it was ultimately a critical tool used in 2020 to develop a vaccine for COVID. You can’t predict that path, and the time frame for these government investments paying off is often measured in decades. But Vannevar Bush would have argued that that’s exactly why the federal government should be the one investing in basic science: because industry was never going to think or work that way.
Carson: Silicon Valley was built on microelectronics and aerospace, both funded by the Defense Department. Electronics weren’t initially for consumers. They were for ballistic missiles, jet aircraft, the next generation of radar. All this effort went into building electronics that served the military and then got turned over to the consumer market in the 1970s and '80s.
Presumably the U.S. wasn’t the only nation that recognized the value of investing in science after World War II?
Carson: No, and in fact, other global powers, including the nations defeated in World War II, started to catch up. Power brokers in Washington in 1948 could never have imagined that the Soviets would get an atomic bomb by 1949. Germany and Japan both made strides in advanced manufacturing in the 1950s. In the 1960s, we thought we had a semi-permanent lead in semiconductors, but by the 1970s, Japan had emerged as a leader in microelectronics.
So that’s how the main concern of government-funded science expanded by the 1970s and 1980s, from maintaining national defense to maintaining U.S. global economic leadership. It became clear that any lead the U.S. might hold — in defense, in electronics, in biotech — had to be constantly defended.
For everyday Americans, why does it matter which nation’s scientists invent the technology or cure the disease, as long as someone somewhere is solving these problems?
Carson: There are two ways to think about that, and they both have to do with maintaining U.S. economic preeminence. One is the “first mover” advantage: Sure, a company from another country could go and commercialize a technology they didn’t originally develop, but they’d be doing that sometime after the originator, so the originator has the chance to build up a lead.
Also, so much of scientific research isn’t just about initial discoveries; it’s about making a discovery better, more marketable or more effective. And so having a system of innovation that can play at all stages, from invention through commercialization of the final product, helps keep domestic companies in the lead over global competitors.