Repost: We Need More ‘Useless’ Knowledge

by The Philosophical Fish

March 2, 2017
Illustration: Stuart Bradford for The Chronicle Review

On April 30, 1939, under the gathering storm clouds of war, the New York World’s Fair opened in Flushing Meadows, Queens. Its theme was “The World of Tomorrow.” Over the next 18 months, nearly 45 million visitors would be given a peek into a future shaped by newly emerging technologies. Some of the displayed innovations were truly visionary. The fair featured the first automatic dishwasher, air conditioner, and fax machine. The live broadcast of President Franklin Roosevelt’s opening speech introduced America to television. Newsreels showed Elektro the Moto-Man, a seven-foot-tall, awkwardly moving aluminum robot that could speak by playing 78-r.p.m. records, smoke a cigarette, and play with his robot dog Sparko. Other attractions, such as a pageant featuring magnificent steam-powered locomotives, could be better characterized as the last gasps of the world of yesterday.

Albert Einstein, honorary chair of the fair’s science advisory committee, presided over the official illumination ceremony, also broadcast live on television. He spoke to a huge crowd on the topic of cosmic rays, highly energetic subatomic particles bombarding the Earth from outer space. But two scientific discoveries that would soon dominate the world were absent at the fair: nuclear energy and electronic computers.

The very beginnings of both technologies, however, could be found at an institution that had been Einstein’s academic home since 1933: the Institute for Advanced Study in Princeton, N.J. The institute was the brainchild of its first director, Abraham Flexner. Intended to be a “paradise for scholars” with no students or administrative duties, it allowed its academic stars to fully concentrate on deep thoughts, as far removed as possible from everyday matters and practical applications. It was the embodiment of Flexner’s vision of the “unobstructed pursuit of useless knowledge,” which would only show its use over many decades, if at all.

However, the unforeseen usefulness came much faster than expected. By setting up his academic paradise, Flexner enabled the nuclear and digital revolutions. Among his first appointments was Einstein, who would follow his speech at the World’s Fair with his famous letter to President Roosevelt in August 1939, urging him to start the atomic-bomb project. Another early Flexner appointee was the Hungarian mathematician John von Neumann, perhaps an even greater genius than Einstein.

Von Neumann’s early reputation was based on his work in pure mathematics and the foundations of quantum theory. Together with the American logician Alonzo Church, he made Princeton a center for mathematical logic in the 1930s, attracting such luminaries as Kurt Gödel and Alan Turing. Von Neumann was fascinated by Turing’s abstract idea of a universal calculating machine that could mechanically prove mathematical theorems. When the nuclear bomb program required large-scale numeric modeling, von Neumann gathered a group of engineers at the institute to begin designing, building, and programming an electronic digital computer — the physical realization of Turing’s universal machine.

Von Neumann also directed his team to focus these new computational powers on many other problems aside from weapons. With the meteorologist Jule Charney, he made the first numerical weather prediction in 1950 — technically it was a “postdiction,” since at that time it took 48 hours to predict tomorrow’s weather. Anticipating our present climate-change reality, von Neumann would write about the study of Earth’s weather and climate: “All this will merge each nation’s affairs with those of every other, more thoroughly than the threat of a nuclear or any other war may already have done.”

A logical machine that can prove mathematical theorems or a highly technical paper on the structure of the atomic nucleus may seem to be useless endeavors. In fact, they played important roles in developing technologies that have revolutionized our way of life beyond recognition. These curiosity-driven inquiries into the foundations of matter and calculation led to the development of nuclear arms and digital computers, which in turn permanently upset the world order, both militarily and economically. Rather than attempting to demarcate the nebulous and artificial distinction between “useful” and “useless” knowledge, we may follow the example of the British chemist and Nobel laureate George Porter, who spoke instead of applied and “not-yet-applied” research.

Supporting applied and not-yet-applied research is not just smart but a social imperative. In order to enable and encourage the full cycle of scientific innovation, which feeds into society in numerous important ways, it is more productive to think of developing a solid portfolio of research in much the same way as we approach well-managed financial resources. Such a balanced portfolio would contain predictable and stable short-term investments, as well as long-term bets that are intrinsically more risky but can potentially earn off-the-scale rewards. A healthy and balanced ecosystem would support the full spectrum of scholarship, nourishing a complex web of interdependencies and feedback loops.

However, our current research climate, governed by imperfect “metrics” and policies, obstructs this prudent approach. Driven by an ever-deepening dearth of funding, against a background of economic uncertainty, global political turmoil, and ever-shortening time cycles, research criteria are becoming dangerously skewed toward conservative short-term goals that may address more immediate problems but miss out on the huge advances that human imagination can bring in the long term. Just as in Flexner’s time, the progress of our modern age, and of the world of tomorrow, depends not only on technical expertise, but also on unobstructed curiosity and the benefits — and pleasures — of traveling far upstream, against the current of practical considerations.

Who was Abraham Flexner, and how did he arrive at his firm beliefs in the power of unfettered scholarship? Born in 1866 in Louisville, Ky., Flexner was one of nine children of Jewish immigrants from Bohemia. In spite of sudden economic hardship — the Flexners lost their business in the panic of 1873 — and with the help of his older brother Jacob, Abraham was able to attend the Johns Hopkins University, arguably the first modern research university in the United States. Flexner’s exposure to the advanced opportunities at Hopkins, which were comparable to those at leading foreign universities, permanently colored his views. He remained a lifelong critic and reformer of teaching and research. After obtaining his bachelor’s degree in classics in just two years, he returned to Louisville, where he started a college preparatory school to implement his revolutionary ideas based on a deep confidence in the creative powers of the individual and an equally deep distrust of the ability of institutions to foster such talent.

Flexner first rose to public attention in 1908 with his book The American College: A Criticism, with a strong appeal for hands-on teaching in small classes. His main claim to fame was his 1910 bombshell report, commissioned by the Carnegie Foundation, on the state of 155 medical schools in North America, branding many of them as frauds and irresponsible profit machines that withheld from students any practical training. The report led to the closure of almost half of the medical schools and the wide reform of others, starting the age of modern biomedical teaching and research in the United States.

Flexner’s efforts and vision led to his joining the General Education Board of the Rockefeller Foundation in 1912, lending him added stature and resources as an influential force in higher education and philanthropy. He soon became its executive secretary, a position he held until his retirement in 1927. It was in this capacity that he formed the ideas underlying his essay “The Usefulness of Useless Knowledge.” It would eventually be published in Harper’s magazine in October 1939, but it began as a 1921 internal memo prepared for the board.

Flexner was given the opportunity to put his lofty vision into practice when he was approached in 1929 by representatives of Louis Bamberger and his sister Caroline Bamberger Fuld. The Bambergers had sold their massive, eponymous Newark department store to Macy’s a few weeks before the Wall Street crash, leaving them with a large fortune. Their original intent was to found a medical institution without racial, religious, or ethnic biases, but Flexner persuaded the benefactors to set up an institute exclusively dedicated to unrestricted scholarship. In 1930, he became the founding director of the Institute for Advanced Study in Princeton. When Flexner died in 1959 at age 92, his obituary appeared on the front page of The New York Times along with an editorial concluding, “No other American of his time has contributed more to the welfare of this country and of humanity in general.”

It was Flexner’s lifelong conviction that human curiosity, with the help of serendipity, was the only force strong enough to break through the mental walls that block truly transformative ideas and technologies. He believed that only with the benefit of hindsight could the long arcs of knowledge be discerned, often starting with unfettered inquiry and ending in practical applications.

In his essay, Flexner articulates well the effect of the groundbreaking investigations into the nature of electromagnetism by Michael Faraday and James Clerk Maxwell. Recall that the year 1939 saw the introduction of FM radio and television to the United States. On the wall of Einstein’s home office hung small portraits of these two British physicists. It is hard to think of any human endeavor today that doesn’t make use of electricity or wireless communication. Over a century and a half, almost all aspects of our lives have literally been electrified.

In the same way, in the early 20th century the study of the atom and the development of quantum mechanics were seen as a theoretical playground for a handful of often remarkably young physicists. The birth of quantum theory was long and painful. The German physicist Max Planck described his revolutionary thesis, first proposed in 1900, that energy could only occur in packets or “quanta” as “an act of desperation.” In his words, “I was willing to make any offer to the principles in physics that I then held.” His gambit played out very well. Without quantum theory, we wouldn’t understand the nature of any material, including its color, texture, and chemical and nuclear properties. These days, in a world totally dependent on microprocessors, lasers, and nanotechnology, it has been estimated that 30 percent of the U.S. gross national product is based on inventions made possible by quantum mechanics. With the booming high-tech industry and the expected advent of quantum computers, this percentage will only grow. Within a hundred years, an esoteric theory of young physicists became a mainstay of the modern economy.

It took nearly as long for Einstein’s own theory of relativity, first published in 1905, to be used in everyday life in an entirely unexpected way. The accuracy of the global positioning system, the space-based navigation system that provides location and time information in today’s mobile society, depends on reading time signals of orbiting satellites. The presence of Earth’s gravitational field and the movement of these satellites cause clocks to speed up and slow down, shifting them by 38 microseconds a day. In one day, without Einstein’s theory, our GPS tracking devices would be inaccurate by about seven miles.
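
A rough check of that last figure, assuming only the 38-microsecond daily drift quoted above: an uncorrected clock offset translates into a ranging error of roughly the speed of light times the time error,

\[
\Delta x \approx c\,\Delta t \approx \left(3.0\times10^{8}\ \text{m/s}\right)\times\left(38\times10^{-6}\ \text{s}\right) \approx 1.1\times10^{4}\ \text{m} \approx 7\ \text{miles},
\]

accumulating each day that Einstein’s corrections are left out.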

The path from exploratory blue-sky research to practical applications is not one-directional and linear, but rather complex and cyclic, with resultant technologies enabling even more fundamental discoveries. Take, for example, superconductivity, the phenomenon discovered by the Dutch physicist Heike Kamerlingh Onnes in 1911. Certain materials, when cooled down to ultralow temperatures, turn out to conduct electricity without any resistance, allowing large electric currents to flow at no energy cost. The powerful magnets that can be so constructed have led to many innovative applications, from the maglev transport technology that allows trains to travel at very high speeds as they levitate on magnetic fields to the fMRI technology used to make detailed brain scans for diagnosis and treatment.

Through these breakthrough technologies, superconductivity has in turn pushed the frontiers of basic research in many directions. High-precision scanning has made possible the flourishing field of present-day neuroscience, which is probing the deepest questions about human cognition and consciousness. Superconductivity is playing a crucial role in the development of quantum computers and the next revolution in our computational powers, with unimaginable consequences. And in fundamental physics, it has produced the largest and strongest magnets on the planet, buried a hundred meters underground in the 17-mile-long ring of the Large Hadron Collider, the particle accelerator at the CERN laboratory near Geneva. The resulting 2012 discovery of the Higgs boson was the capstone that completed the Standard Model of particle physics, enabling physicists to further probe and unravel the mysteries of the universe. Note that the deep understanding of the Higgs particle is itself based on the theory of superconductivity. There is therefore an evident route from the discovery of superconductivity to the discovery of the Higgs particle one century later. But it is hardly a straight one, going instead through many loops.

It is in the life sciences that we find perhaps the most powerful practical implications of fundamental discoveries. One of the least-known success stories in human history is how, over the past two-and-a-half centuries, advances in medicine and hygiene have tripled life expectancy in the West. The discovery of the double helical structure of DNA in 1953 jump-started the age of molecular biology, unraveling the genetic code and the complexity of life. The advent of recombinant DNA technology in the 1970s and the completion of the Human Genome Project in 2003 revolutionized pharmaceutical research and created the modern biotech industry. Currently, the CRISPR-Cas9 technology for gene editing allows scientists to rewrite the genetic code, with unbounded potential for preventing and treating diseases and improving agriculture and food security. We should never forget that these groundbreaking discoveries, with their immense consequences for health and disease, were products of addressing deep basic questions about living systems, without any thought of immediate applications.

The absolute necessity of basic research has only become more obvious since Flexner’s time. As Flexner argues so elegantly, basic research clearly advances knowledge in and of itself. Fundamental inquiry produces ideas that slowly and steadily turn into concrete applications and further studies. As is often said, knowledge is the only resource that increases when used.

Pathbreaking research also leads to new tools and techniques, often in unpredictable and indirect ways. A remarkable late-20th-century example of such a fortuitous outgrowth was the development of automatic information-sharing software, introduced as the World Wide Web in 1989. What began as a collaboration tool for thousands of physicists working at the CERN particle accelerator laboratory entered the public domain in 1993, bringing the power of the internet to the masses and facilitating large-scale communication around the globe. To store and process the vast amount of data produced in the same particle experiments, so-called grid and cloud computing were developed, linking computers in huge virtual networks. These cloud technologies now drive many internet business applications, from services and shopping to entertainment and social media.

Curiosity-driven research attracts the world’s best minds. Young scientists and scholars, drawn to the intellectual challenges of fundamental questions, are trained in completely new ways of thinking and using technology. Once these skills carry over to society, they can have transformative effects. For example, scientists who have learned to capture complex natural phenomena in elegant mathematical equations apply these techniques to other branches of society and industry, such as in the quantitative analysis of financial and social data.

Much of the knowledge developed through basic research is made publicly accessible and so benefits society as a whole, spreading widely beyond the narrow circle of individuals who, over years and decades, introduce and develop the ideas. Fundamental advances in knowledge cannot be owned or restricted by people, institutions, or nations, certainly not in the internet age. They are truly public goods.

Finally, pathbreaking research results in start-up companies. The new industrial players of the past decades show how powerful technologies are in generating commercial activity. It is estimated that more than half of all economic growth comes from innovation. Leading information technology and biotech industries can trace their success directly to the fruits of fundamental research grown in the fertile environments around research universities, such as those in Silicon Valley and the Boston area, often infused by generous public investments. MIT estimates that it has given rise to more than 30,000 companies with roughly 4.6 million employees, including giants such as Texas Instruments, McDonnell Douglas, and Genentech. The two founders of Google worked as graduate students at Stanford University on a project supported by the Digital Libraries Initiative of the National Science Foundation.

The postwar decades saw an unprecedented worldwide growth of science, including the creation of funding councils like the National Science Foundation and massive investments in research infrastructure. Recent decades have seen a marked retrenchment. One can argue that the state of scholarship has now reached a critical stage that in many ways mirrors the crisis that Flexner discussed. Steadily declining public funding is currently insufficient to keep up with the expanding role of the scientific enterprise in a modern knowledge-based society. The U.S. federal research and development budget, measured as a fraction of the gross domestic product, has steadily declined, from a high of 2.1 percent in 1964, at the height of the Cold War and the space race, to less than 0.8 percent today. (Note that roughly half of that budget has remained defense oriented.) The budget for the National Institutes of Health, the largest supporter of medical research in the United States, has fallen by 25 percent in inflation-adjusted dollars over the past decade.

On top of this, industry, driven by short-term shareholder pressure, has been steadily decreasing its research activities, transferring that responsibility largely to public funding and private philanthropy. A committee of the U.S. Congress found that in 2012 business provided only 6 percent of basic research funding, with the lion’s share — 53 percent — shouldered by the federal government and the remainder coming from universities and foundations.

Success rates in grant applications for basic research are plummeting across all disciplines, particularly for early-career researchers. Life scientists can now expect their first National Institutes of Health grants only in their mid-40s. Apart from discouraging the next generation of talented scholars, this lack of opportunities has led to a much more outcome-driven approach to funding, with granting institutions less willing to place risky long bets.

Nobody is in a better position than working scientists and scholars to convey to the public the value of basic research. We should be encouraged by the public fascination with the big questions that science raises, however far removed from everyday concerns. How did the universe begin and how does it end? What is the origin of life on Earth and possibly elsewhere in the cosmos? What in our brain makes us conscious and human? What will the world of tomorrow bring?

A broad-ranging dialogue between science and society is not only necessary for laying the foundation for future financial support. It is crucial for attracting young minds to join the research effort. Well-informed, science-literate citizens are better able to make responsible choices when confronted with difficult problems like climate change, nuclear power, vaccinations, and genetically modified foods. Similarly, scientists need the dialogue with society to act responsibly in developing potentially harmful technologies. And there is an even higher goal for the public engagement of science: Society fundamentally benefits from embracing the scientific culture of accuracy, truth seeking, critical questioning, healthy skepticism, respect for facts and uncertainties, and wonder at the richness of nature and the human spirit.

Robbert Dijkgraaf, a mathematical physicist, is director and Leon Levy Professor at the Institute for Advanced Study. This is adapted from his companion essay in Princeton University Press’s republication of Abraham Flexner’s The Usefulness of Useless Knowledge.
