In Smolin's hill-climbing analogy, the dominant strategy in science today is:
1) assess your own climbing ability
2) choose suitable hill (perhaps inherited from advisor!)
3) climb to local maximum (write some relevant papers with incremental results)
4) squat on hilltop and defend against all attackers (make sure everyone cites your papers; get embedded in small community of researchers defending that hill)
5) train students and postdocs on your hilltop while secretly wishing you understood what other people were doing on their hilltops -- suppressing the curiosity that originally got you into science.
From personal experience, I can tell you that when you leave your little hill to cross a valley and explore somewhere else, the citations of your previous work will plummet, inhabitants of other hills will try to repel you, and funding agencies will ask why you aren't doing mainstream stuff ("he's not serious -- he keeps jumping around"). Based on this incentive system, it is easy to understand why people behave as they do.
...returns on research investment do not arrive steadily and predictably, but erratically and unpredictably, in a manner akin to intellectual earthquakes. Indeed, this idea seems to be more than merely qualitative. Data on human innovation, whether in basic science, technology or business, show that developments emerge from an erratic process with wild unpredictability. For example, as physicist Didier Sornette of ETH Zurich and colleagues showed a few years ago, the statistics describing the gross revenues of Hollywood movies over the past 20 years do not follow normal statistics but a power-law curve, closely resembling the famous Gutenberg-Richter law for earthquakes, with a long tail for high-revenue films. A similar pattern describes the financial returns on new drugs produced by the biotech industry, on royalties from patents granted to universities, and on stock-market returns from high-tech start-ups.
What we know of processes with power-law dynamics is that the largest events are hugely disproportionate in their consequences. In the metaphor of Nassim Nicholas Taleb’s 2007 best seller The Black Swan, it is not the normal events, the mundane and expected “white swans” that matter the most, but the outliers, the completely unexpected “black swans”. In the context of history, think 11 September 2001 or the invention of the Web. Similarly, scientific history seems to pivot on the rare seismic shifts that no-one predicts or even has a chance of predicting, and on those utterly profound discoveries that transform worlds. They do not flow out of what the philosopher of science Thomas Kuhn called “normal science” — the paradigm-supporting and largely mechanical working out of established ideas — but from “revolutionary”, disruptive and risky science.
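To make the contrast concrete, here is a small illustrative sketch (my own, not from the article, and the tail exponent and sample size are arbitrary choices): it draws simulated "returns" from a heavy-tailed power-law (Pareto) distribution and from a normal distribution, then reports what fraction of the total the ten largest events carry in each case. Under the power law a handful of outliers dominate the sum; under the normal distribution they barely register.

```python
# Illustrative sketch (not from the article): how much of the total "return"
# is concentrated in the largest few events under a heavy-tailed power law
# versus a normal distribution. Parameters are arbitrary demonstration values.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Power-law (Pareto) samples with tail exponent alpha; smaller alpha => heavier tail.
alpha = 1.5
power_law = rng.pareto(alpha, n) + 1.0  # classical Pareto with minimum value 1

# Normal ("white swan") samples with a comparable mean, truncated at zero.
normal = np.clip(rng.normal(loc=power_law.mean(), scale=1.0, size=n), 0, None)

def share_of_top(samples, k=10):
    """Fraction of the total contributed by the k largest events."""
    top_k = np.sort(samples)[-k:]
    return top_k.sum() / samples.sum()

print(f"Power law: top 10 of {n} events carry {share_of_top(power_law):.1%} of the total")
print(f"Normal:    top 10 of {n} events carry {share_of_top(normal):.1%} of the total")
```

The same concentration is what makes short-window assessment so noisy: a sample that happens to miss the few "black swan" events badly underestimates the long-run return.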
Squeezing life out of innovation
All of which, as Sornette has been arguing for several years, has important implications for how we think about and judge research investments. If the path to discovery is full of surprises, and if most of the gains come in just a handful of rare but exceptional events, then even judging whether a research programme is well conceived is deeply problematic. “Almost any attempt to assess research impact over a finite time”, says Sornette, “will include only a few major discoveries and hence be highly unreliable, even if there is a true long-term positive trend.”
This raises an important question: does today’s scientific culture respect this reality? Are we doing our best to let the most important and most disruptive discoveries emerge? Or are we becoming too conservative and constrained by social pressure and the demands of rapid and easily measured returns? The latter possibility, it seems, is of growing concern to many scientists, who suggest that modern science is in danger of losing its creativity unless we can find a systematic way to build a more risk-embracing culture.
The voices making this argument vary widely. For example, the physicist Geoffrey West, who is currently president of the Santa Fe Institute (SFI) in New Mexico, US, points out that in the years following the Second World War, US industry created a steady stream of paradigm-changing innovations, including the transistor and the laser, and it happened because places such as Bell Labs fostered a culture of enormously free innovation. “They brought together serious scientists — physicists, engineers and mathematicians — from across disciplines”, says West, “and created a culture of free thinking without which it’s hard to imagine how these ideas could have come about.”
Unfortunately, today’s academic and corporate cultures seem to be moving in the opposite direction, with practices that stifle risk-taking mavericks who have a broad view of science. At universities and funding agencies, for example, tenure and grant committees take decisions based on narrow criteria (focusing on publication lists, citations and impact factors) or on specific plans for near-term results, all of which inherently favour those working in established fields with well-accepted paradigms. In recent years, tightening business practices and efforts to improve efficiency have also driven corporations in a similar direction. “That may be fine in the accounting department,” says West, “but it’s squeezing the life out of innovation.”
...But physicist Lee Smolin, currently at the Perimeter Institute, suggests that science overall requires a much broader and more coherent approach to risky science. To see the kinds of policies needed, he suggests, it is useful to note that scientists, at least in some rough approximation, follow working styles of two very different kinds, which mirror Kuhn’s distinction between normal and revolutionary science.
Some scientists, he suggests, are what we might call “hill climbers”. They tend to be highly skilled in technical terms, and their work mostly takes established lines of insight and pushes them further; they climb upward into the hills of some abstract space of scientific fitness, always taking small steps to improve the agreement between theory and observation. These scientists do “normal” science. In contrast, other scientists are more radical and adventurous in spirit, and they can be seen as “valley crossers”. They may be less skilled technically, but they tend to have strong scientific intuition: the ability to spot hidden assumptions and to look at familiar topics in totally new ways.
To be most effective, Smolin argues, science needs a mix of hill climbers and valley crossers. Too many hill climbers doing normal science, and you end up sooner or later with lots of them stuck on the tops of local hills, each defending their own territory. Science then suffers from a shortage of valley crossers able to strike out from those intellectually tidy positions to explore further away and find higher peaks.
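Smolin's metaphor maps neatly onto local search. The toy sketch below is my own illustration (not from the article; the landscape function and parameters are arbitrary): a greedy climber that only takes small uphill steps stalls on the first local peak it reaches, while the same climber allowed an occasional long "valley-crossing" jump ends up on a higher one.

```python
# Illustrative sketch (not from the article): greedy hill climbing on a toy
# 1-D "scientific fitness" landscape versus the same climber allowed an
# occasional long "valley-crossing" jump. All parameters are arbitrary.
import math
import random

def fitness(x):
    # Rugged landscape: a series of local peaks, each slightly higher than the last.
    return math.sin(x) + 0.1 * x

def hill_climb(x, steps=1000, step_size=0.1, jump_prob=0.0, jump_size=5.0):
    """Greedy local search; with probability jump_prob, try a long jump instead."""
    for _ in range(steps):
        if random.random() < jump_prob:
            candidate = x + random.uniform(-jump_size, jump_size)
        else:
            candidate = x + random.uniform(-step_size, step_size)
        if fitness(candidate) > fitness(x):  # only accept improvements
            x = candidate
    return x

random.seed(0)
start = 1.0
pure_climber = hill_climb(start)                    # stalls near the first local peak
valley_crosser = hill_climb(start, jump_prob=0.05)  # occasionally leaps to other hills
print(f"hill climber:   x = {pure_climber:.2f}, fitness = {fitness(pure_climber):.2f}")
print(f"valley crosser: x = {valley_crosser:.2f}, fitness = {fitness(valley_crosser):.2f}")
```

The point of the sketch is only the qualitative one Smolin makes: pure hill climbing converges quickly but gets trapped, and a community needs some agents willing to accept the cost of crossing valleys to find the higher peaks at all.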