Tuesday, May 30, 2006

Medical expertise

Previously I posted on the myth of expertise. Experts in many fields often perform no better than chance when making predictions.

This BusinessWeek article claims that no more than 25% of medical recommendations are supported by statistical data. Most medical practice is guided by tradition, or according to the doctor's intuition. The article profiles David Eddy, a heart surgeon who spent some time learning mathematics and earned a PhD in operations research. He revolutionized the field by pushing for "evidence-based" medicine. What is amazing to me is that this took so long and happened so recently.

...Even today, with a high-tech health-care system that costs the nation $2 trillion a year, there is little or no evidence that many widely used treatments and procedures actually work better than various cheaper alternatives.

This judgment pertains to a shocking number of conditions or diseases, from cardiovascular woes to back pain to prostate cancer. During his long and controversial career proving that the practice of medicine is more guesswork than science, Eddy has repeatedly punctured cherished physician myths. He showed, for instance, that the annual chest X-ray was worthless, over the objections of doctors who made money off the regular visit. He proved that doctors had little clue about the success rate of procedures such as surgery for enlarged prostates. He traced one common practice -- preventing women from giving birth vaginally if they had previously had a cesarean -- to the recommendation of one lone doctor. Indeed, when he began taking on medicine's sacred cows, Eddy liked to cite a figure that only 15% of what doctors did was backed by hard evidence.

A great many doctors and health-care quality experts have come to endorse Eddy's critique. And while there has been progress in recent years, most of these physicians say the portion of medicine that has been proven effective is still outrageously low -- in the range of 20% to 25%. "We don't have the evidence [that treatments work], and we are not investing very much in getting the evidence," says Dr. Stephen C. Schoenbaum, executive vice-president of the Commonwealth Fund and former president of Harvard Pilgrim Health Care Inc. "Clearly, there is a lot in medicine we don't have definitive answers to," adds Dr. I. Steven Udvarhelyi, senior vice-president and chief medical officer at Pennsylvania's Independence Blue Cross.

What's required is a revolution called "evidence-based medicine," says Eddy, a heart surgeon turned mathematician and health-care economist. Tall, lean, and fit at 64, Eddy has the athletic stride and catlike reflexes of the ace rock climber he still is. He also exhibits the competitive drive of someone who once obsessively recorded his time on every training run, and who still likes to be first on a brisk walk up a hill near his home in Aspen, Colo. In his career, he has never been afraid to take a difficult path or an unpopular stand. "Evidence-based" is a term he coined in the early 1980s, and it has since become a rallying cry among medical reformers. The goal of this movement is to pierce the fog that envelops the practice of medicine -- a state of ignorance for which doctors cannot really be blamed. "The limitation is the human mind," Eddy says. Without extensive information on the outcomes of treatments, it's fiendishly difficult to know the best approach for care.

The human brain, Eddy explains, needs help to make sense of patients who have combinations of diseases, and of the complex probabilities involved in each. To provide that assistance, Eddy has spent the past 10 years leading a team to develop the computer model that helped him crack the diabetes puzzle. Dubbed Archimedes, this program seeks to mimic in equations the actual biology of the body, and make treatment recommendations as well as figure out what each approach costs. It is at least 10 times "better than the model we use now, which is called thinking," says Dr. Richard Kahn, chief scientific officer at the American Diabetes Assn.

WASTED RESOURCES

Can one computer program offset all the ill-advised treatment options for a whole range of different diseases? The milestones in Eddy's long personal crusade highlight the looming challenges, and may offer a sliver of hope. Coming from a family of four generations of doctors, Eddy went to medical school "because I didn't know what else to do," he confesses. As a resident at Stanford Medical Center in the 1970s, he picked cardiac surgery because "it was the biggest hill -- the glamour field."

But he soon became troubled. He began to ask if there was actual evidence to support what doctors were doing. The answer, he was surprised to hear, was no. Doctors decided whether or not to put a patient in intensive care or use a combination of drugs based on their best judgment and on rules and traditions handed down over the years, as opposed to real scientific proof. These rules and judgments weren't necessarily right. "I concluded that medicine was making decisions with an entirely different method from what we would call rational," says Eddy.

About the same time, the young resident discovered the beauty of mathematics, and its promise of answering medical questions. In just a couple of days, he devoured a calculus textbook (now framed on a shelf in his beautifully appointed home and office), then blasted through the books for a two-year math course in a couple of months. Next, he persuaded Stanford to accept him in a mathematically intense PhD program in the Engineering-Economics Systems Dept. "Dave came in -- just this amazing guy," recalls Richard Smallwood, then a Stanford professor. "He had decided he wanted to spend the rest of his life bringing logic and rationality to the medical system, but said he didn't have the math. I said: 'Why not just take it?' So he went out and aced all those math courses."

To augment his wife's earnings while getting his PhD, Eddy landed a job at Xerox Corp.'s (XRX ) legendary Palo Alto Research Center. "They hired weird people," he says. "Here was a heart surgeon doing math. That was weird enough."

Eddy used his newfound math skills to model cancer screening. His Stanford PhD thesis made front-page news in 1980 by overturning the guidelines of the time. It showed that annual chest X-rays and yearly Pap smears for women at low risk of cervical cancer were a waste of resources, and it won the most prestigious award in the field of operations research, the Frederick W. Lanchester prize. Based on his results, the American Cancer Society changed its guidelines. "He's smart as hell, with a towering clarity of thought," says Stanford health economist Alain Enthoven.

Dr. William H. Herman, director of the Michigan Diabetes Research & Training Center, has a competing computer model that clashes with Eddy's. Nonetheless, he says, "Dr. Eddy is one of my heroes. He's sort of the father of health economics -- and he might be right."

..."At each meeting I would do the same exercise," he says. He would ask doctors to think of a typical patient and typical treatment, then write down the results of that treatment. For urologists, for instance, what were the chances that a man with an enlarged prostate could urinate normally after having corrective surgery? Eddy then asked the society's president to read the predictions.

The results were startling. The predictions of success invariably ranged from 0% to 100%, with no clear pattern. "All the doctors were trying to estimate the same thing -- and they all gave different numbers," he says. "I've spent 25 years proving that what we lovingly call clinical judgment is woefully outmatched by the complexities of medicine."

Friday, May 26, 2006

The big bucks

Hedge fund compensation is shocking, although I'll give credit to anyone who can actually generate alpha. The top performers deserve their money, and if the mediocre guys are overpaid, at least their investors are supposed to be sophisticated enough to know better. Jim Simons and his team actually donated funds to support the Relativistic Heavy Ion Collider, which ran out of money last year. A 29.5 percent return net of fees is incredible, especially since they've produced numbers like that consistently.

Most studies show that wealth and income inequality in the US are near all-time highs, matched only by the 1920's, just before the Great Depression.

NYTimes: Just when it seems as if things cannot get any better for the titans of investing, they get better — a lot better.

James Simons, a math whiz who founded Renaissance Technologies, made $1.5 billion in 2005, according to the survey by Alpha, a magazine published by Institutional Investor. That trumps the more than $1 billion that Edward S. Lampert, known for last year's acquisition of Sears, Roebuck, took home in 2004. (Don't fret for Mr. Lampert; he earned $425 million in 2005.) Mr. Simons's $5.3 billion flagship Medallion fund returned 29.5 percent, net of fees.

No. 2 on Alpha's list is T. Boone Pickens Jr., 78, the oilman who gained attention in the 1980's going after Gulf Oil, among other companies. He earned $1.4 billion in 2005, largely from startling returns on his two energy-focused hedge funds: 650 percent on the BP Capital Commodity Fund and 89 percent on the BP Capital Energy Equity Fund.

A representative for Mr. Simons declined to comment. Calls to Mr. Pickens's company were not returned.

The magic behind the money is the compensation structure of a hedge fund. Hedge funds, lightly regulated private investment pools for institutions and wealthy individuals, typically charge investors 2 percent of the money under management and a performance fee that generally starts at 20 percent of gains.

The stars often make a lot more than this "2 and 20" compensation setup. According to Alpha's list, Mr. Simons charges a 5 percent management fee and takes 44 percent of gains; Steven A. Cohen, of SAC Capital Advisors, charges a management fee of 1 to 3 percent and 44 percent of gains; and Paul Tudor Jones II, whose Tudor Investment Corporation has never had a down year since its founding in 1980, charges 4 percent of assets under management and a 23 percent fee.
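
To make the fee arithmetic concrete, here is a rough Python sketch of the "2 and 20" structure described above, applied to numbers like those quoted for Medallion. The fund size and fee levels come from the article; the 30% gross-return figure and the fee convention (performance fee on all gains, no hurdle or high-water mark) are my simplifying assumptions.

def manager_take(assets, gross_return, mgmt_fee=0.02, perf_fee=0.20):
    # Rough "2 and 20" arithmetic: management fee on assets, performance fee on gains.
    # Ignores hurdles, high-water marks, and the manager's own capital in the fund.
    mgmt = assets * mgmt_fee
    gains = assets * gross_return
    perf = max(0.0, gains) * perf_fee
    return mgmt + perf

# Standard 2-and-20 on a $5.3 billion fund returning 30% gross (illustrative round number)
print(manager_take(5.3e9, 0.30))                                # ~$424 million
# Medallion-style 5-and-44 terms quoted above, same fund size and return
print(manager_take(5.3e9, 0.30, mgmt_fee=0.05, perf_fee=0.44))  # ~$965 million

Even these crude numbers put the manager's take in the high hundreds of millions -- the right order of magnitude for the figures Alpha reports.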

Tuesday, May 23, 2006

OFHEO Fannie Mae smackdown

The OFHEO report is out. A $400M fine for Fannie... not a whitewash like the previous Rudman report... look for shareholder lawsuits to claw back some of Franklin Raines' $40M in compensation.

See previous posts here and here.

Sunday, May 21, 2006

More and more

Last week I had almost identical discussions with several different professors (including a former dean of the business school) about narrow specialization in academia. We all agreed that the way to get ahead is to stake out your turf in one narrow area and defend it at all costs.

I, however, specifically became a physicist in order to think about new and interesting things -- even things not traditionally considered physics! While the typical academic is someone who knows more and more about less and less, I think my motto is to learn more and more about more and more :-)

I don't think I could stand to spend all my time writing the (N+1)th paper on some speculative model (which I don't really believe to be a correct description of Nature), or on some straightforward application of known techniques, just to get citations. Instead, I'll take the quixotic path of working on totally new things every few years. But of course, as noted by everyone I talked to, I can expect only punishment for deviating from the norm!

Marcus Aurelius:
"Or does the bubble reputation distract you? Keep before your eyes the swift onset of oblivion, and the abysses of eternity before us and behind; mark how hollow are the echoes of applause, how fickle and undiscerning the judgments of professed admirers, and how puny the arena of human fame. For the entire earth is but a point, and the place of our own habitation but a minute corner in it; and how many are therein who will praise you, and what sort of men are they?"

Tuesday, May 16, 2006

The great migration

This discussion (available both as a stream and a podcast) is one of the best I've heard about what may be the greatest migration in history -- the movement of over 100M Chinese from rural villages to coastal cities over the last 20 years. The discussants include Peter Hessler, the New Yorker writer (and former English teacher in China) whose dispatches have appeared periodically in the magazine; Leslie Chang, who covers China for the WSJ (I've posted some of her articles here); and an academic economist and an anthropologist who study migration.

Parallels with the industrial revolution in the UK and US are discussed. The plight of migrants in cities where they have limited legal rights... the myth of the sweatshop vs the dreary life of traditional agriculture... the transformative effects on rural China... all are important factors underlying modernization and globalization.

Comments:

1) 100M is a lot of people, but there are plenty more. Over half a billion people still work in agriculture. Although efficiency has improved in the post-Deng era, Chinese agriculture still has a long way to go. Efficiencies approaching modern levels would mean an order of magnitude fewer agricultural workers -- meaning hundreds of millions more migrants to the cities (see the back-of-envelope sketch after this list).

2) Factory work may seem unpleasant by US standards (although probably similar to what workers endured here only a century ago), but trainloads of young people in search of work arrive in coastal cities every day. They obviously *prefer* factory work over life on the farm. The economist in the discussion notes that once migrants are in the city they often want to work as many hours as possible to make more money.

3) Independent city living may be the single greatest pro-feminist force in China. Once young women have made it on their own they are unwilling to submit again to traditional patriarchal conditions that persist in the countryside.
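
Here is the back-of-envelope arithmetic behind point 1), in Python, with round numbers that are assumptions rather than data:

current_farm_workers = 500e6          # "over half a billion" still in agriculture (rough figure)
efficiency_factor = 10                # "an order of magnitude fewer" workers at modern efficiency
needed = current_farm_workers / efficiency_factor
surplus = current_farm_workers - needed
print(f"Potential additional migrants: ~{surplus / 1e6:.0f} million")   # ~450 million

Compare that to the roughly 100 million who have already moved: the migration is arguably still in its early stages.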

The World, a recent film by Jia Zhangke, does a wonderful job of capturing the urban-migrant zeitgeist -- especially the feelings of dislocation and loneliness. I recommend it highly. (Times review.)

Saturday, May 13, 2006

Universal library

A nice article in the Times updates us on how the various book-digitizing projects are coming along. As usual, the lawyers are gumming up progress :-)

I look forward, in a few years, to storing 10 million books (equivalent to, e.g., Widener Library at Harvard) plus an image of the entire Web on the terabyte drive of my laptop. This is completely feasible technically -- the main barriers are legal and economic. Sadly, this does suggest that I have more faith in storage technologists than in a superfast broadband rollout. If we had really good broadband I wouldn't have to carry any data around with me on my laptop!
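
Here is the back-of-envelope version in Python; the per-book text size and the compression ratio are guesses, not measurements:

books = 10_000_000                 # roughly a Widener-sized collection
raw_mb_per_book = 1.0              # ~100k words of plain text per book, order-of-magnitude guess
compression_ratio = 4              # typical for plain text with a general-purpose compressor

total_tb = books * (raw_mb_per_book / compression_ratio) / 1e6   # 1 TB = 1e6 MB
print(f"Compressed text only: ~{total_tb:.1f} TB")               # ~2.5 TB

Under these guesses the text alone comes to a few terabytes -- within a small factor of a single drive -- while scanned page images would of course be much larger.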

See previous post with more numbers here.

Like many other functions in our global economy, however, the real work has been happening far away, while we sleep. We are outsourcing the scanning of the universal library. Superstar, an entrepreneurial company based in Beijing, has scanned every book from 900 university libraries in China. It has already digitized 1.3 million unique titles in Chinese, which it estimates is about half of all the books published in the Chinese language since 1949. It costs $30 to scan a book at Stanford but only $10 in China.

Raj Reddy, a professor at Carnegie Mellon University, decided to move a fair-size English-language library to where the cheap subsidized scanners were. In 2004, he borrowed 30,000 volumes from the storage rooms of the Carnegie Mellon library and the Carnegie Library and packed them off to China in a single shipping container to be scanned by an assembly line of workers paid by the Chinese. His project, which he calls the Million Book Project, is churning out 100,000 pages per day at 20 scanning stations in India and China. Reddy hopes to reach a million digitized books in two years.

The idea is to seed the bookless developing world with easily available texts. Superstar sells copies of books it scans back to the same university libraries it scans from. A university can expand a typical 60,000-volume library into a 1.3 million-volume one overnight. At about 50 cents per digital book acquired, it's a cheap way for a library to increase its collection. Bill McCoy, the general manager of Adobe's e-publishing business, says: "Some of us have thousands of books at home, can walk to wonderful big-box bookstores and well-stocked libraries and can get Amazon.com to deliver next day. The most dramatic effect of digital libraries will be not on us, the well-booked, but on the billions of people worldwide who are underserved by ordinary paper books." It is these underbooked — students in Mali, scientists in Kazakhstan, elderly people in Peru — whose lives will be transformed when even the simplest unadorned version of the universal library is placed in their hands.

...Just as a Web article on, say, aquariums, can have some of its words linked to definitions of fish terms, any and all words in a digitized book can be hyperlinked to other parts of other books. Books, including fiction, will become a web of names and a community of ideas.

Search engines are transforming our culture because they harness the power of relationships, which is all links really are. There are about 100 billion Web pages, and each page holds, on average, 10 links. That's a trillion electrified connections coursing through the Web. This tangle of relationships is precisely what gives the Web its immense force. The static world of book knowledge is about to be transformed by the same elevation of relationships, as each page in a book discovers other pages and other books. Once text is digital, books seep out of their bindings and weave themselves together. The collective intelligence of a library allows us to see things we can't see in a single, isolated book.

When books are digitized, reading becomes a community activity. Bookmarks can be shared with fellow readers. Marginalia can be broadcast. Bibliographies swapped. You might get an alert that your friend Carl has annotated a favorite book of yours. A moment later, his links are yours. In a curious way, the universal library becomes one very, very, very large single text: the world's only book.

...So what happens when all the books in the world become a single liquid fabric of interconnected words and ideas? Four things: First, works on the margins of popularity will find a small audience larger than the near-zero audience they usually have now. Far out in the "long tail" of the distribution curve — that extended place of low-to-no sales where most of the books in the world live — digital interlinking will lift the readership of almost any title, no matter how esoteric. Second, the universal library will deepen our grasp of history, as every original document in the course of civilization is scanned and cross-linked. Third, the universal library of all books will cultivate a new sense of authority. If you can truly incorporate all texts — past and present, multilingual — on a particular subject, then you can have a clearer sense of what we as a civilization, a species, do know and don't know. The white spaces of our collective ignorance are highlighted, while the golden peaks of our knowledge are drawn with completeness. This degree of authority is only rarely achieved in scholarship today, but it will become routine.

Finally, the full, complete universal library of all works becomes more than just a better Ask Jeeves. Search on the Web becomes a new infrastructure for entirely new functions and services. Right now, if you mash up Google Maps and Monster.com, you get maps of where jobs are located by salary. In the same way, it is easy to see that in the great library, everything that has ever been written about, for example, Trafalgar Square in London could be present on that spot via a screen. In the same way, every object, event or location on earth would "know" everything that has ever been written about it in any book, in any language, at any time. From this deep structuring of knowledge comes a new culture of interaction and participation.

The main drawback of this vision is a big one. So far, the universal library lacks books. Despite the best efforts of bloggers and the creators of the Wikipedia, most of the world's expertise still resides in books. And a universal library without the contents of books is no universal library at all.

There are dozens of excellent reasons that books should quickly be made part of the emerging Web. But so far they have not been, at least not in great numbers. And there is only one reason: the hegemony of the copy.

Wednesday, May 10, 2006

No US topcoders?

TopCoder is a global programming competition with thousands of competitors worldwide. If I recall correctly, in the early rounds you qualify by taking online tests, then advance to regional and global finals. Any competition like this is just a crude evaluator of talent (although I'm sure anyone who can win TopCoder is very talented), and perhaps not entirely predictive of real-world performance. Nevertheless, it is alarming how poorly Americans are doing at the competition. There were similarly poor results in the recent ACM collegiate programming championships, in which no US team made the top 10.

The two Americans mentioned in the WSJ article below are both Caltech guys! (The older brother just graduated, I think.)

Cause for Concern? Americans Are Scarce in Top Tech Contest
May 10, 2006

The results have been carefully tabulated by a computer and, thus, are beyond dispute: Of the 48 best computer programmers in the world, only four of them are Americans. But what that bit of data says about the state of the U.S. education system is open to debate.

Back in February, I wrote about a computer-programming competition run by an outfit called TopCoder. That event was part of the run-up to the global finals held last week in Las Vegas. If you have trouble putting "computer programming" and "spectator sport" in the same sentence, you haven't been to one of these contests. From the gasps, moans and cheers as the audience watched the scoreboard tracking the contestants, you'd have thought you were at a World Cup match.

As noted in February, these competitions were dominated at their start in 2001 by Americans, but that's no longer the case -- not by a long shot. In fact, of the four Americans who won the top seats out of 4,500 contestants, two were brothers: Po-Shen Loh, 23, a graduate student in math at Princeton University, and his 21-year-old sibling, Po-Ru, now an undergraduate at CalTech. Both were born in the Midwest of parents who had emigrated to the U.S. from Singapore; their father is a professor of statistics at the University of Wisconsin at Madison.

By contrast, there were eight from Russia, and four each from Norway and China. The biggest delegation -- 11 -- came from Poland.

So, is all this more evidence of a sad decline in American education and competitiveness?

Surprisingly, the Eastern Europeans don't seem to think so. Poland's Krzysztof Duleba, 22, explained that in countries like his own, there are so few economic opportunities for students that competitions like these are their one chance to participate in the global economy. Some of the Eastern Europeans even seemed slightly embarrassed by their over-representation, saying it isn't evidence of any superior schooling or talent so much as an indicator of how much they have to prove.

Much of Poland's abundant interest in coding contests can be traced to Tomasz Czajka, who as a multiple TopCoder champion has won more than $100,000 in prize money since the competition began. That has made him something of a national hero back home, and other students have been eager to follow suit.

Each round of competition had three problems: easy, medium and hard. The hard problem of the final round required contestants to figure out the most efficient way of using computer cable to connect different nodes in a network.

Naturally, the actual problem was vastly more complicated than that description makes it seem. John Dethridge, an Australian contestant, said the average computer-science undergraduate might not be able to solve that third problem at all, much less do so in the 90 minutes the contestants had to tackle all three.
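
As described, the hard problem sounds like a dressed-up version of the classic minimum spanning tree problem (the contest version was surely much nastier). For readers who haven't seen it, here is a minimal Kruskal's-algorithm sketch in Python -- the textbook idea, not the contest solution:

def minimum_spanning_tree(n, edges):
    # Kruskal's algorithm: connect nodes 0..n-1 with minimum total cable cost.
    # edges is a list of (cost, u, v) tuples.
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    for cost, u, v in sorted(edges):  # consider cheapest cables first
        ru, rv = find(u), find(v)
        if ru != rv:                  # add a cable only if it links two separate clusters
            parent[ru] = rv
            total += cost
            chosen.append((u, v))
    return total, chosen

# Four nodes, five candidate cables: the cheapest way to connect everything costs 6
print(minimum_spanning_tree(4, [(1, 0, 1), (4, 0, 2), (2, 1, 2), (3, 2, 3), (5, 1, 3)]))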

The final round involved eight contestants culled during the first two days of competition. None of the Americans made the final cut; instead, there were two Russians, two Poles, and coders from Australia, China, Japan and Slovakia. One of the Russians, Petr Mitrichev, 21, won, taking home $20,000 for his efforts.

Others attending the contest cautioned against reaching any sky-is-falling conclusions about the relative lack of success of Americans.

Ken Vogel, a former TopCoder contestant who was at the event recruiting for his current employer, UBS, noted that in the real world, programmers need many other skills in addition to the ability to solve quickly some discrete and entirely artificial problems. These include, he said, thinking about the big picture, working well in teams, and anticipating the sorts of things that users of computers and computer software might actually want.

It's not at all clear that any of the famous U.S. technology entrepreneurs of the past several decades would have done particularly well at such a contest.

Still, when contemplating how out of place some of the strongly disciplined Russian or Polish programmers would be among American college students, who all too often become either slackers or salary-obsessed careerists out for the easy score, it's hard not to be depressed.

American contestant Po-Shen Loh recalled recently happening upon an afternoon TV cartoon aimed at toddlers, in which a stereotypically brainy student was being teased by his classmates. "They were making fun of the smart one," he sighed. "If this is what American kids are watching even before they know any better, it can't help but affect them later on."

The TopCoder company pays for these events in part by attracting sponsors who pony up for the privilege of recruiting from among the contestants. One of the sponsors was the National Security Agency, which, as keeper of America's state secrets, isn't allowed to hire noncitizens. That makes it one of the few employers anywhere who can't participate in the globalization of the computer industry that the contest represents.

The other sponsors, however, were all smiles.

Saturday, May 06, 2006

Social mobility

Andrew Hacker reviews several new books on class and inequality in the NY Review of Books. The table below summarizes both upward and downward mobility from a recent study. I've seen claims that social mobility in the US has decreased, and is now lower than in some European countries (particularly the Scandinavian countries, in which university education is free). Nevertheless, you can see that about a third of kids from the top or bottom quintile will jump by two or more quintiles in their lives (the poor kids moving up and the rich kids falling from affluence). Furthermore, the distributions are relatively symmetric.



(BTW, the article by physicist Jeremy Bernstein on nuclear weapons and intelligence failures in the same issue is very good.)

Wednesday, May 03, 2006

Crypto and car theft

Ever wonder what kind of handshake is going on between your luxury car and its keyless entry fob? Designing a cryptographically sound handshake seems like a solvable problem, although how strong a cipher you can run depends a bit on how much CPU power the fob has.
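
For the curious, here is a toy sketch in Python of a generic challenge-response handshake of the kind such a system could use. This is not any manufacturer's actual protocol, and it uses a full-strength HMAC that a cheap fob microcontroller may not have the horsepower or battery budget for.

import hmac, hashlib, os

# Toy model: a secret shared by the car and the fob (in reality burned in at the factory)
SECRET_KEY = os.urandom(16)

def car_challenge():
    # The car broadcasts a fresh random nonce, so recorded responses can't simply be replayed.
    return os.urandom(8)

def fob_response(challenge, key=SECRET_KEY):
    # The fob proves it knows the secret without ever transmitting the secret itself.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def car_verify(challenge, response, key=SECRET_KEY):
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = car_challenge()
assert car_verify(nonce, fob_response(nonce))   # doors unlock

Presumably the attacks described below exploit corners cut somewhere in this picture -- short keys, weak proprietary ciphers, predictable challenges -- rather than a flaw in the basic idea.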

Gone in 20 minutes

High-tech thieves are becoming increasingly savvy when it comes to stealing automobiles equipped with keyless entry and ignition systems. While many computer-based security systems on automobiles require some type of key — mechanical or otherwise — to start the engine, so-called ‘keyless’ setups require only the presence of a key fob to start the engine.

The expert gang suspected of stealing two of David Beckham’s BMW X5 SUVs in the last six months did so by using software programs on a laptop to wirelessly break into the car’s computer, open the doors, and start the engine.

“It’s difficult to steal cars with complex security, but not impossible. There are weaknesses in any system,” Tim Hart of the Auto Locksmith Association told the U.K.’s Auto Express magazine. “At key steps the car’s software can halt progress for up to 20 minutes as part of its in-built protection,” said Hart.

Because the decryption process can take a while — up to 20 minutes, according to Hart — the thieves usually wait to find the car in a secluded area where it will be left for a long period. That is believed to be what happened to Mr. Beckham — the crooks followed him to the mall where he was to have lunch, and went to work on his X5 after it was parked.

While automakers and locksmiths are supposed to be the only groups that know where and how security information is stored in a car, the information eventually falls into the wrong hands.

...The Leftlane Perspective: Many modern cars now rely on software entirely for security. Gone are the days where microchips supplemented mechanical locks as an additional security measure. In the case of true ‘keyless’ systems, software is the only thing between a thief and your car. As computers become more powerful, will stealing cars become even easier? Never mind future cars with better security — what about today’s cars a few years down the road? With cars as inexpensive as the Toyota Camry offering entirely keyless systems, these concerns are relevant to all consumers.

The sad state of US particle physics

The Economist summarizes the sorry state of hep-ex in the US. Even in the optimistic scenario, in which the ILC is sited here, America will be a backwater for experimental particle physics for a decade to come. Does anyone other than a few particle physicists care? The NRC panel had to recruit economist and former Princeton president Harold Shapiro to lend gravitas to the report. What does Shapiro know about fundamental science or technology? What does this say about the prestige of physics today relative to its glory days 50 years ago?

A report on the state of fundamental physics in America

NEAR Waxahachie in Texas, there is a hole in the ground. Not just any old hole. This one is almost 23km long and curves in what would be, if it were extended, a circular loop. It is the site of what was intended to be the world's biggest and best particle accelerator, a machine capable of unlocking some of the fundamental secrets of nature itself. Ever since the project to build it was cancelled in 1993, after nearly $2 billion had been spent on construction, America's lead in particle physics has been shrinking. This week, a report by the country's National Research Council (NRC) outlined what America can do to regain its pre-eminence.

The outlook is grim. After decades of making discoveries about the fundamental building blocks of nature, America's particle-physics colliders are to close. The Tevatron at Fermilab, near Chicago, is the world's highest-energy particle-smasher. But that honour will be wrenched from it next year, when the Large Hadron Collider (LHC) opens for business at CERN, the European particle-physics laboratory near Geneva. The Tevatron is scheduled to shut by the end of the decade, and the LHC is expected to dominate international particle physics for the next 15 years.

America's other accelerators are in trouble, too. Work at the Stanford Linear Accelerator Centre is moving away from particle physics and into generating high-energy X-rays. Funding for the Relativistic Heavy Ion Collider at the Brookhaven National Laboratory is so tight that the machine managed to keep running only after a philanthropist intervened. Many American particle physicists have switched their attentions to the LHC. And while physicists dream of shiny new machines, none is scheduled to be built in America. In the review of its top ten research advances of 2005, Science singled out American particle physics for the booby prize.

This is a shame. While particle physics does not produce economic returns in the lifetimes of the administrations that fund the experiments, it is a fascinating endeavour. Over the past 100 years, physicists have succeeded in identifying the fundamental building blocks of the universe, both in terms of matter and of the forces that act on it. Yet the so-called Standard Model of particle physics, which weaves these discoveries together, and which has proved so successful to date, is incomplete. If physicists are to improve on it, they need machines that supply particles at higher energies to probe the nature of space and time. Such machines could provide answers about how the universe began and how it will evolve.

In this light, the NRC report makes interesting if somewhat biased reading. It wants America to retake the lead in particle physics, and to host the next new particle accelerator to be built after the LHC. A decision on whether to build this machine, dubbed the International Linear Collider, will not be taken until the initial results from the LHC are known. If these point to more fundamental physics beyond the Standard Model, as is widely expected, then there would be mileage in building a more powerful accelerator to study these phenomena. Such a decision would not be made until 2010, but America's physicists are keen for their country to host such a facility.

To do so, the NRC reckons, America will need to spend between 2% and 3% more each year in real terms on particle physics. That might not sound onerous, but it amounts to $500m over the next five years.

European particle physicists are also keen to lead the world. A similar panel of experts is close to finalising a European strategy. The CERN Council Strategy Group, as it is called, will meet near Berlin next week to hammer out a consensus. Then, in July, this consensus will be presented to the politicians who decide the funding of particle physics.

At present, Europe is poised to take the lead: it will soon have the world's biggest and best particle accelerator. Whether CERN could afford to host the International Linear Collider, though, is a moot point. The laboratory has had to take out loans to build the LHC, and it is likely that any future money would be spent on upgrading that machine.

Then there is Japan, which has a promising programme in neutrino physics, another area where the report urges America to push forwards. Japan recently lost the fight to host the International Thermonuclear Experimental Reactor, an expensive international project which aims to demonstrate that nuclear fusion could produce power on a commercial basis, so it feels it is owed something. And, depending on the strengths of their economies when the time comes to make the decision, Russia, China and India could all be interested, too.

In any event, American particle physics looks set for a period in the wilderness. Whether the country can recapture its superiority after such a spell depends on whether there is the political will to pour more money into holes in the ground.
