Friday, October 30, 2009

Selectivity of US colleges and returns to elite education

Below is some work on US college admissions by economist Caroline Hoxby. There has been a recent increase in the number of American students who are interested in attending an elite university, possibly far from home. This accounts for almost all of the increased competitiveness of college admissions, and the effect is confined to only a small subset of elite schools. Selectivity at typical state universities has not increased.

These observations are consistent with my own experience. Growing up in Iowa, I remember many academically talented students from my high school (often children of professors) simply attending the local state university by default. Today, those same kids would be angling for admission to the Ivy League, Stanford, MIT, Caltech, etc.

However, it's not clear what caused this increased interest in elite education...

Near the end of the paper Hoxby reviews the literature on the returns to elite higher education. Her estimates of the rate of return are on the high side (definitely worthwhile, in her Harvard summa and Rhodes opinion ;-), and she casts some doubt on the Dale and Krueger study which compared elite graduates against others who were accepted to an elite school but chose not to attend. Dale and Krueger concluded that the choice of college had little impact on future earnings, but Hoxby suggests there was something "odd" about the 1 in 10 kids who declined elite admission :-)

See related posts: returns to elite education, higher education and human capital.

Note to efficient market / rational agent / Chicago School believers: apparently even leading economists cannot agree on the size of returns to elite education. The economists, who are able to look backward in time and use massive statistics, have a much easier task than an individual student, who has to look forward 30 years to decide which choice is better. (Sound like projecting the cash flows from a bundle of mortgages over the next 30 years?) Therefore, it is hard to believe that individuals are clever enough to act like "rational agents" in solving this problem, or making any other life decision.

The Changing Selectivity of American Colleges

Caroline M. Hoxby
NBER Working Paper No. 15446
Issued in October 2009

This paper shows that although the top ten percent of colleges are substantially more selective now than they were 5 decades ago, most colleges are not more selective. Moreover, at least 50 percent of colleges are substantially less selective now than they were then. This paper demonstrates that competition for space--the number of students who wish to attend college growing faster than the number of spaces available--does not explain changing selectivity. The explanation is, instead, that the elasticity of a student's preference for a college with respect to its proximity to his home has fallen substantially over time and there has been a corresponding increase in the elasticity of his preference for a college with respect to its resources and peers. In other words, students used to attend a local college regardless of their abilities and its characteristics. Now, their choices are driven far less by distance and far more by a college's resources and student body. It is the consequent re-sorting of students among colleges that has, at once, caused selectivity to rise in a small number of colleges while simultaneously causing it to fall in other colleges. I show that the integration of the market for college education has had profound implications on the peers whom college students experience, the resources invested in their education, the tuition they pay, and the subsidies they enjoy. An important finding is that, even though tuition has been rising rapidly at the most selective schools, the deal students get there has arguably improved greatly. The result is that the "stakes" associated with admission to these colleges are much higher now than in the past.

More here from the Atlantic Monthly:

... The researchers Alan Krueger and Stacy Berg Dale began investigating this question, and in 1999 produced a study that dropped a bomb on the notion of elite-college attendance as essential to success later in life. Krueger, a Princeton economist, and Dale, affiliated with the Andrew Mellon Foundation, began by comparing students who entered Ivy League and similar schools in 1976 with students who entered less prestigious colleges the same year. They found, for instance, that by 1995 Yale graduates were earning 30 percent more than Tulane graduates, which seemed to support the assumption that attending an elite college smoothes one's path in life.

But maybe the kids who got into Yale were simply more talented or hardworking than those who got into Tulane. To adjust for this, Krueger and Dale studied what happened to students who were accepted at an Ivy or a similar institution, but chose instead to attend a less sexy, "moderately selective" school. It turned out that such students had, on average, the same income twenty years later as graduates of the elite colleges. Krueger and Dale found that for students bright enough to win admission to a top school, later income "varied little, no matter which type of college they attended." In other words, the student, not the school, was responsible for the success.

... Which brings us back to the Krueger-Dale thesis. Can we really be sure Hamilton is nearly as good as Harvard?

Some analysts maintain that there are indeed significant advantages to the most selective schools. For instance, a study by Caroline Hoxby, a Harvard economist who has researched college outcomes, suggests that graduates of elite schools do earn more than those of comparable ability who attended other colleges. Hoxby studied male students who entered college in 1982, and adjusted for aptitude, though she used criteria different from those employed by Krueger and Dale. She projected that among students of similar aptitude, those who attended the most selective colleges would earn an average of $2.9 million during their careers; those who attended the next most selective colleges would earn $2.8 million; and those who attended all other colleges would average $2.5 million. This helped convince Hoxby that top applicants should, in fact, lust after the most exclusive possibilities.

"There's a clear benefit to the top fifty or so colleges," she says. "Connections made at the top schools matter. It's not so much that you meet the son of a wealthy banker and his father offers you a job, but that you meet specialists and experts who are on campus for conferences and speeches. The conference networking scene is much better at the elite universities." Hoxby estimates that about three quarters of the educational benefit a student receives is determined by his or her effort and abilities, and should be more or less the same at any good college. The remaining quarter, she thinks, is determined by the status of the school—higher-status schools have more resources and better networking opportunities, and surround top students with other top students.

"Today there are large numbers of colleges with good faculty, so faculty probably isn't the explanation for the advantage at the top," Hoxby says. "Probably there is not much difference between the quality of the faculty at Princeton and at Rutgers. But there's a lot of difference between the students at those places, and some of every person's education comes from interaction with other students." Being in a super-competitive environment may cause a few students to have nervous breakdowns, but many do their best work under pressure, and the contest is keenest at the Gotta-Get-Ins. Hoxby notes that some medium-rated public universities have established internal "honors colleges" to attract top performers who might qualify for the best destinations. "Students at honors colleges in the public universities do okay, but not as well as they would do at the elite schools," Hoxby argues. The reason, she feels, is that they're not surrounded by other top-performing students.

Tuesday, October 27, 2009

Do flu vaccines work?

The more I learn about the quality standards of medical research, the less I believe what my doctor tells me! Notice how hard it is, even in a "scientific" context, to oppose the conventional wisdom.

Related posts: bounded cognition, Bouchard against group think, the future of innovation.

More from Gary Taubes on diet and nutrition issues, and WIRED on placebo mysteries.

As a postdoc I briefly dated a woman who was doing graduate work at Harvard's school of public health. I was a bit surprised at her low opinion of the scientific and statistical prowess of medical doctors, at least until I thought back to all the pre-med students I had taught while in grad school ;-) She pointed out to me that most MD's (as opposed to MD/PhD's) aren't scientists at all -- they simply apply what they're taught in medical school.

Atlantic Monthly: ... When Lisa Jackson, a physician and senior investigator with the Group Health Research Center, in Seattle, began wondering aloud to colleagues if maybe something was amiss with the estimate of 50 percent mortality reduction for people who get flu vaccine, the response she got sounded more like doctrine than science. “People told me, ‘No good can come of [asking] this,’” she says. “‘Potentially a lot of bad could happen’ for me professionally by raising any criticism that might dissuade people from getting vaccinated, because of course, ‘We know that vaccine works.’ This was the prevailing wisdom.”

Nonetheless, in 2004, Jackson and three colleagues set out to determine whether the mortality difference between the vaccinated and the unvaccinated might be caused by a phenomenon known as the “healthy user effect.” They hypothesized that on average, people who get vaccinated are simply healthier than those who don’t, and thus less liable to die over the short term. People who don’t get vaccinated may be bedridden or otherwise too sick to go get a shot. They may also be more likely to succumb to flu or any other illness, because they are generally older and sicker. To test their thesis, Jackson and her colleagues combed through eight years of medical data on more than 72,000 people 65 and older. They looked at who got flu shots and who didn’t. Then they examined which group’s members were more likely to die of any cause when it was not flu season.

Jackson’s findings showed that outside of flu season, the baseline risk of death among people who did not get vaccinated was approximately 60 percent higher than among those who did, lending support to the hypothesis that on average, healthy people chose to get the vaccine, while the “frail elderly” didn’t or couldn’t. In fact, the healthy-user effect explained the entire benefit that other researchers were attributing to flu vaccine, suggesting that the vaccine itself might not reduce mortality at all. Jackson’s papers “are beautiful,” says Lone Simonsen, who is a professor of global health at George Washington University, in Washington, D.C., and an internationally recognized expert in influenza and vaccine epidemiology. “They are classic studies in epidemiology, they are so carefully done.”

The results were also so unexpected that many experts simply refused to believe them. Jackson’s papers were turned down for publication in the top-ranked medical journals. One flu expert who reviewed her studies for the Journal of the American Medical Association wrote, “To accept these results would be to say that the earth is flat!” When the papers were finally published in 2006, in the less prominent International Journal of Epidemiology, they were largely ignored by doctors and public-health officials. “The answer I got,” says Jackson, “was not the right answer.”

... THE MOST vocal—and undoubtedly most vexing—critic of the gospel of flu vaccine is the Cochrane Collaboration’s Jefferson, who’s also an epidemiologist trained at the famed London School of Tropical Hygiene, and who, in Lisa Jackson’s view, makes other skeptics seem “moderate by comparison.” Among his fellow flu researchers, Jefferson’s outspokenness has made him something of a pariah. At a 2007 meeting on pandemic preparedness at a hotel in Bethesda, Maryland, Jefferson, who’d been invited to speak at the conference, was not greeted by any of the colleagues milling about the lobby. He ate his meals in the hotel restaurant alone, surrounded by scientists chatting amiably at other tables. He shrugs off such treatment. As a medical officer working for the United Nations in 1992, during the siege of Sarajevo, he and other peacekeepers were captured and held for more than a month by militiamen brandishing AK-47s and reeking of alcohol. Professional shunning seems trivial by comparison, he says.

“Tom Jefferson has taken a lot of heat just for saying, ‘Here’s the evidence: it’s not very good,’” says Majumdar. “The reaction has been so dogmatic and even hysterical that you’d think he was advocating stealing babies.” Yet while other flu researchers may not like what Jefferson has to say, they cannot ignore the fact that he knows the flu-vaccine literature better than anyone else on the planet. He leads an international team of researchers who have combed through hundreds of flu-vaccine studies. The vast majority of the studies were deeply flawed, says Jefferson. “Rubbish is not a scientific term, but I think it’s the term that applies.” Only four studies were properly designed to pin down the effectiveness of flu vaccine, he says, and two of those showed that it might be effective in certain groups of patients, such as school-age children with no underlying health issues like asthma. The other two showed equivocal results or no benefit.

... In the flu-vaccine world, Jefferson’s call for placebo-controlled studies is considered so radical that even some of his fellow skeptics oppose it.
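The healthy-user effect Jackson tested for is easy to see in a toy simulation. In the sketch below the vaccine does nothing at all, yet because frail people are both less likely to get the shot and more likely to die of any cause, the unvaccinated show a large excess in off-season mortality. All parameters are invented for illustration; they are not from Jackson's data.

```python
import random

random.seed(0)

N = 100_000
vacc_deaths = vacc_n = unvacc_deaths = unvacc_n = 0

for _ in range(N):
    frail = random.random() < 0.2          # 20% are "frail elderly"
    # Frail people are much less likely to make it in for a shot...
    p_vacc = 0.3 if frail else 0.7
    vaccinated = random.random() < p_vacc
    # ...and much more likely to die of any cause. Note the vaccine
    # itself has NO effect on these off-season death probabilities.
    p_death = 0.10 if frail else 0.02
    died = random.random() < p_death
    if vaccinated:
        vacc_n += 1
        vacc_deaths += died
    else:
        unvacc_n += 1
        unvacc_deaths += died

rate_v = vacc_deaths / vacc_n
rate_u = unvacc_deaths / unvacc_n
print(f"off-season mortality, vaccinated:   {rate_v:.3f}")
print(f"off-season mortality, unvaccinated: {rate_u:.3f}")
print(f"excess baseline risk in unvaccinated: {rate_u / rate_v - 1:.0%}")
```

A naive analyst comparing the two groups during flu season would attribute the entire gap to the vaccine; Jackson's insight was to measure the gap when no flu was circulating.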

Saturday, October 24, 2009

Eric Baum: What is Thought?

Last week we had AI researcher and former physicist Eric Baum here as our colloquium speaker. (See here for an 11 minute video of a similar, but shorter, talk he gave at the 2008 Singularity Summit.)

Here's what I wrote about Baum and his book What is Thought back in 2008:

My favorite book on AI is Eric Baum's What is Thought? (Google books version). Baum (former theoretical physicist retooled as computer scientist) notes that evolution has compressed a huge amount of information into the structure of our brains (and genes), a process that AI would have to somehow replicate. A very crude estimate of the amount of computational power used by nature in this process leads to a pessimistic prognosis for AI, even if one is willing to extrapolate Moore's Law well into the future. Most naive analyses of AI and computational power only ask what is required to simulate a human brain, but do not ask what is required to evolve one. I would guess that our best hope is to cheat by using what nature has already given us -- emulating the human brain as much as possible.

This perspective seems quite obvious now that I have kids -- their rate of learning about the world is clearly enhanced by pre-evolved capabilities. They're not generalized learning engines -- they're optimized to do things like recognize patterns (e.g., faces), use specific concepts (e.g., integers), communicate using language, etc.
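To make the "crude estimate" above concrete, here is a back-of-envelope version in code. Every number is an order-of-magnitude guess of mine (not Baum's), but the conclusion is insensitive to the details: even granting one fitness evaluation per machine operation, which is wildly generous, Moore's law needs on the order of a century to catch up to evolution's search.

```python
import math

# All figures are order-of-magnitude guesses for illustration only.
organisms_alive = 1e30   # organisms on Earth at any moment, mostly microbes
years_of_life   = 3.5e9  # time since life appeared on Earth
gens_per_year   = 1e3    # fast microbial turnover, one generation per ~9 hours

# Each "trial" is a full organism tested against the world by selection.
fitness_evaluations = organisms_alive * years_of_life * gens_per_year

ops_per_sec  = 1e18                  # an exaflop-class supercomputer
ops_per_year = ops_per_sec * 3.15e7  # seconds per year
shortfall    = fitness_evaluations / ops_per_year  # assume 1 op == 1 trial
doublings    = math.log2(shortfall)

print(f"trials run by evolution: ~{fitness_evaluations:.0e}")
print(f"Moore's-law doublings needed: ~{doublings:.0f} "
      f"(~{2 * doublings:.0f} years at one doubling per 2 years)")
```

And of course one operation per fitness evaluation is absurdly optimistic; simulating even a single bacterium's life costs far more than one op, which only lengthens the timeline.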

What is Thought?

In What Is Thought? Eric Baum proposes a computational explanation of thought. Just as Erwin Schrodinger in his classic 1944 work What Is Life? argued ten years before the discovery of DNA that life must be explainable at a fundamental level by physics and chemistry, Baum contends that the present-day inability of computer science to explain thought and meaning is no reason to doubt there can be such an explanation. Baum argues that the complexity of mind is the outcome of evolution, which has built thought processes that act unlike the standard algorithms of computer science and that to understand the mind we need to understand these thought processes and the evolutionary process that produced them in computational terms.

Baum proposes that underlying mind is a complex but compact program that exploits the underlying structure of the world. He argues further that the mind is essentially programmed by DNA. We learn more rapidly than computer scientists have so far been able to explain because the DNA code has programmed the mind to deal only with meaningful possibilities. Thus the mind understands by exploiting semantics, or meaning, for the purposes of computation; constraints are built in so that although there are myriad possibilities, only a few make sense. Evolution discovered corresponding subroutines or shortcuts to speed up its processes and to construct creatures whose survival depends on making the right choice quickly. Baum argues that the structure and nature of thought, meaning, sensation, and consciousness therefore arise naturally from the evolution of programs that exploit the compact structure of the world.

When I first looked at What is Thought? I was under the impression that Baum's meaning, underlying structure and compact program were defined in terms of algorithmic complexity. However, it's more complicated than that. While Nature is governed by an algorithmically simple program (the Standard Model Hamiltonian can, after all, be written down on a single sheet of paper), a useful evolved program has to run in a reasonable amount of time, under resource (memory, CPU) constraints that Nature itself does not face. Compressible does not imply tractable -- all of physics might reduce to a compact Theory of Everything, but it probably won't be very useful for designing jet airplanes.

Useful programs have to be efficient in many ways -- algorithmically and computationally. So it's not a tautology that Nature is very compressible, therefore there must exist compact (useful) programs that exploit this compressibility. It's important that there are many intermediate levels of compression (i.e., description -- as in quarks vs molecules vs bulk solids vs people), and computationally effective programs to deal with those levels. I'm not sure what measure is used in computer science to encompass both algorithmic and computational complexity. Baum discusses something called minimum description length, but it's not clear to me exactly how the requirement of effective means of computation is formalized. In the language of physicists, Baum's compact (useful) programs are like effective field theories incorporating the relevant degrees of freedom for a certain problem -- they are not only a compressed model of the phenomena, but also allow simple computations.

Evolution has, using a tremendous amount of computational power, found these programs, and our best hope for AI is to exploit their existence to speed our progress. If Baum is correct, the future may be machine learning guided by human Mechanical Turk workers.
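Returning to the minimum description length idea mentioned above: it captures the compression half of the story and can be made concrete with a two-part code, where you pay first for the model and then for the data encoded under that model. A toy Bernoulli version (my own illustration, not Baum's formalism):

```python
import math
import random

def bernoulli_mdl_bits(bits):
    """Two-part MDL code length: cost of the model (one fitted parameter)
    plus cost of the data encoded with that model. Lower = more regularity."""
    n, k = len(bits), sum(bits)
    p = k / n
    if p in (0.0, 1.0):
        data_bits = 0.0
    else:
        entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
        data_bits = n * entropy            # optimal code length under Bernoulli(p)
    param_bits = 0.5 * math.log2(n)        # standard MDL cost of one real parameter
    return param_bits + data_bits

random.seed(1)
structured = [1] * 900 + [0] * 100                    # strong regularity: 90% ones
noise = [random.randint(0, 1) for _ in range(1000)]   # no regularity to exploit

s_bits = bernoulli_mdl_bits(structured)
n_bits = bernoulli_mdl_bits(noise)
print(f"structured string: {s_bits:.0f} bits (raw length 1000)")
print(f"random string:     {n_bits:.0f} bits (raw length 1000)")
```

The structured string compresses to roughly half its raw length while the random one doesn't compress at all. Note this measures only compressibility; as argued above, it says nothing about whether the compressed model is computationally cheap to use, which is the part that seems harder to formalize.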

Baum has recently relocated to Berkeley to pursue a startup based on his ideas. (Ah, the excitement! I did the same in 2000 ...) His first project is to develop a world class Go program (no lack of ambition :-), with more practical applications down the road. Best of Luck!

Friday, October 23, 2009

Pessimism of the Intellect, Optimism of the Will

I often get questions about the slogan on my homepage:

Pessimism of the Intellect, Optimism of the Will.

Some recognize that it's a quote from Antonio Gramsci and mistakenly think it means I'm a Marxist.

In fact, the meaning of the slogan is quite simple.

Pessimism of the Intellect means, simply, be a scientist: see the world as it really is, not as you might like it to be. Try to identify and overcome hidden biases or prior assumptions. Always ask yourself: What assumption am I making? What if it is incorrect? How do I know what I know?

In many cases, the correct answer is: I don't know. Never be afraid to admit you don't know.

Optimism of the Will means, have the courage to attempt difficult things. Sometimes, Will can overcome the odds.

If people bring so much courage to this world the world has to kill them to break them, so of course it kills them. The world breaks everyone and afterward many are strong in the broken places. But those that will not break it kills. It kills the very good and the very gentle and the very brave impartially. If you are none of these you can be sure it will kill you too but there will be no special hurry. --- A Farewell to Arms, Ernest Hemingway

I'll fight them, I'll fight them until I die. --- The Old Man and the Sea, Ernest Hemingway

Thursday, October 22, 2009

Evolution, Design and the Fermi Paradox

What is the time scale for evolution of complex organisms such as ourselves? On Earth complex life evolved in about 5 billion years (5 Gyr), but one can make an argument that we were probably lucky and that the typical time scale T under similar circumstances is much longer.

There is an interesting coincidence at work: 5 Gyr is remarkably close to the 10 Gyr lifetime of main sequence stars (and to the 14 Gyr age of the universe). This is unexpected, as evolution proceeds by molecular processes and natural selection among complex organisms, whereas stellar lifetimes are determined by nuclear physics.

If T were much smaller than 5 Gyr then it would be improbable for evolution to have been so slow on Earth.

It seems more plausible that T is much larger than 5 Gyr, in which case we were lucky, in a sense I will explain. Inflationary cosmology predicts a very large universe (much larger than what is currently visible to us), so that complex life is likely to exist somewhere in the universe. Conditioning on our own existence (a use of the weak anthropic principle), we should not be surprised to find ourselves lucky -- the few Earth-like planets that manage to evolve life must do so before their suns die. Intelligent beings, while not likely to evolve on any particular Earth-like planet, are likely to observe an evolutionary history that took place over a fraction of 10 Gyr.

Why should T be so large? At present we are unable to make quantitative estimates for the rate of evolution from first principles. It is entirely possible that certain evolutionary steps were highly improbable, such as the appearance of the first self-replicating complex molecules. One can also imagine abstract fitness surfaces with local maxima that trap the system for exponentially long periods of time.

I would not be surprised to find that T is exponentially larger than 5 Gyr. Gödel went so far as to propose: "... a mathematical theorem to the effect that the formation within geological times of a human body by the laws of physics (or any other law of a similar nature) starting from a random distribution of the elementary particles and the field, is about as unlikely as the separation by chance of the atmosphere into its components."

The framework described above makes the following predictions:

1. The overwhelming majority of Earth-like planets are devoid of life, thereby resolving the Fermi Paradox.

2. Improved understanding of evolution will uncover highly improbable steps -- that is, improbable even over billions (or perhaps 10^100!) of years. The fact that life on Earth climbed these steps might suggest intelligent design or divine intervention, but is better explained by the anthropic principle.

See related post evolutionary time scales.

Note added: This idea came to me after reading some discussion of ID and the question of improbable steps in evolution. (Here improbable means, for example, that even in an Earth-size population the multiple simultaneous mutations required to jump across a particular fitness valley are unlikely to occur in 5 Gyr given the known mutation rate.) It occurred to me that under certain assumptions what might appear to be ID could actually be due to selection bias -- not taking into account the possibility that complex life is rare even on Earth-like planets. It turns out that the idea is not new -- it goes all the way back to Brandon Carter's 1983 paper which (I believe) coined the term "anthropic principle"! See John D. Barrow and Frank J. Tipler, The Anthropic Cosmological Principle, p. 557 for more. Also see this 1998 paper by Robin Hanson (thanks to a commenter for pointing it out). If one assumes many improbable required steps, one can deduce an upper bound on the remaining time over which favorable conditions can persist on Earth. Note it appears the remaining lifetime of the sun is less than 1 Gyr, so the coincidence is tighter than I suggested in the original post. As the number of required improbable steps increases, the likelihood that we would evolve just before time runs out (favorable conditions end) becomes very high -- this is quantified in the two references above.
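The hard-steps effect is easy to check numerically: draw several exponential waiting times, condition on their sum fitting inside the habitable window, and the conditional completion time piles up near the deadline. A toy Monte Carlo, with parameters chosen only so the rejection sampling finishes quickly (harder steps make success rarer and sharpen the clustering even further):

```python
import random

random.seed(2)

WINDOW  = 1.0   # habitable window (~5 Gyr, in these units)
N_STEPS = 5     # number of required "hard" evolutionary steps
MEAN    = 1.0   # each step alone expects to take the entire window

TRIALS = 500_000
successes = []
for _ in range(TRIALS):
    # Total time for all steps to occur in sequence on one planet.
    t = sum(random.expovariate(1 / MEAN) for _ in range(N_STEPS))
    if t < WINDOW:                 # life finished before conditions ended
        successes.append(t)

rate = len(successes) / TRIALS
avg = sum(successes) / len(successes)
print(f"fraction of planets where life finishes in time: {rate:.4f}")
print(f"mean completion time on those lucky planets:     {avg:.2f} of the window")
```

Success is rare (most planets stay dead, consistent with prediction 1 above), and on the lucky planets completion clusters in the last fifth of the window, exactly the "just before time runs out" behavior quantified by Carter and Hanson.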

The point which I have (still) not seen discussed much is that biologists need not be so defensive about improbable steps in evolution. I sense an almost reflexive -- i.e., prior-driven -- response to any claims of improbability. (On the other hand, perhaps biologists just know more about the details: the claim would be that the historical record does not resemble a typical one that would be generated by a chain of improbable events. I think this question requires further study.) But the main takeaway from this analysis is that improbability does not imply design or intervention.

Tuesday, October 20, 2009

Affirmative action: the numbers

Princeton sociologist Thomas Espenshade has a new book on elite college admissions. He is an author of an earlier study which quantified the advantages and disadvantages conferred by ethnic group at virtually all universities (with only a few notable exceptions such as Caltech). Already, some results from the analysis in the book have leaked out, and the numbers are not pretty -- see below.

US News: ... Translating the advantages into SAT scores, study author Thomas Espenshade, a Princeton sociologist, calculated that African-Americans who achieved 1150 scores on the two original SAT tests had the same chances of getting accepted to top private colleges in 1997 as whites who scored 1460s and Asians who scored perfect 1600s.

Espenshade found that when comparing applicants with similar grades, scores, athletic qualifications, and family history for seven elite private colleges and universities:

Whites were three times as likely to get fat envelopes as Asians.
Hispanics were twice as likely to win admission as whites.
African-Americans were at least five times as likely to be accepted as whites.

To clarify, 1150 is similar to the average at a typical state university, 1460 similar to the average at a top Ivy, and 1600 is obtained by only a few thousand high school seniors each year (note, this is post-1995 re-centering of the SAT). Cognitively, these are very different groups of kids.

Related posts: Asian-Americans hurt by affirmative action, the ugly truth, the price of admission, Asians at Berkeley, the new white flight.

See these slides for more results. The bottom slide below suggests that if race-blind admissions were enacted the percentage of Asian students at elite universities would jump from 24 percent to 39 percent. Note that 39 percent is similar to the current Asian populations at Caltech and Berkeley, two elite institutions with (roughly) race-blind admissions; the former due to meritocratic idealism, the latter thanks to Proposition 209.




Universe Splitter -- iPhone app


New! For your iPhone :-)

Thanks to a correspondent who runs a quant hedge fund and who once worked on phase transitions in the early universe ;-)

Universe Splitter

Tough decision? There's no need to choose — Split the universe and do both!

Scientists say that every quantum event plays out simultaneously in every possible way, with each possibility becoming real in a separate universe. You can now harness this powerful and mysterious effect right from your iPhone or iPod Touch!

How? Whenever you're faced with a choice — for example, whether to accept a job offer or to turn it down — just type both of these actions into Universe Splitter©, and press the button.

Universe Splitter© will immediately contact a laboratory in Geneva, Switzerland, and connect to a Quantis brand quantum device, which releases single photons into a partially-silvered mirror. Each photon will simultaneously bounce off the mirror and pass through it — but in separate universes.

Within seconds, Universe Splitter© will receive the experiment's result and tell you which of the two universes you're in, and therefore which action to take. Think of it — two entire universes, complete with every last planet and galaxy, and in one, a version of you who took the new job, and in the other, a version of you who didn't!
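The "protocol" here is simple enough to sketch: fetch one random bit, map it to one of the two actions. The toy below substitutes the operating system's entropy pool for the Quantis photon source, so it imitates only the interface, not the many-worlds metaphysics:

```python
import secrets

def split_universe(option_a: str, option_b: str) -> str:
    """Toy stand-in for the app: one random bit decides which branch 'you' see.

    The real app queries a hardware quantum RNG (the Quantis device mentioned
    above); secrets.randbits draws from the OS entropy pool instead, so no
    universes are harmed in running this function.
    """
    return option_a if secrets.randbits(1) else option_b

print(split_universe("accept the job", "turn it down"))
```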

Monday, October 19, 2009

Posner: How I became a Keynesian

Somehow I missed this! Thanks to a reader for pointing it out to me.

Posner was as captured by Chicago School nonsense as anyone else, but at least we learn that he can perform a Bayesian update (i.e., learn from reality) -- posteriors need not be wholly determined by priors :-)

Strangely, I don't see any discussion of this article on the Becker-Posner blog. How do Gary Becker and Robert Lucas feel about the recent apostasy of their colleague?

How I Became a Keynesian, by Richard Posner

... I had never thought to read The General Theory of Employment, Interest, and Money, despite my interest in economics.

... We have learned since September that the present generation of economists has not figured out how the economy works. The vast majority of them were blindsided by the housing bubble and the ensuing banking crisis; and misjudged the gravity of the economic downturn that resulted; and were perplexed by the inability of orthodox monetary policy administered by the Federal Reserve to prevent such a steep downturn; and could not agree on what, if anything, the government should do to halt it and put the economy on the road to recovery. By now a majority of economists are in general agreement with the Obama administration's exceedingly Keynesian strategy for digging the economy out of its deep hole.

... The dominant conception of economics today, and one that has guided my own academic work in the economics of law, is that economics is the study of rational choice. People are assumed to make rational decisions across the entire range of human choice, including but not limited to market transactions, by employing a form (usually truncated and informal) of cost-benefit analysis. The older view was that economics is the study of the economy, employing whatever assumptions seem realistic and whatever analytical methods come to hand. Keynes wanted to be realistic about decision-making rather than explore how far an economist could get by assuming that people really do base decisions on some approximation to cost-benefit analysis.

... It is an especially difficult read for present-day academic economists, because it is based on a conception of economics remote from theirs. This is what made the book seem "outdated" to Mankiw--and has made it, indeed, a largely unread classic. (Another very distinguished macroeconomist, Robert Lucas, writing a few years after Mankiw, dismissed The General Theory as "an ideological event.")

The General Theory is full of interesting psychological observations--the word "psychological" is ubiquitous--as when Keynes notes that "during a boom the popular estimation of [risk] is apt to become unusually and imprudently low," while during a bust the "animal spirits" of entrepreneurs droop. He uses such insights without trying to fit them into a model of rational decision-making.

An eclectic approach to economic behavior came naturally to Keynes, because he was not an academic economist in the modern sense. He had no degree in economics, and wrote extensively in other fields (such as probability theory--on which he wrote a treatise that does not mention economics). He combined a fellowship at Cambridge with extensive government service as an adviser and high-level civil servant, and was an active speculator, polemicist, and journalist. He lived in the company of writers and was an ardent balletomane.

... The third claim that I am calling foundational for Keynes's theory--that the business environment is marked by uncertainty in the sense of risk that cannot be calculated--now enters the picture. Savers do not direct how their savings will be used by entrepreneurs; entrepreneurs do, guided by the hope of making profits. But when an investment project will take years to complete before it begins to generate a profit, its prospects for success will be shadowed by all sorts of unpredictable contingencies, having to do with costs, consumer preferences, actions by competitors, government policy, and economic conditions generally. Skidelsky puts this well in his new book: "An unmanaged capitalist economy is inherently unstable. Neither profit expectations nor the rate of interest are solidly anchored in the underlying forces of productivity and thrift. They are driven by uncertain and fluctuating expectations about the future." Only what Keynes called "animal spirits," or the "urge to action," will persuade businessmen to embark on such a sea of uncertainty. "If human nature felt no temptation to take a chance, no satisfaction (profit apart) in constructing a factory, a railway, a mine or a farm, there might not be much investment merely as a result of cold calculation."

But however high-spirited a businessman may be, often the uncertainty of the business environment will make him reluctant to invest. His reluctance will be all the greater if savers are hesitant to part with their money because of their own uncertainties about future interest rates, default risks, and possible emergency needs for cash to pay off debts or to meet unexpected expenses. The greater the propensity to hoard, the higher the interest rate that a businessman will have to pay for the capital that he requires for investment. And since interest expense is greater the longer a loan is outstanding, a high interest rate will have an especially dampening effect on projects that, being intended to meet consumption needs beyond the immediate future, take a long time to complete.

... An ambitious public-works program can be a confidence builder. It shows that government means (to help) business. "The return of confidence," Keynes explains, "is the aspect of the slump which bankers and businessmen have been right in emphasizing, and which the economists who have put their faith in a ‘purely monetary' remedy have underestimated." In a possible gesture toward Roosevelt's first inaugural ("we have nothing to fear but fear itself"), Keynes remarks upon "the uncontrollable and disobedient psychology of the business world."

See also my talk (for physicists) on the financial crisis. Some related posts on Keynes. Even more on Keynes (12th Wrangler) here.

Sunday, October 18, 2009

Efficient Frontier and William Bernstein

Last week we had polymath William Bernstein (finance thinker, author, portfolio manager, former neurologist, and Berkeley PhD in chemistry) on campus for a seminar. I'd been a reader of his site Efficient Frontier for many years, and when I learned by chance that he lives in Oregon, I invited him to visit.

Here is an Amazon link to Bernstein's books -- ranging from practical investing advice to histories of global trade and of the birth of prosperity in the modern world.

Below is an excerpt from the preface of his new book.

... I have come to the sad conclusion that only a tiny minority will ever succeed in managing their money even tolerably well.

Successful investors need four abilities. First, they must possess an interest in the process. It is no different from carpentry, gardening, or parenting. If money management is not enjoyable, then a lousy job inevitably results, and, unfortunately, most people enjoy finance about as much as they do root canal work.

Second, investors need more than a bit of math horsepower, far beyond simple arithmetic and algebra, or even the ability to manipulate a spreadsheet. Mastering the basics of investment theory requires an understanding of the laws of probability and a working knowledge of statistics. Sadly, as one financial columnist explained to me more than a decade ago, fractions are a stretch for 90 percent of the population.

Third, investors need a firm grasp of financial history, from the South Sea Bubble to the Great Depression. Alas, as we shall soon see, this is something that even professionals have real trouble with.

Even if investors possess all three of these abilities, it will all be for naught if they do not have a fourth one: the emotional discipline to execute their planned strategy faithfully, come hell, high water, or the apparent end of capitalism as we know it. “Stay the course”: It sounds so easy when uttered at high tide. Unfortunately, when the water recedes, it is not. I expect no more than 10 percent of the population passes muster on each of the above counts. This suggests that as few as one person in ten thousand (10 percent to the fourth power) has the full skill set. Perhaps I am being overly pessimistic. After all, these four abilities may not be entirely independent: if someone is smart enough, it is also more likely he or she will be interested in finance and be driven to delve into financial history.

But even the most optimistic assumptions — increase the odds at any of the four steps to 30 percent and link them — suggest that no more than a few percent of the population is qualified to manage their own money. And even with the requisite skill set, more than a little moxie is involved. This last requirement — the ability to deploy what legendary investor Charley Ellis calls “the emotional game” — is completely independent of the other three; Wall Street is littered with the bones of those who knew just what to do, but could not bring themselves to do it.

It's a sobering thought that so few people make competent money managers, given that most Americans now manage their own retirement accounts. During dinner, we refined his estimate of the fraction of competent investors (taking correlations between the four abilities into account) to about 1 in 1000.
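Bernstein's arithmetic is easy to check in a few lines. The 10 and 30 percent figures are his; the conditional probability I use to reproduce the roughly 1-in-1000 dinner estimate is my own hypothetical choice, picked only to illustrate how positive correlation between the four abilities raises the joint probability:

```python
# Independence assumption: each of the four abilities (interest, math,
# financial history, emotional discipline) is held by ~10% of people.
p_each = 0.10
p_independent = p_each ** 4
print(f"independent: 1 in {1 / p_independent:,.0f}")   # 1 in 10,000

# Bernstein's optimistic variant: 30% odds at each of the four steps.
p_optimistic = 0.30 ** 4
print(f"optimistic:  {p_optimistic:.1%} of the population")  # 0.8%

# Hypothetical positive correlation: keep the 10% base rate for the first
# ability, but assume each remaining ability is ~21.5% likely *given* the
# others. That reproduces something close to the 1-in-1000 dinner figure.
p_correlated = 0.10 * 0.215 ** 3
print(f"correlated:  roughly 1 in {1 / p_correlated:,.0f}")
```

The point of the last step is qualitative, not the specific number: if the abilities cluster in the same people, the joint probability lands well above the naive product but still far below any single base rate.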

Although I had thought about this question in the past, and even identified the various factors, it hadn't occurred to me until dinner with Bernstein that this was a good reason to suggest finance as a career to someone who has the right interests (history, finance theory, markets -- relatively easily acquired, as these subjects are fascinating), personality factors (discipline, controlled risk taking, decisiveness -- not so easily acquired, but can be improved over time), and intelligence (not easily acquired, but perhaps the threshold is only around the 90th percentile). No one factor is particularly rare, but the combination certainly is. As long as the signal to noise ratio in a financial career is reasonable, a person with all these qualities can be confident of success. While only a few people will ever run a hedge fund, it would seem that almost everyone could benefit from help managing their money -- if the advisor has all the qualities listed above.

Thursday, October 15, 2009

Calvin Trillin on Wall Street smarts

Calvin Trillin, one of my favorite New Yorker writers and a keen observer of American society, writes in the Times. It's true that the quants built machines (mortgage finance) that were too complex for their masters to operate. Related posts: The new new meme, The best and brightest, The bubble algorithm ...

(This may sound crazy, but does Trillin read this blog?)

NYTimes: Wall Street Smarts

By CALVIN TRILLIN

“IF you really want to know why the financial system nearly collapsed in the fall of 2008, I can tell you in one simple sentence.”

The statement came from a man sitting three or four stools away from me in a sparsely populated Midtown bar, where I was waiting for a friend. “But I have to buy you a drink to hear it?” I asked.

“Absolutely not,” he said. “I can buy my own drinks. My 401(k) is intact. I got out of the market 8 or 10 years ago, when I saw what was happening.”

He did indeed look capable of buying his own drinks — one of which, a dry martini, straight up, was on the bar in front of him. He was a well-preserved, gray-haired man of about retirement age, dressed in the same sort of clothes he must have worn on some Ivy League campus in the late ’50s or early ’60s — a tweed jacket, gray pants, a blue button-down shirt and a club tie that, seen from a distance, seemed adorned with tiny brussels sprouts.

“O.K.,” I said. “Let’s hear it.”

“The financial system nearly collapsed,” he said, “because smart guys had started working on Wall Street.” He took a sip of his martini, and stared straight at the row of bottles behind the bar, as if the conversation was now over.

“But weren’t there smart guys on Wall Street in the first place?” I asked.

He looked at me the way a mathematics teacher might look at a child who, despite heroic efforts by the teacher, seemed incapable of learning the most rudimentary principles of long division. “You are either a lot younger than you look or you don’t have much of a memory,” he said. “One of the speakers at my 25th reunion said that, according to a survey he had done of those attending, income was now precisely in inverse proportion to academic standing in the class, and that was partly because everyone in the lower third of the class had become a Wall Street millionaire.”

I reflected on my own college class, of roughly the same era. The top student had been appointed a federal appeals court judge — earning, by Wall Street standards, tip money. A lot of the people with similarly impressive academic records became professors. I could picture the future titans of Wall Street dozing in the back rows of some gut course like Geology 101, popularly known as Rocks for Jocks.

“That actually sounds more or less accurate,” I said.

“Of course it’s accurate,” he said. “Don’t get me wrong: the guys from the lower third of the class who went to Wall Street had a lot of nice qualities. Most of them were pleasant enough. They made a good impression. And now we realize that by the standards that came later, they weren’t really greedy. They just wanted a nice house in Greenwich and maybe a sailboat. A lot of them were from families that had always been on Wall Street, so they were accustomed to nice houses in Greenwich. They didn’t feel the need to leverage the entire business so they could make the sort of money that easily supports the second oceangoing yacht.”

“So what happened?”

“I told you what happened. Smart guys started going to Wall Street.”

“Why?”

“I thought you’d never ask,” he said, making a practiced gesture with his eyebrows that caused the bartender to get started mixing another martini.

“Two things happened. One is that the amount of money that could be made on Wall Street with hedge fund and private equity operations became just mind-blowing. At the same time, college was getting so expensive that people from reasonably prosperous families were graduating with huge debts. So even the smart guys went to Wall Street, maybe telling themselves that in a few years they’d have so much money they could then become professors or legal-services lawyers or whatever they’d wanted to be in the first place. That’s when you started reading stories about the percentage of the graduating class of Harvard College who planned to go into the financial industry or go to business school so they could then go into the financial industry. That’s when you started reading about these geniuses from M.I.T. and Caltech who instead of going to graduate school in physics went to Wall Street to calculate arbitrage odds.”

“But you still haven’t told me how that brought on the financial crisis.”

“Did you ever hear the word ‘derivatives’?” he said. “Do you think our guys could have invented, say, credit default swaps? Give me a break! They couldn’t have done the math.”

“Why do I get the feeling that there’s one more step in this scenario?” I said.

“Because there is,” he said. “When the smart guys started this business of securitizing things that didn’t even exist in the first place, who was running the firms they worked for? Our guys! The lower third of the class! Guys who didn’t have the foggiest notion of what a credit default swap was. All our guys knew was that they were getting disgustingly rich, and they had gotten to like that. All of that easy money had eaten away at their sense of enoughness.”

“So having smart guys there almost caused Wall Street to collapse.”

“You got it,” he said. “It took you awhile, but you got it.”

The theory sounded too simple to be true, but right offhand I couldn’t find any flaws in it. I found myself contemplating the sort of havoc a horde of smart guys could wreak in other industries. I saw those industries falling one by one, done in by superior intelligence. “I think I need a drink,” I said.

He nodded at my glass and made another one of those eyebrow gestures to the bartender. “Please,” he said. “Allow me.”

I highly recommend Trillin's book Remembering Denny, a meditation on life, success, complexity, the unknowability of others, and fate. Also see Amazon reviews.

Calvin Trillin's "Remembering Denny" is a Cheeveresque rumination on the unfulfilled potential of Trillin's Yale classmate, Denny Hansen. While at Yale, Hansen was so highly thought of that he was profiled in LIFE magazine and his classmates used to kid each other about which cabinet position they'd fill once Hansen had been elected President. After Yale, however, Hansen failed to live up to the high expectations everyone--friends, family, teachers, coaches--had for him. Trillin's book is a delicate examination of what that meant, both for Denny and for his constellation of friends and well-wishers.

Denny doesn't come alive as vividly as might be hoped here, but Trillin does an outstanding job of sketching this young man's life in terms of a larger picture about America. In a country where success on every level is much prized, Trillin subtly but thoroughly plumbs the reasons why Denny didn't succeed--at least not to the extent everyone thought he would. This uncharacteristically somber book is absorbing and thought-provoking, even if it doesn't quite reach the goals Trillin seems to have set for himself in the beginning chapters.

Signals from the market in Guangzhou

This NYTimes article, reported from the world's largest export tradeshow in Guangzhou, China, illustrates that

1. we're past the bottom (at least globally) for the recession

2. the dollar is in trouble, and everybody knows it

3. low-skill labor in China still has little pricing power; manufacturing outside of China is in for more tough times.

NYTimes: Throngs of buyers from around the world swarmed through the world’s largest trade show at its opening Thursday, underscoring how China’s combination of a cheap currency and low wages is producing a resurgence of exports this autumn.

As wholesalers and distributors from dozens of countries began snatching brochures and exchanging business cards at booths in exhibition halls with five times the floor area of the Empire State Building, Chinese exporters said they were convinced that sales were finally rebounding.

In one of the most telling signs of the recovery, container shipping prices from China to the West have jumped 50 percent in the past two months as exporters have scrambled for space on suddenly crowded decks. Chinese factories are hiring temporary workers to cope with a last-minute surge in orders, many of them from emerging markets but some from Europe and even from the economically weakened United States.

“We are very confident,” said Liu En Tian, the marketing manager of the Huasheng Jiangquan Group, a manufacturer of ceramic tiles in Linyi City. “Already the buyers who are coming this morning are more than last year.”

Like those of many Chinese manufacturers, his company’s exports fell by nearly half last winter because of the global economic slowdown, but they are now down only 20 percent from their peak more than a year ago because of a surge in sales to South America and the Middle East. “The economy will get better very soon” around the world, Mr. Liu said.

China’s many policies to help exporters, from tax breaks to currency market intervention, have relieved unemployment in China — but at the expense of contributing to it in other countries, and that is starting to fan trade tensions. President Barack Obama imposed tariffs starting at 35 percent on tires from China last month, and anti-dumping and anti-subsidy cases against China are piling up in front of trade tribunals around the world.

But as exporters here reviewed their orders for the coming months, they described a consistent pattern: Sales to emerging markets are recovering rapidly, demand from Europe is starting to rebound as the Chinese currency falls against the euro, and buying interest from the United States remains fairly weak.

The dollar has fallen steeply against the euro, the yen and most other currencies over the past two weeks, with a broad index of the U.S. currency dropping Thursday to its lowest level in 14 months. But the Chinese government has intervened heavily in currency markets to make sure that its currency, the yuan, falls with the dollar so that Chinese goods can maintain their competitive edge.

The result has been a steep slide in the value of the yuan against the euro, the yen, the Australian dollar and many other currencies that is making it cheaper for businesses from Helsinki to Sydney to order from China. The yuan has fallen 16 percent against the euro since early March and 31 percent against the Australian dollar.

“We export a lot to Australia,” said Linda Zhang, a sales manager at Hebei Wanlong Steel Structure, a maker of prefabricated housing, adding that demand from there “is coming back.”

Kimmo Tarkkonen, the chairman of SRS Fenno-El, a distributor of lamps and space heaters based on the outskirts of Helsinki, said that he was signing all his contracts with Chinese suppliers in dollars after concluding that the dollar would continue to fall against the euro, making his purchases even cheaper.

“We’d rather take the risk,” he said, adding that the dollar “is declining all the time, so the risk is minimal.”

Chinese exporters’ representatives at the fair said that they had not been able to raise prices for European customers as the euro rose because there was so much competition from other Chinese companies.

“Our price depends on our cost,” not the euro, said Nicola Hou, a marketer at Meizhou Koway Electronics, a clock radio manufacturer in Meizhou City in southeastern China.

Some Chinese companies, in another effort to remain competitive, have even been cutting prices after wages fell during the recent economic downturn — a decline from which they have not fully recovered. Most of the country has not yet seen a return of the labor shortages that started appearing shortly before the global economic downturn began.

“It’s easy to find workers — China has too much labor,” said Helen Chen, general manager of Yuyao Panasia International Trading in Yuyao in east-central China.

Transport costs are a small fraction of labor costs, but they are now rebounding from a vertiginous plunge over the winter. ...

Sunday, October 11, 2009

Skidelsky at LSE

I highly recommend this brilliant LSE lecture by Robert Skidelsky. Among the topics covered: risk versus uncertainty; Chicago school as the academic scribblers responsible for recent disastrous ideological capture; rational expectations, efficient markets and all that nonsense; economics as a science (not) and the role of mathematics; fiscal versus monetary stimulus; Glass-Steagall; Capital ascendant over Labor and Government: the renewed relevance of Marx; the role of globalization and the neoliberal agenda.

Before attacking me as a socialist pinko (I am not), listen to the talk or at least read the earlier post linked to below.


Keynes and the Crisis of Capitalism
(podcast and video available)

Date: Wednesday 7 October 2009
Speaker: Professor Lord Skidelsky

Robert Skidelsky is Emeritus Professor of Political Economy at the University of Warwick. His three-volume biography of the economist John Maynard Keynes (1983, 1992, 2000) received numerous prizes, including the Lionel Gelber Prize for International Relations and the Council on Foreign Relations Prize for International Relations. He is the author of The World After Communism (1995) (American edition called The Road from Serfdom). He was made a life peer in 1991, and was elected Fellow of the British Academy in 1994.

This event celebrates his latest book, Keynes: The Return of the Master.

For more from Skidelsky on Keynes and the current crisis, see this earlier post.

Keynes: ... the ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from intellectual influences, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.

Saturday, October 10, 2009

Spooks drowning in data

Almost every technical endeavor, from finance to high energy physics to biology to internet security to spycraft, is either already or soon to be drowning in Big Data. This is an inevitable consequence of exponential, Moore's-Law-like growth in bandwidth, processing power, and storage, combined with improved "sensing" capability. The challenge is extracting meaning from all that data.

My impression is that the limiting factor at the moment is the human brainpower necessary to understand the idiosyncrasies of the particular problem and, simultaneously, to develop the appropriate algorithms. There are simply not enough people around who are good at this: it's not just a matter of algorithms, you need insight into the specific situation. Of equal importance, the (usually non-technical) decision makers who have to act on the data need some rough grasp of the strengths and limitations of the methods, so that they don't have to treat the results as coming from a black box.

To give you my little example of big data, on my desk (in Oakland, not in Eugene) I have stacks of terabyte drives with copies of essentially every Windows executable (program that runs on a flavor of Windows) that has appeared on the web in the past few years (about 5 percent of this is malware; also stored in our data is what each executable does once it's installed). Gathering this data was only modestly hard; analyzing it in a meaningful way is a lot harder!
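As a toy illustration of why gathering is the easy part: the first, almost mechanical step with a corpus like this is content-hash bookkeeping, deduplicating binaries by SHA-256 before any real analysis starts. The sketch below is my own illustration, not our actual pipeline; the function names and directory layout are hypothetical:

```python
import hashlib
import os
from collections import defaultdict

def hash_file(path, chunk_size=1 << 20):
    """SHA-256 of a file, read in 1 MB chunks so large binaries never
    have to fit in memory all at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def dedup_corpus(root):
    """Walk the tree under `root` and bucket every file by content hash.
    Returns a dict: hash -> list of paths with identical bytes."""
    buckets = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            buckets[hash_file(path)].append(path)
    return buckets
```

This kind of bookkeeping scales linearly with the data; the hard part -- deciding which of the unique binaries are malware and what they do when run -- is exactly what it doesn't touch.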

NY Review of Books: On a remote edge of Utah's dry and arid high desert, where temperatures often zoom past 100 degrees, hard-hatted construction workers with top-secret clearances are preparing to build what may become America's equivalent of Jorge Luis Borges's "Library of Babel," a place where the collection of information is both infinite and at the same time monstrous, where the entire world's knowledge is stored, but not a single word is understood. At a million square feet, the mammoth $2 billion structure will be one-third larger than the US Capitol and will use the same amount of energy as every house in Salt Lake City combined.

Unlike Borges's "labyrinth of letters," this library expects few visitors. It's being built by the ultra-secret National Security Agency—which is primarily responsible for "signals intelligence," the collection and analysis of various forms of communication—to house trillions of phone calls, e-mail messages, and data trails: Web searches, parking receipts, bookstore visits, and other digital "pocket litter." Lacking adequate space and power at its city-sized Fort Meade, Maryland, headquarters, the NSA is also completing work on another data archive, this one in San Antonio, Texas, which will be nearly the size of the Alamodome.

Just how much information will be stored in these windowless cybertemples? A clue comes from a recent report prepared by the MITRE Corporation, a Pentagon think tank. "As the sensors associated with the various surveillance missions improve," says the report, referring to a variety of technical collection methods, "the data volumes are increasing with a projection that sensor data volume could potentially increase to the level of Yottabytes (10^24 Bytes) by 2015."[1] Roughly equal to about a septillion (1,000,000,000,000,000,000,000,000) pages of text, numbers beyond Yottabytes haven't yet been named. Once vacuumed up and stored in these near-infinite "libraries," the data are then analyzed by powerful infoweapons, supercomputers running complex algorithmic programs, to determine who among us may be—or may one day become—a terrorist. In the NSA's world of automated surveillance on steroids, every bit has a history and every keystroke tells a story.

... Where does all this leave us? Aid concludes that the biggest problem facing the agency is not the fact that it's drowning in untranslated, indecipherable, and mostly unusable data, problems that the troubled new modernization plan, Turbulence, is supposed to eventually fix. "These problems may, in fact, be the tip of the iceberg," he writes. Instead, what the agency needs most, Aid says, is more power. But the type of power to which he is referring is the kind that comes from electrical substations, not statutes. "As strange as it may sound," he writes, "one of the most urgent problems facing NSA is a severe shortage of electrical power." With supercomputers measured by the acre and estimated $70 million annual electricity bills for its headquarters, the agency has begun browning out, which is the reason for locating its new data centers in Utah and Texas. And as it pleads for more money to construct newer and bigger power generators, Aid notes, Congress is balking.

The issue is critical because at the NSA, electrical power is political power. In its top-secret world, the coin of the realm is the kilowatt. More electrical power ensures bigger data centers. Bigger data centers, in turn, generate a need for more access to phone calls and e-mail and, conversely, less privacy. The more data that comes in, the more reports flow out. And the more reports that flow out, the more political power for the agency.

Rather than give the NSA more money for more power—electrical and political—some have instead suggested just pulling the plug. "NSA can point to things they have obtained that have been useful," Aid quotes former senior State Department official Herbert Levin, a longtime customer of the agency, "but whether they're worth the billions that are spent, is a genuine question in my mind."

Based on the NSA's history of often being on the wrong end of a surprise and a tendency to mistakenly get the country into, rather than out of, wars, it seems to have a rather disastrous cost-benefit ratio. Were it a corporation, it would likely have gone belly-up years ago. The September 11 attacks are a case in point. For more than a year and a half the NSA was eavesdropping on two of the lead hijackers, knowing they had been sent by bin Laden, while they were in the US preparing for the attacks. The terrorists even chose as their command center a motel in Laurel, Maryland, almost within eyesight of the director's office. Yet the agency never once sought an easy-to-obtain FISA warrant to pinpoint their locations, or even informed the CIA or FBI of their presence.

But pulling the plug, or even allowing the lights to dim, seems unlikely given President Obama's hawkish policies in Afghanistan. However, if the war there turns out to be the train wreck many predict, then Obama may decide to take a much closer look at the spy world's most lavish spender. It is a prospect that has some in the Library of Babel very nervous. "It was a great ride while it lasted," said one.

Friday, October 09, 2009

A Nobel for Barry?

Huh? In a few years he might have actually earned one. This just demonstrates the outright political biases of the Nobel Committee. The Nobel citation reads like this (without actually mentioning GWB): "Bush era bad, Obama good. Hope good."

Note I speak as an Obama supporter!

Will pundits on the left have the guts to point out that the emperor has no clothes? This kind of outcome just reinforces the paranoid fantasies of the far-right: that Obama's Harvard magna is fake, that the World Government has been grooming him for leadership since his student days, that Bill Ayers wrote Dreams From My Father, etc.

Speaking more broadly, it seems to me that our obsession with prizes (an offshoot of winner-take-all culture) is unhealthy, and that prizes these days are less and less correlated with actual achievement.

While my faith in the Nobel process is shaken (although commenters have already pointed out the Kissinger prize and of course there is always Modigliani-Miller ;-), my confidence in Obama is not -- he reacted properly.

Mr. Obama said he was "surprised and deeply humbled" by the committee’s decision, ... he said he would accept it as “a call to action.”

“To be honest,” the president said, “I do not feel that I deserve to be in the company of so many of the transformative figures who have been honored by this prize, men and women who’ve inspired me and inspired the entire world through their courageous pursuit of peace.”

Someone just pointed out to me that this opens the door for a string theorist to win the physics prize ;-)

Wednesday, October 07, 2009

Schrödinger's virus



If the creature above (a tardigrade) can be placed in a superposition state, will you accept that you probably can be as well? And once you admit this, will you accept that you probably actually DO exist in a superposition state already?
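The kind of state at issue is easy to write down in standard quantum notation. This is a textbook sketch rather than anything from the articles below, and it assumes for simplicity that the decay probability has reached one half, so the two amplitudes are equal:

$$
|\psi\rangle \;=\; \frac{1}{\sqrt{2}}\Big(\, |\text{undecayed}\rangle \otimes |\text{alive}\rangle \;+\; |\text{decayed}\rangle \otimes |\text{dead}\rangle \,\Big)
$$

Neither the atom nor the creature has a definite state on its own; only the entangled joint state is defined. Once this entanglement spreads into the environment (decoherence), the two branches evolve effectively independently, which is the many worlds picture in miniature.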

It may be disturbing to learn that we live in a huge quantum multiverse, but was it not also disturbing for Galileo's contemporaries to learn that we live on a giant rotating sphere, hurtling through space at 30 kilometers per second? E pur si muove!

Related posts: Many Worlds: A brief guide for the perplexed, Are you gork?, Label: many worlds

Economist: ONE of the most famous unperformed experiments in science is Schrödinger’s cat. In 1935 Erwin Schrödinger, who was one of the pioneers of quantum mechanics, imagined putting a cat, a flask of Prussic acid, a radioactive atom, a Geiger counter, an electric relay and a hammer in a sealed box. If the atom decays, the Geiger counter detects the radiation and sends a signal that trips the relay, which releases the hammer, which smashes the flask and poisons the cat.

The point of the experiment is that radioactive decay is a quantum process. The chance of the atom decaying in any given period is known. Whether it has actually decayed (and thus whether the cat is alive or dead) is not—at least until the box is opened. The animal exists, in the argot of the subject, in a “superposition” in which it is both alive and dead at the same time.

Schrödinger’s intention was to illuminate the paradoxes of the quantum world. But superposition (the existence of a thing in two or more quantum states simultaneously) is real and is, for example, the basis of quantum computing. A pair of researchers at the Max Planck Institute for Quantum Optics in Garching, Germany, now propose to do what Schrödinger could not, and put a living organism into a state of quantum superposition.

The organism Ignacio Cirac and Oriol Romero-Isart have in mind is the flu virus. [arxiv] Pedants might object that viruses are not truly alive, but that is a philosophical rather than a naturalistic argument, for they have genes and are capable of reproduction—a capability they lose if they are damaged. The reason for choosing a virus is that it is small. Actual superposition (as opposed to the cat-in-a-box sort) is easiest with small objects, for which there are fewer pathways along which the superposition can break down. Physicists have already put photons, electrons, atoms and even entire molecules into such a state and measured the outcome. In the view of Dr Cirac and Dr Romero-Isart, a virus is just a particularly large molecule, so existing techniques should work on it.

The other thing that helps maintain superposition is low temperature. The less something jiggles about because of heat-induced vibration, the longer it can remain superposed. Dr Cirac and Dr Romero-Isart therefore propose putting the virus inside a microscopic cavity and cooling it down to its state of lowest energy (ground state, in physics parlance) using a piece of apparatus known as a laser trap. This ingenious technique—which won its inventors, one of whom was Steven Chu, now America’s energy secretary, a Nobel prize—works by bombarding an object with laser light at a frequency just below that which it would readily absorb and re-emit if it were stationary. This slows down the movement, and hence the temperature, of its atoms to a fraction of a degree above absolute zero.

Once that is done, another laser pulse will jostle the virus from its ground state into an excited state, just as a single atom is excited by moving one of its electrons from a lower to a higher orbital. By properly applying this pulse, Dr Cirac believes it will be possible to leave the virus in a superposition of the ground and excited states.

For that to work, however, the virus will need to have certain physical properties. It will have to be an insulator and to be transparent to the relevant laser light. And it will have to be able to survive in a vacuum. Such viruses do exist. The influenza virus is one example. Its resilience is legendary. It can survive exposure to a vacuum, and it seems to be an insulator—which is why the researchers have chosen it. And if the experiment works on a virus, they hope to move on to something that is indisputably alive: a tardigrade.

Tardigrades are tiny but resilient creatures. They can survive in vacuums and at very low temperatures. And, although the difference between ground state and an excited state is not quite the difference between life and death, Schrödinger would no doubt have been amused that his 70-year-old jeu d’esprit has provoked such an earnest following.
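The superposition at the heart of the thought experiment can be illustrated with a toy two-state system. This is a sketch for illustration only; the state labels are mine and have nothing to do with the actual proposed experiment:

```python
import numpy as np

# Toy two-state system standing in for "not decayed" vs. "decayed"
# (and hence "alive" vs. "dead" in the thought experiment).

ground = np.array([1.0, 0.0])   # |not decayed>  ->  cat alive
excited = np.array([0.0, 1.0])  # |decayed>      ->  cat dead

# An equal superposition of the two basis states
psi = (ground + excited) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes
p_alive, p_dead = np.abs(psi) ** 2
print(p_alive, p_dead)  # 0.5 each
```

Until a measurement is made, neither outcome is singled out; the state assigns equal weight to both, which is all the "both alive and dead at the same time" language really means.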

Tuesday, October 06, 2009

More Brainpower and Globalization

The Chronicle of Higher Education is running a series of articles on the competitiveness of US higher education versus Asia; see here and related links (may require subscription). See also earlier posts Meritocracy in China , Brainpower and Globalization.

This describes the steps Tsinghua University took to recruit Turing Award winner Andrew Yao, going as far as effectively creating an entire department of theoretical computer science. (Interestingly, Yao first earned a PhD in physics before switching to computer science.)

Andrew Chi-Chih Yao's trajectory suggests a genius's quick ascent to success. Born in Shanghai, he studied hard, earned two Ph.D.'s from American universities, and at age 35 was a professor of computer science at Stanford University. By 2000, when he received the A.M. Turing Award, one of the most prestigious prizes in his field, younger computer scientists were memorizing a principle bearing his name.

But when Tsinghua University invited him to return to China to lead a new, generously financed institute in 2004, he jumped at the opportunity. "I was very excited," he recalls.

In creating the Institute for Theoretical Computer Science, Mr. Yao says, he had free rein from Tsinghua's administrators to determine everything from research topics to personnel.

Once in Beijing, he threw himself into the project, emerging with an institute that is today one of the university's jewels and is increasingly known outside China.

The way Mr. Yao's institute was created illustrates a central element of China's higher-education strategy. When the government decides it wants to do something, it does it, and fast.

Theoretical computer science—the abstract, intensely mathematical subfield that is Mr. Yao's specialty—is poorly financed in the United States. It wasn't originally a target area for China, either. But China has an approach different from that found in the United States: a desire to build outstanding institutions by attracting the leaders in a field—any field.

Mr. Yao was first approached about joining Tsinghua's faculty in 2003 by Chen Ning Yang, a Nobel Prize-winning physicist, who had himself recently begun lecturing at the university.

Through that and later discussions, Mr. Yao came to understand that the university's officials were more interested in hiring world-class talent than they were in building up specific areas.

"They want to catch up in a global way, and therefore it doesn't matter where they get started," Mr. Yao explains.

That hunger translates into money for leading Western-trained scientists willing to relocate. "China is a very exciting place for science and engineering for people who have vision," he says.

'Yao's Class'

But while the chance to build a world-class institute from scratch required little consideration, how exactly to do it in a country with a developing higher-education system was a difficult question to answer.

Mr. Yao recruited his first class of graduate students, only to discover that many of them lacked basic skills. If he wanted quality applicants, he realized, he would have to train them himself. So he established an undergraduate program within the institute, selecting about 100 students each year and then identifying the most talented ones—a group now known as Yao's Class—for intensive instruction.

The fact that he had the freedom to do so reflects another element of China's higher-education strategy, in which a select number of elite universities are allowed to flourish. ...

Friday, October 02, 2009

Baidu, baby, Baidu

Although Google sits astride the English (and Indo-European)-language Web like a colossus, its market share in Chinese-language search is less than 25 percent and hasn't grown at all in recent years, even under the leadership of celebrity hire Kai Fu Lee (who recently left Google to run his own venture fund). Search in China is dominated by Baidu, which appears to be Google's only serious rival in any language (eat your heart out, Quaero and Agence de l'innovation industrielle :-)

Given that China has the largest number of internet users (over 300 million and still growing very fast) and generates more search queries than the US, this is an interesting strategic development.

For a discussion of Baidu by founder Robin Li, watch this talk. The first 40 minutes or so will be old hat to any startup veteran, but the last 20 minutes are more substantive -- Li makes some interesting comments about the future of innovation in China. He notes that the Chinese-language search index triples in size each year, whereas in English and other languages it grows only about 50% per year.

Here's a goofy idea for a sci-fi story: a conversation or encounter between two big early AIs, one which evolved out of Google and its Indo-European corpus of data plus "Mechanical Turk" / user input, and the other which evolved from Baidu and its primarily Sinic corpus and input. Would they be different in some fundamental way?

Thursday, October 01, 2009

Supermassive black holes and the entropy of the universe

New Scientist has an article about a recent paper by two Australian researchers (http://www.arxiv.org/abs/0909.3983), which contains detailed estimates of the entropy of various components of the universe (black holes, neutrinos, photons, etc.). This paper is related to some work I did with Frampton, Kephart and Reeb: What is the entropy of the universe?. (Discussion on Cosmic Variance.)

Our work did not focus on the numerical values of the various contributions to the entropy (we made some simple estimates), but rather on the physical meaning of this entropy -- in particular, that of black holes; see excerpt below.

New Scientist: Mammoth black holes push universe to its doom

30 September 2009 by Rachel Courtland

THE mammoth black holes at the centre of most galaxies may be pushing the universe closer to its final fade-out. And it is all down to the raging disorder within those dark powerhouses.

Disorder is measured by a quantity called entropy, something which has been on the rise ever since the big bang. Chas Egan and Charles Lineweaver of the Australian National University in Canberra used the latest astrophysical data to calculate the total entropy of everything in the universe, from gas to gravitons. It turns out that supermassive black holes are by far the biggest contributors to the universe's entropy. Entropy reflects the number of possible arrangements of matter and energy in an object. The number of different configurations of matter a black hole could contain is staggering because its internal state is completely mysterious.

Egan and Lineweaver found that everything within the observable universe contains about 10^104 units of entropy (joules per Kelvin), a factor of 10 to 1000 times higher than previous estimates that did not include some of the biggest known black holes (www.arxiv.org/abs/0909.3983, submitted to The Astrophysical Journal).

If entropy were ever to reach a maximum level, that would mean the heat death of the universe. In this scenario no energy can flow, because everything is the same temperature and so life and other processes become impossible. "Our results suggest we're a little further along that road than previously thought," Egan says.

But although black holes do boost the universe's total entropy, it is not clear whether they will hasten its heat death. Supermassive black holes don't contribute much to the flows of heat that even out temperature throughout the universe, says physicist Stephen Hsu at the University of Oregon in Eugene.

It's true that these black holes will slowly evaporate by releasing Hawking radiation, particles created near the boundary of the black hole. And this radiation could move the universe towards heat death.

However, it will take some 10^102 years for a supermassive black hole to evaporate. "The entropy inside those black holes is effectively locked up in there forever," Hsu says. So we may reach a state approaching heat death long before, as stars burn out and their matter decays.
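A back-of-envelope check of the magnitudes quoted above, using the standard Bekenstein-Hawking entropy and the standard Hawking evaporation-time estimate. This is a rough sketch with rounded constants; it ignores accretion, absorption of CMB photons, greybody factors, and the number of emitted species, and the chosen mass (10^9 solar masses) is just a representative supermassive black hole:

```python
import math

# Bekenstein-Hawking entropy: S = 4 pi G k_B M^2 / (hbar c)
# Hawking evaporation time:   t ~ 5120 pi G^2 M^3 / (hbar c^4)

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34    # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
kB = 1.381e-23      # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg
year = 3.156e7      # seconds per year

M = 1e9 * M_sun     # a representative supermassive black hole

S = 4 * math.pi * G * kB * M**2 / (hbar * c)    # entropy in J/K
S_dimensionless = S / kB                         # entropy in units of k_B
t_evap_years = 5120 * math.pi * G**2 * M**3 / (hbar * c**4) / year

print(f"S ~ 10^{math.log10(S_dimensionless):.0f} k_B")
print(f"t_evap ~ 10^{math.log10(t_evap_years):.0f} years")
```

A single such hole carries roughly 10^95 units of entropy, so summing over the supermassive black holes in the ~10^10 galaxies of the observable universe lands in the neighborhood of the Egan-Lineweaver total of 10^104, and the evaporation timescale comes out absurdly long, consistent with the "locked up forever" characterization above.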

The large result obtained by Egan and Lineweaver for the entropy of the universe is primarily due to supermassive black holes. How do we interpret that entropy? Here is an excerpt from our paper (excuse the latex).

Note the entropy used in this paper describes the uncertainty in the precise quantum state of a system. If the system is macroscopic the full quantum state is only accessible to a kind of ``super-observer'' who is unaffected by decoherence \cite{decoherence}. Individual observers within the system who have limited experimental capabilities can only detect particular decoherent outcomes. These outcomes arise, e.g., from an effective density matrix that results from tracing over degrees of freedom which are out of the experimenter's control (i.e., which form the ``environment''). In \cite{BID} the experimental capabilities necessary to distinguish decoherent branches of the wavefunction, or, equivalently, the precise quantum state of Hawking radiation from a black hole, are discussed. It is shown that a super-observer would either need (at minimum) the capability of making very precise measurements of accuracy $\exp(- M^2 )$ (see also the proposal of Maldacena \cite{eternal} for a specific measurement to determine whether black hole evaporation is unitary), or alternatively the capability of engineering very precise non-local operators, which measure a large fraction of the Hawking radiation at once, including correlations (i.e., as opposed to ordinary particle detectors, which only measure Fock state occupation numbers and are not sensitive to phase information).

An observer who lacks the capabilities described in the previous paragraph would be unable to distinguish the states in the $S = M^{3/2}$ subspace in Fig.~\ref{figure1} from those in the larger $S = M^2$ subspace, assuming the unitary evaporation resembles, in gross terms, Hawking evaporation, with the information hidden in correlations among the emitted quanta. In that case, the future uncertainty for ordinary (non-super) observers might be better characterized by the larger $S = M^2$ entropy. Putting it another way, an ordinary (non-super) observer is forced (due to experimental limitations) into a coarse grained description of the radiation; they cannot distinguish between most of the radiation states, and for them the $S = M^2$ entropy is appropriate. For a super-observer, however, due to unitary evolution, the uncertainty in the quantum state does not increase. For them, black holes do not have greater entropy than the precursor states from which they formed.

For the super-observers described above, the large black hole entropies in Table I do not reflect the actual uncertainties in the (current and future) state of the universe and are in that sense misleading. A black hole of mass $M$ whose formation history is typical for our universe (e.g., it originated from gravitational collapse of a star or galactic core) satisfies the bound $S < M^{3/2}$ \cite{MI}. Thus, re-evaluating the numbers in Table I, the total entropy of all black holes in our universe is not bigger than the total matter entropy: the dominant uncertainty in the precise state of the universe, at least as far as arises from known physics, is, in fact, due to CMB photons or neutrinos.
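To get a feel for the gap between the two subspaces, here is a rough numerical comparison for a solar-mass black hole, working in Planck units with all O(1) factors dropped (illustrative only):

```python
import math

# Compare the log-dimensions M^{3/2} vs M^2 (mass M in Planck units,
# O(1) factors dropped): the entropy of typical astrophysical
# precursor states vs. the full Bekenstein-Hawking entropy.

m_planck_kg = 2.176e-8
M_sun_kg = 1.989e30

M = M_sun_kg / m_planck_kg      # solar mass in Planck units, ~10^38

log_dim_precursor = M ** 1.5    # S ~ M^{3/2}: states reachable by collapse
log_dim_full = M ** 2           # S ~ M^2: full black hole entropy

print(f"M ~ 10^{math.log10(M):.0f} Planck masses")
print(f"S_precursor ~ 10^{math.log10(log_dim_precursor):.0f}")
print(f"S_BH ~ 10^{math.log10(log_dim_full):.0f}")
```

For a solar-mass hole the two exponents are roughly 10^57 versus 10^76, so the subspace actually populated by astrophysical collapse is a fantastically small corner of the full Bekenstein-Hawking state space.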

Figure 3 caption: Ordinary matter (star, galactic core, etc.) collapses to form an astrophysical black hole. Under unitary evolution, the number of final Hawking radiation states that are actually accessible from this collapse is $\sim \exp M^{3/2}$, i.e.~precisely the number of ordinary astrophysical precursors (\ref{th1}). It is therefore much smaller than the number of $\sim \exp M^2$ states a black hole, and its eventual Hawking radiation, could possibly occupy if nothing about its formation process were known.


Final mysterious comment, maximally compressed for the cognoscenti: assuming unitarity, black holes do not push us closer to heat death (equilibrium) in the multiverse, but can contribute (albeit very slowly) to the (coarse grained) heat death experienced by a non-super observer (i.e., an observer subject to decoherence). See here for more on equilibrium in the multiverse.