Sunday, February 03, 2013

Quantum mechanics of black holes

A paper from last summer by Almheiri, Marolf, Polchinski and Sully (AMPS) has stimulated a lot of new work on the black hole information problem. At the time I was only able to follow it superficially, as I was busy with my new position at MSU. But finally I've had time to think about it more carefully -- see this paper.
http://arxiv.org/abs/1302.0451

Macroscopic superpositions and black hole unitarity

We discuss the black hole information problem, including the recent claim that unitarity requires a horizon firewall, emphasizing the role of decoherence and macroscopic superpositions. We consider the formation and evaporation of a large black hole as a quantum amplitude, and note that during intermediate stages (e.g., after the Page time), the amplitude is a superposition of macroscopically distinct (and decohered) spacetimes, with the black hole itself in different positions on different branches. Small but semiclassical observers (who are themselves part of the quantum amplitude) that fall into the hole on one branch will miss it entirely on other branches and instead reach future infinity. This observation can reconcile the subjective experience of an infalling observer with unitarity. We also discuss implications for the nice slice formulation of the information problem and for complementarity.
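
Schematically (my notation; a sketch of the claim, not an equation from the paper), the amplitude at intermediate times has the form

\[
|\Psi(t)\rangle \;=\; \sum_\alpha c_\alpha\, |g_\alpha\rangle \otimes |\psi_\alpha\rangle\,,
\]

where the |g_\alpha\rangle are macroscopically distinct, decohered spacetime geometries (the black hole in different positions on different branches, e.g., from recoil against emitted quanta) and |\psi_\alpha\rangle is the matter and radiation state on each background. An infalling observer localized on one branch can miss the hole entirely on another.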

Two good introductions to horizon firewalls and AMPS, by John Preskill and Joe Polchinski.

Earlier posts on this blog of related interest: here, here and here. From discussion at the third link (relevant, I claim, to AMPS):
Hawking claimed bh's could make a pure state evolve to a mixed state. But decoherence does this all the time, FAPP. To tell whether it is caused by the bh rather than decoherence, one needs to turn off (defeat) the latter. One has to go beyond FAPP!

FAPP = Bell's term = "For All Practical Purposes"
In the paper I cite a famous article by Bell, Against Measurement, which appeared in Physics World in 1990 and which emphasizes the distinction between actual pure-to-mixed-state evolution and its apparent, or FAPP, counterpart (caused by decoherence). This distinction is central to an understanding of quantum foundations. The article can be a bit hard to find, so I am including the link above.
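
To make Bell's distinction concrete, here is the standard textbook decoherence example (my notation, not from his article). A system qubit couples to its environment,

\[
\left(a\,|0\rangle + b\,|1\rangle\right)\otimes|E\rangle \;\longrightarrow\; a\,|0\rangle|E_0\rangle + b\,|1\rangle|E_1\rangle\,,
\]

and once the environment records the state, \langle E_0|E_1\rangle \approx 0, the reduced density matrix of the system is

\[
\rho_{\rm sys} = {\rm Tr}_E\, |\Psi\rangle\langle\Psi| \;\approx\; |a|^2\,|0\rangle\langle 0| + |b|^2\,|1\rangle\langle 1|\,,
\]

i.e., mixed FAPP, even though the global state remains exactly pure. Hawking's claim was stronger: that there is no enlarged system on which the evolution remains unitary and the state pure.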

Slides from an elementary lecture: black holes, entropy and information.
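
As a numerical companion to the slides, the standard semiclassical formulas are easy to evaluate. A minimal Python sketch (standard formulas; the solar-mass example is mine, not from the slides):

import math

# Physical constants in SI units
G    = 6.674e-11    # Newton's constant
c    = 2.998e8      # speed of light
hbar = 1.055e-34    # reduced Planck constant
k_B  = 1.381e-23    # Boltzmann constant
M    = 1.989e30     # one solar mass, kg

r_s = 2 * G * M / c**2               # Schwarzschild radius
l_P = math.sqrt(hbar * G / c**3)     # Planck length
A   = 4 * math.pi * r_s**2           # horizon area

S_over_kB = A / (4 * l_P**2)                              # Bekenstein-Hawking entropy, S/k_B
T_H       = hbar * c**3 / (8 * math.pi * G * M * k_B)     # Hawking temperature
t_evap    = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)  # evaporation time

print(f"r_s    = {r_s:.3e} m")       # ~ 3 km
print(f"S/k_B  = {S_over_kB:.3e}")   # ~ 1e77
print(f"T_H    = {T_H:.3e} K")       # ~ 6e-8 K
print(f"t_evap = {t_evap:.3e} s")    # ~ 7e74 s, ~ 2e67 years

Note the enormous entropy, far larger than that of the progenitor star; this is one way of quantifying how much information is at stake.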



4 comments:

  1. esmith 3:18 PM

    I may be misunderstanding some part of the debate, but it seems to me that most physicists involved operate under a fundamentally wrong paradigm. When you're a quantum physicist, it's natural and tempting to think of a black hole as a weird region in a spacetime that is still essentially Minkowski in topology and geometry outside the BH. The reality - that the BH fundamentally modifies the topology - is carefully avoided as unpleasant. And so we see nonsensical claims, e.g. "when a bit is thrown into a black hole, then as long as there is a minimum time of order r_s ln(r_s/l_P) before the bit thermalizes..." (AMPS, p. 1. In reality, there's no preferred clock in the black hole; one can't even cogently speak about clocks, because "time-translation" Killing trajectories inside the event horizon are spacelike. Therefore, this whole statement is meaningless.) "The process of formation and evaporation of a black hole, as viewed by a distant observer ..." (AMPS, p. 2. In reality, no distant observer can possibly observe both formation AND evaporation of a black hole. It is elementary GR that a distant observer can't even observe the formation of a black hole; both formation and evaporation occur at t -> infinity.)
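
    For scale: restoring factors of c, the time quoted from AMPS is t ~ (r_s/c) ln(r_s/l_P), which works out to about a millisecond for a solar-mass hole (an illustrative estimate, not from the paper):

    import math

    G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
    M = 1.989e30                        # one solar mass, kg

    r_s = 2 * G * M / c**2              # Schwarzschild radius, ~3 km
    l_P = math.sqrt(hbar * G / c**3)    # Planck length, ~1.6e-35 m

    t_scr = (r_s / c) * math.log(r_s / l_P)
    print(f"ln(r_s/l_P) = {math.log(r_s / l_P):.1f}")  # ~ 88
    print(f"t_scramble ~ {t_scr:.1e} s")               # ~ 9e-4 s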

  2. Steve, what do you think of the "fuzzball" hypothesis about black holes (also see my tribute here :) )? Sounds like a great idea, yet it doesn't seem to be getting as much traction, as far as I can see.

  3. Very nice paper, Steve. I read the whole thing and subscribe to it completely.

  4. esmith 8:23 PM

    Now that (I think) I understand the AMPS paper, it seems to me there's an issue of magnitude.

    * OK, so early Hawking radiation is entangled with late Hawking radiation. No problem.

    * OK, so by observing early Hawking radiation we can predict late radiation. Strictly speaking, we need to build a Dyson sphere around the black hole and capture its entire output, or a big part of it, before we send Alice in; otherwise we won't have enough data to make predictions. But let's ignore this for a moment.

    * Then it follows that Alice will not see a perfect vacuum in the vicinity of the event horizon. She'll see _something_. AMPS don't actually say what _something_ is. They say that Alice can make predictions about the number of Hawking quanta, and that these same quanta, low-energy far from the hole, become high-energy near the event horizon.

    What I don't see is how we get from _something_ (which may be just a smattering of high-energy quanta) to a firewall that incinerates all infalling observers.

    Furthermore, there's a related argument in arXiv:1207.6626 which states that our measurements of the early radiation are extremely unlikely to put us in an eigenstate of the number of future Hawking quanta to begin with.

    In response to this, AMPS start talking about running a quantum computation. In plain English, not only do we need a Dyson sphere, but we also need a custom-designed quantum computer that is fed all the Hawking radiation and designed to put us specifically into an eigenstate of the number of future quanta, as a prerequisite for getting Alice incinerated. That is already a very different story from where we started: it has become a curious but fairly irrelevant factoid.
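
    On the magnitude question, the early/late entanglement can be made quantitative with Page's toy model: for a random pure state on a bipartite Hilbert space of dimensions m <= n, the average entanglement entropy of the smaller factor is nearly maximal, S ~ ln m - m/(2n). A sketch of the resulting Page curve for a hole that radiates N qubits (a standard toy model, not from AMPS or the paper above):

    import math

    def page_entropy(m, n):
        """Page's average entanglement entropy (in nats) of the smaller
        factor of a random pure state on an (m*n)-dimensional space.
        The formula is an approximation, valid for 1 << m <= n; tiny
        negative values from the approximation are clamped to zero."""
        if m > n:
            m, n = n, m
        return max(0.0, math.log(m) - m / (2 * n))

    N = 60  # total qubits radiated over the hole's lifetime (toy number)
    for k in range(0, N + 1, 6):
        # k qubits of early radiation vs. N-k qubits of hole + late radiation
        S = page_entropy(2**k, 2**(N - k))
        print(f"{k:2d} of {N} qubits emitted: S_early = {S:6.2f} nats")

    The entropy rises until the Page time (half the qubits emitted), then falls back to zero; past the Page time each late quantum is strongly entangled with the early radiation, which is the starting point of the AMPS argument.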
