Pessimism of the Intellect, Optimism of the Will

Showing posts with label quantum mechanics.

Tuesday, March 18, 2014

Patterns on the sky

I'm busy reviewing ~200 promotion and tenure cases for my day job, so I don't have much time to post about the BICEP2 observation of primordial gravitational waves via their effect on the polarization of the cosmic microwave background (CMB).

Instead, I refer you to Sean Carroll, Lubos Motl and Liam McAllister (guest poster at Lubos' blog).

Assuming the result holds up, it strongly supports inflationary cosmology, and indicates that the inflation scale is only about 2 orders of magnitude below the Planck scale ~ 10^19 GeV (which would, presumably, turn out to be the true scale of quantum gravity).

In inflationary cosmology the gravitational waves which left the polarization signal arise from quantum fluctuations in de Sitter space. As with the CMB temperature, observers on different branches of the wavefunction of the universe see distinct polarization patterns on the sky. Since the CMB temperature fluctuations track energy density, these different observers also see distinct patterns of galaxy formation. In fact, whether or not an observer (a planet or galaxy) exists in a particular region of spacetime depends on the branch of the wavefunction (i.e., on a measurement outcome). I can't tell a Copenhagen story that makes sense of this -- there is no way to place observers like ourselves outside of the quantum state describing the CMB!

I guess I've said this all before 8-)
In fact, the interpretation of quantum mechanics is not entirely disconnected from practical issues in cosmology. The cosmic microwave background data favors inflationary cosmology, in which the density perturbations in the early universe originate in the quantum fluctuations of the inflaton field itself. It is very hard to fit this into the Copenhagen view -- what "collapses" the wavefunction of the inflaton? There are no "observers" in the early universe, and the locations of "observers" (such as humans) are determined by the density perturbations themselves: galaxies, stars and planets are found in the overdense regions, but quantum mechanics itself decides which regions are overdense; there is no classical system "outside" the universe! It seems much more natural to note that differential scattering of gravitons due to more or less energy density in a particular region separates the inflaton wavefunction into decoherent branches. (The gravitons decohere the inflaton state vector through interactions.) But this is accomplished through unitary evolution and does not require von Neumann projection or "collapse". Other observers, living on other branches of the wavefunction, see a different CMB pattern on the sky.

Friday, March 07, 2014

Weinberg on Weinberg

Great interview with Steve Weinberg about his life in science.

See also Weinberg on quantum foundations.

Wednesday, January 29, 2014

Locality and Nonlinear Quantum Mechanics

New paper!
Locality and Nonlinear Quantum Mechanics

Chiu Man Ho, Stephen D.H. Hsu

Nonlinear modifications of quantum mechanics generically lead to nonlocal effects which violate relativistic causality. We study these effects using the functional Schrodinger equation for quantum fields and identify a type of nonlocality which causes nearly instantaneous entanglement of spacelike separated systems.
Some excerpts:
The linear structure of quantum mechanics has deep and important consequences, such as the behavior of superpositions. One is naturally led to ask whether this linearity is fundamental, or merely an approximation: Are there nonlinear terms in the Schrodinger equation?

Nonlinear quantum mechanics has been explored in [1–6]. It has been observed that the fictitious violation of locality in the Einstein-Podolsky-Rosen (EPR) experiment in conventional linear quantum mechanics might become a true violation due to nonlinear effects [7, 8] (in [8] signaling between Everett branches is also discussed). This might allow superluminal communication and violate relativistic causality. These issues have subsequently been widely discussed [9].

Properties such as locality or causality are difficult to define in non-relativistic quantum mechanics (which often includes, for example, “instantaneous” potentials such as the Coulomb potential). Therefore, it is natural to adopt the framework of quantum field theory: Lorentz invariant quantum field theories are known to describe local physics with relativistic causality (influences propagate only within the light cone), making violations of these properties easier to identify. ...

... Our results suggest that nonlinearity in quantum mechanics is associated with violation of relativistic causality. We gave a formulation in terms of factorized (unentangled) wavefunctions describing spacelike separated systems. Nonlinearity seems to create almost instantaneous entanglement of the two systems, no matter how far apart. Perhaps our results are related to what Weinberg [11] meant when he wrote “... I could not find any way to extend the nonlinear version of quantum mechanics to theories based on Einstein’s special theory of relativity ... At least for the present I have given up on the problem: I simply do not know how to change quantum mechanics by a small amount without wrecking it altogether.”
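The basic pathology is simple to see numerically. Here is a minimal two-level toy model (mine, not the paper's): add a term eps|psi|^2 psi to the Schrodinger equation and the superposition principle fails at once, since the evolution of a sum of states is no longer the sum of the evolutions.

```python
import numpy as np

H = np.array([[0.0, 1.0], [1.0, 0.0]])   # toy Hermitian Hamiltonian

def evolve(psi, eps, t=1.0, steps=4000):
    """Integrate i d(psi)/dt = H psi + eps |psi|^2 psi
    (explicit Euler, adequate for a demo)."""
    dt = t / steps
    psi = psi.astype(complex)
    for _ in range(steps):
        psi = psi - 1j * dt * (H @ psi + eps * (np.abs(psi) ** 2) * psi)
    return psi

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

# With eps = 0 the map is exactly linear; with eps != 0 the evolution of a
# superposition differs from the superposition of the evolutions.
mismatch = np.linalg.norm(evolve(a + b, 0.5) - (evolve(a, 0.5) + evolve(b, 0.5)))
print(mismatch)   # strictly positive once eps != 0
```

The state-dependence of the effective Hamiltonian is what opens the door to the EPR signaling discussed above: different branches can, in effect, feel each other.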

Friday, October 04, 2013

Fuzzballs, black holes and firewalls

Yesterday Samir Mathur gave a colloquium here on the black hole information paradox. I've known Samir for many years; he was an assistant professor at MIT when I was a postdoc up the river. I've always found him to be a very precise and clear thinker.

On his web page there is a very simple introduction to the paradox. The initial presentation emphasizes the role of negative binding energy in black hole physics, which is related to the question of monsters: configurations in classical general relativity with more entropy than black holes of the same mass. (Slides.)

Here is a recent paper in which Samir discusses the black hole firewall problem and subadditivity of entropy.

Samir in action giving a more technical seminar earlier today:

Wednesday, September 04, 2013

Big brains battle black hole firewalls

When I gave an informal whiteboard talk on this topic at IQIM I remarked that after almost 40 years (Hawking first proposed that black holes destroy quantum information in 1974), theorists are still baffled by the black hole information paradox.

Three recent blog posts on the information problem and firewalls:

Scott Aaronson (see lively discussion), John Preskill (I stole the picture from John), Lubos Motl (I think Lubos has the physics right in his post but I would probably be more polite to our colleagues about it ;-)

Earlier post on this blog. My recent paper -- see eqns (3)-(5) for discussion of a density matrix similar to Motl's. Like Lubos and Preskill (and everyone else?), I was never convinced by Hawking's concession paper on the unitarity question, but I do acknowledge some similarities between his arguments and mine.

Finally, in recent discussions with Samir Mathur I became aware of his paper What the information paradox is not, which I recommend. (See especially the section on AdS/CFT.)

Tuesday, August 27, 2013

Factorization of unitarity and black hole firewalls
Factorization of unitarity and black hole firewalls
Stephen D.H. Hsu

Unitary black hole evaporation necessarily involves a late-time superposition of decoherent states, including states describing distinct spacetimes (e.g., different center of mass trajectories of the black hole). Typical analyses of the black hole information problem, including the argument for the existence of firewalls, assume approximate unitarity ("factorization of unitarity") on each of the decoherent spacetimes. This factorization assumption is non-trivial, and indeed may be incorrect. We describe an ansatz for the radiation state that violates factorization and which allows unitarity and the equivalence principle to coexist (no firewall). Unitarity without factorization provides a natural realization of the idea of black hole complementarity.
From the paper:
... An objection to the importance of macroscopic superpositions to the information problem is that there is much less information in the coarse grained position or even trajectory (sequence of positions) of the black hole than in the radiation. From this perspective one should be able to neglect the superposition of spacetimes and demand approximate unitarity branch by branch -- in other words, impose factorization. Below, we show that the firewall argument depends sensitively on the precision of factorization. Once macroscopic superpositions are taken into account, the required deviation of near-horizon modes from the inertial vacuum state becomes extremely small. ...


The quantum evolution of a complex pure state typically leads to a superposition of decoherent semiclassical states. In the case of black hole evaporation one obtains a superposition of spacetime geometries because the Hawking radiation inevitably exhibits fluctuations in energy and momentum density over different regions. Firewall and information paradoxes result from the non-trivial assumption of factorization: approximate unitarity on each decoherent geometry. Global unitarity is a much weaker condition than factorization. Quantum correlations between geometries can plausibly resolve the information paradoxes, although specific dynamical mechanisms are still not understood.

Monday, August 26, 2013

Heretics in the church: black hole information loss

Bob Wald gave a nice talk from the "relativist perspective" at the KITP workshop on firewalls -- see Fri Aug 23 3 PM ; slides. (@36 min things heat up a bit :-)

One of the ideas that he and Bill Unruh have advocated over the years is that decoherence is an example of pure to mixed state evolution that doesn't require catastrophic side-effects like energy non-conservation (Banks, Peskin, Susskind; B-S interchange with Wald @42min :-). For related discussion, see these papers: BHs and spacetime topology, BHs and decoherence and this talk from an earlier KITP workshop by Bill Unruh.

Bob also takes some shots at the church of AdS/CFT, pointing out that the duality is not well-defined and still a conjecture. If AdS/CFT is the strongest argument in favor of purity of BH evaporation, then one should not abandon alternatives just yet ... (@53min some further discussion of this which unfortunately cuts off just as Maldacena is giving an interesting argument!)

Thursday, August 22, 2013

Black hole firewalls and all that

Here's the Times coverage of the so-called "firewall paradox" in black hole physics. For my take on it see this paper and these additional comments. There is a meeting at KITP on this topic taking place starting this week -- see here for talk videos.
NYTimes: ... A high-octane debate has broken out among the world’s physicists about what would happen if you jumped into a black hole, a fearsome gravitational monster that can swallow matter, energy and even light. You would die, of course, but how? Crushed smaller than a dust mote by monstrous gravity, as astronomers and science fiction writers have been telling us for decades? Or flash-fried by a firewall of energy, as an alarming new calculation seems to indicate?

This dire-sounding debate has spawned a profusion of papers, blog posts and workshops over the last year. At stake is not Einstein’s reputation, which is after all secure, or even the efficacy of our iPhones, but perhaps the basis of his general theory of relativity, the theory of gravity, on which our understanding of the universe is based. Or some other fundamental long-established principle of nature might have to be abandoned, but physicists don’t agree on which one, and they have been flip-flopping and changing positions almost weekly, with no resolution in sight.

“I was a yo-yo on this,” said one of the more prolific authors in the field, Leonard Susskind of Stanford. He paused and added, “I haven’t changed my mind in a few months now.”

Raphael Bousso, a theorist at the University of California, Berkeley, said, “I’ve never been so surprised. I don’t know what to expect.”

You might wonder who cares, especially if encountering a black hole is not on your calendar. But some of the basic tenets of modern science and of Einstein’s theory are at stake in the “firewall paradox,” as it is known.

“It points to something missing in our understanding of gravity,” said Joseph Polchinski, of the Kavli Institute for Theoretical Physics in Santa Barbara, Calif., one of the theorists who set off this confusion. ...
Via Sean Carroll, here are Joe Polchinski's slides from a firewall talk at Caltech.

My claim is that (see slide 29) the b which forms a pure state with b_E is not the same as the b which forms a pure state with b'. The latter b is an excitation relative to the vacuum state of a particular decoherent spacetime (background geometry) whereas the former b is a component of the global radiation state, summing over all spacetimes. The Equivalence Principle (no drama) can only be applied to one geometry at a time, whereas unitarity (purity) only applies to the global state, including all the branches.

If I am correct, then the main benefit from this firewall discussion will be the realization that unitarity only holds after summing over all spacetime geometries of the evaporating black hole. Most theorists seem to think it will hold (at least approximately) on each geometry separately.

Wednesday, July 24, 2013

Caltech Institute for Quantum Information and Matter

IQIM is the home of John Preskill, the Richard P. Feynman Professor of Theoretical Physics at Caltech. John was celebrated on the occasion of his 60th birthday here.

Dinner meeting with the group.

Working in the Pasadena sunshine.

Inside the Annenberg Center.

On the first floor there are some old plaques, including this one honoring Chris Chang :-)

Some random Caltech photos I took. Go Beavers!

Sunday, April 14, 2013

Bezos on the big brains

I recall reading this quote (or something similar) when Bezos was Time magazine's Man of the Year in 1999.
Jeff Bezos: Yeah. So, I went to Princeton primarily because I wanted to study physics, and it's such a fantastic place to study physics. Things went fairly well until I got to quantum mechanics and there were about 30 people in the class by that point and it was so hard for me. I just remember there was a point in this where I realized I'm never going to be a great physicist. There were three or four people in the class whose brains were so clearly wired differently to process these highly abstract concepts, so much more. I was doing well in terms of the grades I was getting, but for me it was laborious, hard work. And, for some of these truly gifted folks -- it was awe-inspiring for me to watch them because in a very easy, almost casual way, they could absorb concepts and solve problems that I would work 12 hours on, and it was a wonderful thing to behold. At the same time, I had been studying computer science, and was really finding that that was something I was drawn toward. I was drawn to that more and more and that turned out to be a great thing. So I found -- one of the great things Princeton taught me is that I'm not smart enough to be a physicist.
It turns out I know several of Bezos' contemporaries at Princeton (class of 1986), including some members of his eating club, and possibly some of the individuals described above. See this old post, Living Like Kings:
Physics library, LeConte Hall, Berkeley, 1987. Studying string theory and Calabi-Yau tomfoolery about 100m from the Campanile in the picture above. We'll never have it better than that.

Me: Mike, I can't believe we're in here working on such a beautiful afternoon. Look at that sunshine!

Mike C. (the pride of Jadwin Hall): Hsu, we're doing exactly what we want to be doing. We're livin' like kings, man! Livin' like kings (big grin).

See also One hundred thousand brains and Defining Merit:
... Bender also had a startlingly accurate sense of how many truly intellectually outstanding students were available in the national pool. He doubted whether more than 100-200 candidates of truly exceptional promise would be available for each year's class. This number corresponds to (roughly) +4 SD in mental ability. Long after Bender resigned, Harvard still reserved only 10 percent of its places (roughly 150 spots) for "top brains".

Sunday, February 17, 2013

Weinberg on quantum foundations

I have been eagerly awaiting Steven Weinberg's Lectures on Quantum Mechanics, both because Weinberg is a towering figure in theoretical physics, and because of his cryptic comments concerning the origin of probability in no collapse (many worlds) formulations:
Einstein's Mistakes
Steve Weinberg, Physics Today, November 2005

Bohr's version of quantum mechanics was deeply flawed, but not for the reason Einstein thought. The Copenhagen interpretation describes what happens when an observer makes a measurement, but the observer and the act of measurement are themselves treated classically. This is surely wrong: Physicists and their apparatus must be governed by the same quantum mechanical rules that govern everything else in the universe. But these rules are expressed in terms of a wavefunction (or, more precisely, a state vector) that evolves in a perfectly deterministic way. So where do the probabilistic rules of the Copenhagen interpretation come from?

Considerable progress has been made in recent years toward the resolution of the problem, which I cannot go into here. [ITALICS MINE. THIS REMINDS ME OF FERMAT'S COMMENT IN THE MARGIN!] It is enough to say that neither Bohr nor Einstein had focused on the real problem with quantum mechanics. The Copenhagen rules clearly work, so they have to be accepted. But this leaves the task of explaining them by applying the deterministic equation for the evolution of the wavefunction, the Schrödinger equation, to observers and their apparatus. The difficulty is not that quantum mechanics is probabilistic—that is something we apparently just have to live with. The real difficulty is that it is also deterministic, or more precisely, that it combines a probabilistic interpretation with deterministic dynamics. ...
Weinberg's coverage of quantum foundations in section 3.7 of the new book is consistent with what is written above, although he does not resolve the question of how probability arises from the deterministic evolution of the wavefunction. (See here for my discussion, which involves, among other things, the distinction between objective and subjective probabilities; the latter can arise even in a deterministic universe).

1. He finds Copenhagen unsatisfactory: it does not allow QM to be applied to the observer and measuring process; it does not have a clean dividing line between observer and system.

2. He finds many worlds (no collapse, decoherent histories, etc.) unsatisfactory not because of the so-called basis problem (he accepts the unproved dynamical assumption that decoherence works as advertised), but rather because of the absence of a satisfactory origin of the Born rule for probabilities. (In other words, he doesn't elaborate on the "considerable progress..." alluded to in his 2005 essay!)

Weinberg's concluding paragraph:
There is nothing absurd or inconsistent about the ... general idea that the state vector serves only as a predictor of probabilities, not as a complete description of a physical system. Nevertheless, it would be disappointing if we had to give up the "realist" goal of finding complete descriptions of physical systems, and of using this description to derive the Born rule, rather than just assuming it. We can live with the idea that the state of a physical system is described by a vector in Hilbert space rather than by numerical values of the positions and momenta of all the particles in the system, but it is hard to live with no description of physical states at all, only an algorithm for calculating probabilities. My own conclusion (not universally shared) is that today there is no interpretation of quantum mechanics that does not have serious flaws [italics mine] ...
It is a shame that very few working physicists, even theoreticians, have thought carefully and deeply about quantum foundations. Perhaps Weinberg's fine summary will stimulate greater awareness of this greatest of all unresolved problems in science.
"I am a Quantum Engineer, but on Sundays I have principles." -- J.S. Bell

Sunday, February 10, 2013

Quantum correspondence on black holes

Some Q&A from correspondence about my recent paper on quantum mechanics of black holes. See also Lubos Motl's blog post.
Q: Put another way (loosely following Bousso), consider two observers, Alice and Bob. They steer their rocket toward the black hole and into "the zone" or "the atmosphere". Then, Bob takes a lifeboat and escapes to asymptotic infinity while Alice falls in. I hope you agree that Bob and Alice's observations should agree up to the point where their paths diverge. On the other hand, it seems that Bob, by escaping to asymptotic infinity can check whether the evolution is unitary (or at least close to unitary). I wonder which parts of this you disagree with.

A: If Bob has the measurement capability to determine whether Psi_final is a unitary evolution of Psi_initial, he has to be able to overcome decoherence ("see all the branches"). As such, he cannot have the same experience as Alice of "going to the edge of the hole" -- that experience is specific to a certain subset of branches. (Note, Bob not only has to be able to see all the semiclassical branches, he also has to be able to detect small amplitude branches where some of the information leaked out of the hole, perhaps through nonlocality.) To me, this is the essential content of complementarity: the distinction between a super-observer (who can overcome decoherence) and ordinary observers who cannot. Super-observers do not (in the BH context) record a semiclassical spacetime, but rather a superposition of such (plus even more wacky low amplitude branches).

In the paragraph of my paper that directly addresses AMPS, I simply note that the "B" used in logical steps (1) and (2) are totally different objects. One is defined on a specific branch, the other is defined over all branches. Perhaps an AMPS believer can reformulate my compressed version of the firewall argument to avoid this objection, but I think the original argument has a problem.

Q: I think all one needs for the AMPS argument is that the entanglement entropy of the radiation decreases with the newly emitted quanta. This is, of course, a very tough quantum computation, but I don't see the obstruction to it being run on disparate semiclassical branches to use your language. I was imagining doing a projective measurement of the position of the black hole (which should be effectively equivalent to decoherence of the black hole's position); this still leaves an enormous Hilbert space of states associated with the black hole unprojected/cohered. I am not sure whether you are disagreeing with that last statement or not, but let me proceed. Then it seems we are free to run the AMPS argument. There is only a relatively small amount of entanglement between the remaining degrees of freedom and the black hole's position. Thus, unitarity (and Page style arguments) suggest a particular result for the quantum computation mentioned above by Bob on whichever branch of the wave function we are discussing (which seems to be effectively the same branch that Alice is on).

A: An observer who is subject to the decoherence that spatially localizes the hole would see S_AB to be much larger than S_A, where A are the (early, far) radiation modes and B are the near-horizon modes. This is because it takes enormous resources to detect the AB entanglement, whereas A looks maximally mixed. I think this is discussed rather explicitly in arXiv:1211.7033 -- one of Nomura's papers that he made me aware of after I posted my paper. Measurements related to unitarity, purity or entanglement of, e.g., AB modes, can only be implemented by what I call super-observers: they would see multiple BH spacetimes. Since at least some A and B modes move on the light cone, these operations may require non-local actions by Bob.

Q: Do you think there is an in-principle obstruction that prevents observers from overcoming decoherence? Is there some strict delineation between what can be put in a superposition and what cannot?

A: This is an open question in quantum foundations: i.e., at what point are there not enough resources in the entire visible universe to defeat decoherence -- at which point you have de facto wavefunction collapse. Omnes wrote a whole book arguing that once you have decoherence due to Avogadro's number of environmental DOF, the universe does not contain sufficient resources to detect the other branch. It does seem true to me that if one wants to make the BH paradox sharp, which requires that the mass of the BH be taken to infinity, then, yes, there is an in-principle gap between the two. The resources required grow exponentially with the BH entropy.
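The Page-style behavior invoked in this exchange is easy to reproduce numerically in a generic toy model (this is a stand-in random-state computation, not a black hole calculation): for a random pure state on a small subsystem A joined to a much larger B, the reduced state of A alone looks almost exactly maximally mixed, even though the global state on AB is pure. An observer with access only to A sees (FAPP) a thermal-looking mixed state; only a "super-observer" with access to the global state can verify purity.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_bits(rho):
    """Von Neumann entropy -Tr(rho log2 rho), in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]          # drop numerical zeros
    return float(-(w * np.log2(w)).sum())

dA, dB = 4, 64                      # small subsystem A, large complement B
psi = rng.normal(size=(dA, dB)) + 1j * rng.normal(size=(dA, dB))
psi /= np.linalg.norm(psi)          # random global pure state on A x B

rho_A = psi @ psi.conj().T          # reduced state of A (trace out B)
rho_AB = np.outer(psi.ravel(), psi.ravel().conj())   # global state

print(entropy_bits(rho_A))    # near log2(dA) = 2 bits: A looks maximally mixed
print(entropy_bits(rho_AB))   # near 0: the global state is pure
```

The dimensions here are arbitrary; Page's result is that for dB >> dA the average entanglement entropy of A falls short of maximal by only about dA/(2 dB) nats.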

Sunday, February 03, 2013

Quantum mechanics of black holes

A paper from last summer by Almheiri, Marolf, Polchinski and Sully (AMPS) has stimulated a lot of new work on the black hole information problem. At the time I was only able to follow it superficially as I was busy with my new position at MSU. But finally I've had time to think about it more carefully -- see this paper.

Macroscopic superpositions and black hole unitarity

We discuss the black hole information problem, including the recent claim that unitarity requires a horizon firewall, emphasizing the role of decoherence and macroscopic superpositions. We consider the formation and evaporation of a large black hole as a quantum amplitude, and note that during intermediate stages (e.g., after the Page time), the amplitude is a superposition of macroscopically distinct (and decohered) spacetimes, with the black hole itself in different positions on different branches. Small but semiclassical observers (who are themselves part of the quantum amplitude) that fall into the hole on one branch will miss it entirely on other branches and instead reach future infinity. This observation can reconcile the subjective experience of an infalling observer with unitarity. We also discuss implications for the nice slice formulation of the information problem, and to complementarity.

Two good introductions to horizon firewalls and AMPS, by John Preskill and Joe Polchinski.

Earlier posts on this blog of related interest: here, here and here. From discussion at the third link (relevant, I claim, to AMPS):
Hawking claimed bh's could make a pure state evolve to a mixed state. But decoherence does this all the time, FAPP. To tell whether it is caused by the bh rather than decoherence, one needs to turn off (defeat) the latter. One has to go beyond FAPP!

FAPP = Bell's term = "For All Practical Purposes"
In the paper I cite a famous article by Bell, Against Measurement, which appeared in Physics World  in 1990, and which emphasizes the distinction between actual pure to mixed state evolution, and its apparent, or FAPP, counterpart (caused by decoherence). This distinction is central to an understanding of quantum foundations. The article can be a bit hard to find so I am including the link above.

Slides from an elementary lecture: black holes, entropy and information.

Tuesday, November 20, 2012

Schwinger on quantum foundations

The excerpt below is from the excellent biography Climbing the Mountain by Mehra and Milton. Milton was one of Schwinger's last Harvard grad students, eventually a professor at the University of Oklahoma. Schwinger's view is the one shared by all reasonable physicists: quantum mechanics must apply to the measuring device as well as that which is measured. Once this assumption is made, many worlds follows trivially (as Hawking and others have noted).
(p.369) Schwinger: "To me, the formalism of quantum mechanics is not just mathematics; rather it is a symbolic account of the realities of atomic measurements. That being so, no independent quantum theory of measurement is required -- it is part and parcel of the formalism.

[ ... recapitulates usual von Neumann formulation: unitary evolution of wavefunction under "normal" circumstances; non-unitary collapse due to measurement ... discusses paper hypothesizing stochastic (dynamical) wavefunction collapse ... ]

In my opinion, this is a desperate attempt to solve a non-existent problem, one that flows from a false premise, namely the vN dichotomization of quantum mechanics. Surely physicists can agree that a microscopic measurement is a physical process, to be described as would any physical process, that is distinguished only by the effective irreversibility produced by amplification to the macroscopic level. ..."
Similar views have been expressed by Feynman and Gell-Mann and by Steve Weinberg. Interestingly, this chapter in the biography seems to describe (in slightly odd language) some Schwinger work on decoherence, analyzing a collaborator's claim that Stern-Gerlach beams could be recombined coherently.

See also my paper On the origin of probability in quantum mechanics.

Schwinger's precocity, explored in the biography in far greater detail than I had seen before, is overwhelming. At age 17 or so he had read everything there was to read about quantum mechanics, early field theory, nuclear and atomic physics. For example, he had read and understood Dirac's papers, had invented the interaction picture basis, had already read the Einstein, Podolsky, Rosen paper and explained it to Rabi when they first met. He met Bethe and they discussed a problem in quantum scattering (Schwinger had improved Bethe's well-known result and noticed an error that no other theorist had). Bethe later wrote that the 17 year old Schwinger's grasp of quantum electrodynamics was at least as good as his own.
Feyerabend on the giants: "... The younger generation of physicists, the Feynmans, the Schwingers, etc., may be very bright; they may be more intelligent than their predecessors, than Bohr, Einstein, Schrodinger, Boltzmann, Mach and so on. ..."
Schwinger survived both Feynman and Tomonaga, with whom he shared the Nobel prize for quantum electrodynamics. He began his eulogy for Feynman: "I am the last of the triumvirate ..."

Tuesday, October 09, 2012

Schrodinger cat Nobels

Serge Haroche and Dave Wineland share the 2012 Nobel Prize for their work in quantum optics / atomic physics. Wineland traps atoms whereas Haroche traps photons. Haroche is a normalien and Wineland was educated at Berkeley and Harvard.

My favorite Haroche experiments are the ones in which he creates macroscopic Schrodinger cat states and watches them decohere. For example, see here and here. I also like Haroche's book Exploring the Quantum.

See also Schrodinger's virus.

Thursday, August 09, 2012

Gell-Mann, Feynman, Everett

This site is a treasure trove of interesting video interviews -- including with Francis Crick, Freeman Dyson, Sydney Brenner, Marvin Minsky, Hans Bethe, Donald Knuth, and others. Many of the interviews have transcripts, which are much faster to read than listening to the interviews themselves.

Here's what Murray Gell-Mann has to say about quantum foundations:
In '63…'64 I worked on trying to understand quantum mechanics, and I brought in Felix Villars and for a while some comments... there were some comments by Dick Feynman who was nearby. And we all agreed on a rough understanding of quantum mechanics and the second law of thermodynamics and so on and so on, that was not really very different from what I'd been working on in the last ten or fifteen years.

I was not aware, and I don't think Felix was aware either, of the work of Everett when he was a graduate student at Princeton and worked on this, what some people have called 'many worlds' idea, suggested more or less by Wheeler. Apparently Everett was, as we learned at the Massagon [sic] meeting, Everett was an interesting person. He… it wasn't that he was passionately interested in quantum mechanics; he just liked to solve problems, and trying to improve the understanding of quantum mechanics was just one problem that he happened to look at. He spent most of the rest of his life working for the Weapon System Evaluation Group in Washington, WSEG, on military problems. Apparently he didn't care much as long as he could solve some interesting problems! [Some of these points, concerning Everett's life and motivations, and Wheeler's role in MW, are historically incorrect.]

Anyway, I didn't know about Everett's work so we discovered our interpretation independent of Everett. Now maybe Feynman knew about… about Everett's work and when he was commenting maybe he was drawing upon his knowledge of Everett, I have no idea, but… but certainly Felix and I didn't know about it, so we recreated something related to it.

Now, as interpreted by some people, Everett's work has two peculiar features: one is that this talk about many worlds and equally… many worlds equally real, which has confused a lot of people, including some very scholarly students of quantum mechanics. What does it mean, 'equally real'? It doesn't really have any useful meaning. What the people mean is that there are many histories of the… many alternative histories of the universe, many alternative coarse-grained, decoherent histories of the universe, and the theory treats them all on an equal footing, except for their probabilities. Now if that's what you mean by equally real, okay, but that's all it means; that the theory treats them on an equal footing apart from their probabilities. Which one actually happens in our experience, is a different matter and it's determined only probabilistically. Anyway, there's considerable continuity between the thoughts of '63-'64 and the thoughts that, and… and maybe earlier in the '60s, and the thoughts that Jim Hartle and I have had more recently, starting around '84-'85.
Indeed, Feynman was familiar with Everett's work -- see here and here.

Where Murray says "it's determined only probabilistically" I would say there is a subjective probability which describes how surprised one is to find oneself on a particular decoherent branch or history of the overall wavefunction -- i.e., how likely or unlikely we regard the outcomes we have observed to have been. For more see here.

Murray against Copenhagen:
... although the so-called Copenhagen interpretation is perfectly correct for all laboratory physics, laboratory experiments and so on, it's too special otherwise to be fundamental and it sort of strains credulity. It's… it’s not a convincing fundamental presentation, correct though… though it is, and as far as quantum cosmology is concerned it's hopeless. We were just saying, we were just quoting that old saw: describe the universe and give three examples. Well, to apply the… the Copenhagen interpretation to quantum cosmology,  you'd need a physicist outside the universe making repeated experiments, preferably on multiple copies of the universe and so on and so on. It's absurd. Clearly there is a definition to things happening independent of human observers. So I think that as this point of view is perfected it should be included in… in teaching fairly early, so that students aren't convinced that in order to understand quantum mechanics deeply they have to swallow some of this…very… some of these things that are very difficult to believe. But in the end of course, one can use the Copenhagen interpretations perfectly okay for experiments.

Tuesday, August 07, 2012

Quantum correspondence

I've been corresponding with a German theoretical physicist ("R") recently about quantum mechanics and thought I would share some of it here.

[R] Dear Prof. Hsu: I enjoyed reading your recent, very clearly written paper On the origin of probability in quantum mechanics very much. I discussed its subject matter often with Hans-Dieter Zeh ... We both think that many worlds is an idea that is probably true in some sense.
[ME] I have corresponded with Dieter over the years and read most (all?) of his work in this area. I would say we do not really disagree about anything.

To me many worlds (MW) is very appealing and should really be considered the "minimal" interpretation of QM, since I do not know of any other logically complete interpretation.

However, anyone who endorses MW should think very carefully about the origin of probability. Since MW is really a deterministic theory (at least from the viewpoint of a "global" observer not subject to decoherence), the only kind of probabilities it allows are subjective ones.

It is disturbing to me that most versions of me in the multiverse do not believe in the Born Rule (and probably then don't believe in QM!). MW proponents (e.g., Deutsch) would like to argue that, subjectively, I should not be "surprised" to be one of the few versions of me that see experimental verification of the Born Rule, but I am still uncomfortable about this. (The use of "most" above implies adopting a measure, and that is the root of all problems here.)
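The worry about "most" versions of me can be made concrete with a toy calculation (my own illustration, not from the correspondence): weight the 2^n branches of n repeated qubit measurements two ways. Counting branches equally, almost none see outcome frequencies near the Born probability p; under the Born measure, almost all do.

```python
# Toy measure comparison for n repeated measurements of a qubit whose
# outcome "1" has Born probability p. Branches are bitstrings; we total
# the weight of branches whose observed frequency of "1" is near p.
from math import comb

def measure_of_typical_branches(n, p, weight, tol=0.05):
    """Total weight of branches whose observed frequency of outcome 1
    lies within tol of the Born probability p."""
    return sum(weight(n, k, p) for k in range(n + 1)
               if abs(k / n - p) <= tol)

branch_count = lambda n, k, p: comb(n, k) / 2**n                     # every branch equal
born_weight  = lambda n, k, p: comb(n, k) * p**k * (1 - p)**(n - k)  # Born rule

n, p = 1000, 0.1
low  = measure_of_typical_branches(n, p, branch_count)  # vanishingly small
high = measure_of_typical_branches(n, p, born_weight)   # close to 1
print(low, high)
```

So by naive branch counting, "most" observers see frequencies near 1/2 and would reject the Born rule; everything hinges on which measure one adopts, which is exactly the point above.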

I hope this helps -- all I've done in the above paragraphs is recapitulate the paper you already read!
[ME] The "subjective" nature of probability is because the theory is actually deterministic. (Einstein would have liked it, except for the many branches in the wavefunction.)  
Let's suppose you live in a deterministic world and are about to flip a coin. You assign a probability to the outcome because you don't know what it will be. In secret, the outcome is already determined. To you, the process appears probabilistic, but really it is not. That is actually how MW works, but this is not widely appreciated. See esp. eqn 4 and figure in my paper.  
Copenhagen is not logically complete because it does not explain how QM applies to the particles in the observer (which is always treated classically). Collapse theories have different physical predictions than MW because collapse is not unitary.  
[R] Without going into the details, it seems absolutely clear to me that the main protagonists of Copenhagen, Heisenberg, Pauli, Bohr etc. did not believe that there is some explicit, QM-violating collapse mechanism. Do u agree? 
[ME] I can't read the minds of the ancients. The only clear formulation is that of von Neumann, and there a measurement outcome requires collapse = non-unitary projection. 
[R] A lack of free will is actually also the way out of Bell for Gerard 't Hooft, and he convinced me that the idea is not so crazy at all. I don't know why this loophole got so little attention in Bell experiments. What is your take?

[ME] ... it is funny that everyone (physicists should know better) assumes a priori that we have free will. For example, the Free Will Theorem guys (admittedly, they are only mathematicians ;-) take it for granted.

... Strangely, not many people understand how MWI evades Bell without non-locality. There are a couple of papers on this but they are not well appreciated. Actually the result is kind of trivial. 
... MW has no problem with Bell's inequality because MW reproduces [see footnote #] the experimental predictions of the CI (Conventional or Copenhagen or Collapse Interpretation). An experimenter in a MW universe will not observe violation of Bell's inequality, or of the GHZ prediction, etc.  
Does this mean that MW avoids non-locality? That depends on what you mean by non-locality (I imagine this is relevant to your H-D anecdote). On the one hand the Hamiltonian is local and the evolution of Psi is deterministic, so from that perspective there is obviously nothing non-local going on:  Psi(x,t) only affects Psi(x',t') if (x',t') is in the forward lightcone of (x,t). From other perspectives one can speak of "non-local correlations" or influences, but I find this to be simply creating mystery where there is none.  
More succinctly, in a deterministic theory with a local evolution equation (Schrodinger equation with local Hamiltonian), there cannot be any non-locality. Just think about the wave equation.  
# The exception is macroscopic interference experiments as proposed by Deutsch that can tell the difference between reversible (unitary) and irreversible (collapse) theories. But these experiments are not yet technically feasible.  
[R] No sorry, I must think beyond "just the wave equation". I must think about "result of a measurement" when facing the Bell trouble.  
[ME] The great beauty of decoherence and MW is that it takes the mystery out of "measurement" and shows it to simply result from the unitary evolution of the wavefunction. There is no mystery and, indeed, everything is governed by a causal wave-like equation (Schrodinger equation). 
Rather than belabor this further I will refer you to more detailed treatments like the ones below:  
The EPR paradox, Bell's inequality, and the question of locality, Am. J. Phys. 78, 1 (January 2010).
[Reference 36] Our explanation of the many-worlds interpretation branching in the text follows similar descriptions by Don N. Page, "The Einstein–Podolsky–Rosen physical reality is completely described by quantum mechanics," Phys. Lett. A 91, 57–60 (1982); Michael Clive Price, "The Everett FAQ"; and C. Hewitt-Horsman and V. Vedral, "Entanglement without nonlocality," Phys. Rev. A 76, 062319-1–8 (2007).
... As I said, "non-locality" must be defined carefully. Even standard QFT can appear "non-local" to the foolish (positrons go backwards in time!). Recall that MW is the most "realistic" of all QM interpretations -- Psi contains all information (including about what is happening in a given mind, the process of measurement, etc.), and Psi evolves entirely causally in spacetime. So any mystery about this is manufactured. In the papers linked to above you can track exactly what happens in an EPR/Bell experiment in MW and see that everything is local; but the result is trivial from the beginning if you grasp the points I made above.
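For concreteness, here is a minimal numerical check (my addition, not part of the correspondence) of the prediction that MW, like any interpretation, must reproduce: the CHSH combination of singlet correlations reaches 2√2 at the standard optimal angles, beyond the classical bound of 2.

```python
# CHSH value for the spin singlet. Measurement axes lie in the x-z
# plane; the singlet correlation is analytically -cos(ta - tb).
import numpy as np

Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def spin(theta):
    # Spin observable along an axis at angle theta in the x-z plane.
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>)/sqrt(2).
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def corr(ta, tb):
    # Correlation <psi| A(ta) x B(tb) |psi>.
    return psi @ np.kron(spin(ta), spin(tb)) @ psi

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = corr(a, b) - corr(a, bp) + corr(ap, b) + corr(ap, bp)
print(abs(S))  # 2*sqrt(2) ~ 2.828, violating the classical bound of 2
```

Any local hidden-variable account is capped at |S| ≤ 2; the debate above is about how to interpret this violation, not whether it occurs.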

Wednesday, May 09, 2012

Entanglement and Decoherence

My preprint (which appeared yesterday evening on arxiv) has already elicited a response from the brilliant and eccentric Lubos Motl. Lubos believes in the "subjective" interpretation of the quantum state, and so objects to the idea of a unitarily evolving wavefunction describing all degrees of freedom in the universe. (See here for more discussion.) I, on the other hand, am willing to consider the possibility that many worlds is correct. Here is how Lubos characterizes the disagreement:
Buniy and Hsu also seem to be confused about the topics that have been covered hundreds of times on this blog. In particular, the right interpretation of the state is a subjective one. Consequently, all the properties of a state – e.g. its being entangled – are subjective as well. They depend on what the observer just knows at a given moment. Once he knows the detailed state of objects or observables, their previous entanglement becomes irrelevant. 
... When I read papers such as one by Buniy and Hsu, I constantly see the wrong assumption written everything in between the lines – and sometimes inside the lines – that the wave function is an objective wave and one may objectively discuss its properties. Moreover, they really deny that the state vector should be updated when an observable is changed. But that's exactly what you should do. The state vector is a collection of complex numbers that describe the probabilistic knowledge about a physical system available to an observer and when the observer measures an observable, the state instantly changes because the state is his knowledge and the knowledge changes!
In the section of our paper on Schmidt decomposition, we write
A measurement of subsystem A which determines it to be in state ψ^(n)_A implies that the rest of the universe must be in state ψ^(n)_B. For example, A might consist of a few spins [9]; it is interesting, and perhaps unexpected, that a measurement of these spins places the rest of the universe into a particular state ψ^(n)_B. As we will see below, in the cosmological context these modes are spread throughout the universe, mostly beyond our horizon. Because we do not have access to these modes, they do not necessarily prevent us from detecting A in a superposition of two or more of the ψ^(n)_A. However, if we had sufficient access to B degrees of freedom (for example, if the relevant information differentiating between ψ^(n)_A states is readily accessible in our local environment or in our memory records), then the A system would decohere into one of the ψ^(n)_A.
This discussion makes it clear that ψ describes all possible branches of the wavefunction, including those that may have already decohered from each other: it describes not just the subjective experience of one observer, but of all possible observers. If we insist on removing decohered branches from the wavefunction (e.g., via collapse or von Neumann projection), then much of the entanglement we discuss in the paper is also excised. However, if we only remove branches that are inconsistent with the observations of a single, specific observer, most of it remains. Note that decoherence is a continuous and (in principle) reversible phenomenon, so (at least within a unitary framework) there is no point at which one can say two outcomes have entirely decohered; one can only cite the smallness of the overlap between the two branches, or the improbability of interference between them.
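The Schmidt-decomposition bookkeeping in the quoted passage is easy to reproduce numerically. A minimal sketch (my illustration, using a toy 2x2 split rather than anything cosmological): reshape the state vector on A⊗B into a dim_A × dim_B matrix and take an SVD; the squared singular values are the Schmidt coefficients.

```python
# Schmidt decomposition of a bipartite pure state via SVD.
import numpy as np

def schmidt_coefficients(psi, dim_a, dim_b):
    # Squared singular values of the reshaped state = Schmidt weights p_n.
    s = np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)
    return s**2

# Bell state (|00> + |11>)/sqrt(2): two equal Schmidt coefficients,
# i.e. one bit of entanglement between A and B.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
p = schmidt_coefficients(bell, 2, 2)
entropy = -np.sum(p * np.log2(p))
print(p, entropy)  # coefficients [0.5, 0.5], entropy = 1 bit
```

Measuring A to be in the n-th Schmidt state then places B in the partner state ψ^(n)_B, exactly as in the passage above.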

I don't think Lubos disagrees with the mathematical statements we make about the entanglement properties of ψ. He may claim that these entanglement properties are not subject to experimental test. At least in principle, one can test whether systems A and B, which are in two different horizon volumes at cosmological time t1, are entangled. We have to wait until some later time t2, when there has been enough time for classical communication between A and B, but otherwise the protocol for determining entanglement is the usual one.

If we leave aside cosmology and consider, for example, the atoms or photons in a box, the same formalism we employ shows that there is likely to be widespread entanglement among the particles. In principle, an experimentalist who is outside the box can test whether the state ψ describing the box is "typical" (i.e., highly entangled) by making very precise measurements.

See stackexchange for more discussion.
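The remark above that two branches never entirely decohere can be quantified in a toy model (my own sketch, not from the paper): if the environments of two branches differ by a small rotation of each of N qubits, the branch overlap, i.e. the amplitude available for interference, is cos(θ)^N. It decays continuously and never reaches exactly zero.

```python
# Overlap of two environment states differing by a small per-qubit
# rotation theta, as a function of environment size N.
import numpy as np

theta = 0.1                       # per-qubit "which-path" rotation angle
N = np.array([0, 500, 1000, 2000])
overlap = np.cos(theta) ** N      # product of N single-qubit overlaps
for n, o in zip(N, overlap):
    print(n, o)
```

Even a modest environment suppresses interference by many orders of magnitude, which is why branches behave as effectively classical alternatives despite the evolution being unitary throughout.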

Tuesday, May 08, 2012

Everything is Entangled

This paper will be available tomorrow at the link. 
Everything is Entangled 
Roman V. Buniy, Stephen D.H. Hsu 
We show that big bang cosmology implies a high degree of entanglement of particles in the universe. In fact, a typical particle is entangled with many particles far outside our horizon. However, the entanglement is spread nearly uniformly so that two randomly chosen particles are unlikely to be directly entangled with each other -- the reduced density matrix describing any pair is likely to be separable.
From the introduction:
Ergodicity and properties of typical pure states 
When two particles interact, their quantum states generally become entangled. Further interaction with other particles spreads the entanglement far and wide. Subsequent local manipulations of separated particles cannot, in the absence of quantum communication, undo the entanglement. We know from big bang cosmology that our universe was in thermal equilibrium at early times, and we believe, due to the uniformity of the cosmic microwave background, that regions which today are out of causal contact were once in equilibrium with each other. Below we show that these simple observations allow us to characterize many aspects of cosmological entanglement. 
We will utilize the properties of typical pure states in quantum mechanics. These are states which dominate the Hilbert measure. The ergodic theorem proved by von Neumann implies that under Schrodinger evolution most systems spend almost all their time in typical states. Indeed, systems in thermal equilibrium have nearly maximal entropy and hence must be typical. Typical states are maximally entangled (see below) and the approach to equilibrium can be thought of in terms of the spread of entanglement. ...
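The claim that typical states are (nearly) maximally entangled is easy to check numerically. A sketch (my addition, not from the paper): draw a Haar-random pure state on A⊗B and compute the entanglement entropy of the small factor A; by Page's estimate it falls just short of the maximum, S ≈ ln(dim_A) − dim_A/(2·dim_B).

```python
# Entanglement entropy of a small subsystem of a Haar-random pure state.
import numpy as np

rng = np.random.default_rng(0)

dim_a, dim_b = 4, 256
# Complex Gaussian vector, normalized: Haar-distributed pure state.
v = rng.normal(size=dim_a * dim_b) + 1j * rng.normal(size=dim_a * dim_b)
psi = (v / np.linalg.norm(v)).reshape(dim_a, dim_b)

rho_a = psi @ psi.conj().T            # reduced density matrix of A
evals = np.linalg.eigvalsh(rho_a)
evals = evals[evals > 1e-12]          # drop numerical zeros
S = -np.sum(evals * np.log(evals))
print(S, np.log(dim_a))  # S falls just short of the maximum ln(4)
```

This is the sense in which "typical states are maximally entangled": for dim_B ≫ dim_A the deficit from maximal entropy is tiny.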

Professor Buniy in action! (Working on this research.)

Tuesday, April 10, 2012

Bits, Branes, Black Holes

Greetings from Santa Barbara! I'm here at KITP for three weeks of Bits, Branes, Black Holes.

Theoretical physics in action: video and slides. (Opening session of the workshop: an overview of the black hole information problem. Highly recommended if you have the right background.)
