Saturday, May 09, 2020

Pure State Quantum Thermalization: from von Neumann to the Lab


Perhaps the most fundamental question in thermodynamics and statistical mechanics is: Why do systems tend to evolve toward thermal equilibrium? Equivalently, why does entropy tend to increase? Because Nature is quantum mechanical, a satisfactory answer has to arise within quantum mechanics itself. The answer was already given in a 1929 paper by von Neumann. However, the ideas were not absorbed (indeed, they were misunderstood) by the physics community, and were only rediscovered in the 21st century! General awareness of these results is still rather limited.

See this 2011 post: Classics on the arxiv: von Neumann and the foundations of quantum statistical mechanics.

In modern language, the result can be stated roughly as follows: "typical" quantum pure states are highly entangled, and the density matrix describing any small subsystem (obtained by tracing over the rest of the pure state) is very close to microcanonical (i.e., thermal). Under dynamical (Schrödinger) evolution, all systems, even those that are initially far from typical, spend nearly all of their time in a typical state (modulo some weak conditions on the Hamiltonian). Typicality of states is related to concentration of measure in high-dimensional Hilbert spaces. One could even claim that the origin of thermodynamics lies in the geometry of Hilbert space itself.
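To see this concretely, here is a minimal numerical sketch (a toy model of my own, not taken from vN's paper or the experiments below; for simplicity there is no energy constraint, so the relevant ensemble is the infinite-temperature one): draw a Haar-random pure state on a bipartite Hilbert space H_A ⊗ H_B and trace out B. As dim H_B grows, the reduced density matrix ρ_A rapidly approaches the maximally mixed state.

```python
# Toy sketch of canonical typicality: Haar-random pure states on H_A (x) H_B,
# trace out B, and check that rho_A approaches the maximally mixed state I/d_A.
import numpy as np

rng = np.random.default_rng(0)

def random_pure_state(dim):
    """Haar-random pure state: complex Gaussian vector, normalized."""
    psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return psi / np.linalg.norm(psi)

def reduced_density_matrix(psi, d_A, d_B):
    """rho_A = Tr_B |psi><psi| for psi living on H_A (x) H_B."""
    m = psi.reshape(d_A, d_B)          # amplitudes psi_{ab}
    return m @ m.conj().T              # (rho_A)_{aa'} = sum_b psi_{ab} psi*_{a'b}

d_A = 4
for d_B in [4, 16, 64, 256, 1024]:
    psi = random_pure_state(d_A * d_B)
    rho_A = reduced_density_matrix(psi, d_A, d_B)
    # Trace distance to the maximally mixed state I/d_A.
    eigs = np.linalg.eigvalsh(rho_A - np.eye(d_A) / d_A)
    dist = 0.5 * np.sum(np.abs(eigs))
    print(f"d_B = {d_B:5d}   trace distance to I/d_A = {dist:.4f}")
```

The printed distance falls off roughly like sqrt(d_A/d_B): this is the concentration of measure referred to above.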

[ It's worth noting that vN's paper does more than just demonstrate these results. It also gives an explicit construction of macroscopic classical (commuting) observables arising in a large Hilbert space. This construction would be a nice thing to include in textbooks for students trying to connect the classical and quantum worlds. ]

Recently I came across an experimental realization of these theoretical results, using cold atoms in an optical lattice (Greiner lab at Harvard):
Quantum thermalization through entanglement in an isolated many-body system

Science 353, 794-800 (2016)    arXiv:1603.04409v3

The concept of entropy is fundamental to thermalization, yet appears at odds with basic principles in quantum mechanics. Statistical mechanics relies on the maximization of entropy for a system at thermal equilibrium. However, an isolated many-body system initialized in a pure state will remain pure during Schrödinger evolution, and in this sense has static, zero entropy. The underlying role of quantum mechanics in many-body physics is then seemingly antithetical to the success of statistical mechanics in a large variety of systems. Here we experimentally study the emergence of statistical mechanics in a quantum state, and observe the fundamental role of quantum entanglement in facilitating this emergence. We perform microscopy on an evolving quantum system, and we see thermalization occur on a local scale, while we measure that the full quantum state remains pure. We directly measure entanglement entropy and observe how it assumes the role of the thermal entropy in thermalization. Although the full state remains measurably pure, entanglement creates local entropy that validates the use of statistical physics for local observables. In combination with number-resolved, single-site imaging, we demonstrate how our measurements of a pure quantum state agree with the Eigenstate Thermalization Hypothesis and thermal ensembles in the presence of a near-volume law in the entanglement entropy.
Note: given the original vN results, I think the Eigenstate Thermalization Hypothesis is of only limited interest. [ But see comments for more discussion... ] The point is that this is a laboratory demonstration of pure state thermalization, anticipated in 1929 by vN.
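The experiment accesses entropy through the second Rényi entropy S_2 = −ln Tr(ρ_A²), measured by interfering two identical copies of the many-body state. A self-contained toy illustration (two qubits, my own example, not the experimental system) of the abstract's central point: the global state remains exactly pure while a subsystem carries maximal entanglement entropy.

```python
import numpy as np

def renyi_2_entropy(rho):
    """S_2 = -ln Tr(rho^2): zero for a pure state, ln(d) when maximally mixed."""
    return float(-np.log(np.real(np.trace(rho @ rho))))

# Two qubits in a Bell state: globally pure, but each qubit is maximally mixed.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_full = np.outer(bell, bell.conj())
rho_A = bell.reshape(2, 2) @ bell.reshape(2, 2).conj().T   # Tr_B |psi><psi|

print("global purity Tr(rho^2):", np.real(np.trace(rho_full @ rho_full)))  # 1.0 -> pure
print("local S_2 of one qubit :", renyi_2_entropy(rho_A))                  # ln 2
```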

Another aspect of quantum thermalization that is still not well appreciated is that the approach to equilibrium can have a very different character from what students are taught in statistical mechanics. The physical picture behind the Boltzmann equation is semiclassical: collisions between atoms happen serially as two gases equilibrate. But Schrödinger evolution of the pure state (all of the degrees of freedom together) toward typicality can take advantage of quantum parallelism: all possible collisions take place on different parts of the quantum superposition state. Consequently, the timescale for quantum thermalization can be much shorter than in the semiclassical Boltzmann description.
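A small exact-diagonalization sketch shows this entanglement growth directly (again a toy model of my own, a nonintegrable Ising chain, not the gases or heavy ion systems discussed here): starting from a pure product state, the half-chain entanglement entropy S_A(t) rises rapidly toward its thermal value, while the global state remains exactly pure throughout.

```python
# Toy model: pure-state thermalization in a chaotic (tilted-field) Ising chain.
import numpy as np
from functools import reduce

L = 8                                   # number of spins (Hilbert dim 2^L = 256)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_at(op, site):
    """Embed a single-site operator at `site` in the L-spin Hilbert space."""
    factors = [I2] * L
    factors[site] = op
    return reduce(np.kron, factors)

# Tilted-field Ising chain: nonintegrable (chaotic) for generic h_x, h_z.
J, hx, hz = 1.0, 0.9, 0.5
H = sum(J * op_at(sz, i) @ op_at(sz, i + 1) for i in range(L - 1))
H += sum(hx * op_at(sx, i) + hz * op_at(sz, i) for i in range(L))

# Initial state: Neel product state |0101...>, far from typical, zero entropy.
psi0 = np.zeros(2 ** L, dtype=complex)
psi0[int("01" * (L // 2), 2)] = 1.0

evals, evecs = np.linalg.eigh(H)        # exact diagonalization (fine for 2^8)

def entanglement_entropy(psi, n_A):
    """von Neumann entropy of the first n_A spins of pure state psi."""
    s = np.linalg.svd(psi.reshape(2 ** n_A, -1), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]                    # drop numerical zeros
    return float(-np.sum(p * np.log(p)))

for t in [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]:
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
    print(f"t = {t:4.1f}   half-chain S_A = {entanglement_entropy(psi_t, L // 2):.3f}")
```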

In 2015, my postdoc C.M. Ho (now director of an AI lab in Silicon Valley) and I pointed out that quantum thermalization had likely already been realized in heavy ion collisions at RHIC and CERN, and that the quantum nature of the process was responsible for the surprisingly short time required to approach equilibrium (equivalently, to generate large amounts of entanglement entropy).

Entanglement and fast thermalization in heavy ion collisions (see also slides here).


Entanglement and Fast Quantum Thermalization in Heavy Ion Collisions (arXiv:1506.03696)

Chiu Man Ho, Stephen D. H. Hsu

Let A be a subsystem of a larger system A∪B, and ψ be a typical state from the subspace of the Hilbert space H_AB satisfying an energy constraint. Then ρ_A(ψ)=Tr_B |ψ⟩⟨ψ| is nearly thermal. We discuss how this observation is related to fast thermalization of the central region (≈A) in heavy ion collisions, where B represents other degrees of freedom (soft modes, hard jets, collinear particles) outside of A. Entanglement between the modes in A and B plays a central role; the entanglement entropy S_A increases rapidly in the collision. In gauge-gravity duality, S_A is related to the area of extremal surfaces in the bulk, which can be studied using gravitational duals.
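Schematically (up to order-one factors), the typicality statement behind this abstract can be written in the form given by Popescu, Short and Winter (2006): for |ψ⟩ drawn at random from a d_R-dimensional constrained subspace H_R ⊂ H_A ⊗ H_B (e.g., the states satisfying the energy constraint),

\[
\big\langle\, \| \rho_A(\psi) - \Omega_A \|_1 \,\big\rangle_{\psi} \;\lesssim\; \sqrt{\frac{d_A}{d_B^{\mathrm{eff}}}},
\qquad
\Omega_A \equiv \mathrm{Tr}_B\!\left(\frac{P_R}{d_R}\right),
\qquad
d_B^{\mathrm{eff}} \;\ge\; \frac{d_R}{d_A},
\]

where P_R projects onto H_R. When d_R is exponentially large in the number of degrees of freedom, ρ_A is exponentially close to the reduced microcanonical state Ω_A, on average and (by Lévy's lemma) with overwhelming probability.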



An earlier blog post, Ulam on physical intuition and visualization, discussed the difference between intuition for familiar semiclassical (incoherent) particle phenomena and intuition for intrinsically quantum mechanical (coherent) phenomena, such as the spread of entanglement and its relation to thermalization.
[Ulam:] ... Most of the physics at Los Alamos could be reduced to the study of assemblies of particles interacting with each other, hitting each other, scattering, sometimes giving rise to new particles. Strangely enough, the actual working problems did not involve much of the mathematical apparatus of quantum theory although it lay at the base of the phenomena, but rather dynamics of a more classical kind—kinematics, statistical mechanics, large-scale motion problems, hydrodynamics, behavior of radiation, and the like. In fact, compared to quantum theory the project work was like applied mathematics as compared with abstract mathematics. If one is good at solving differential equations or using asymptotic series, one need not necessarily know the foundations of function space language. It is needed for a more fundamental understanding, of course. In the same way, quantum theory is necessary in many instances to explain the data and to explain the values of cross sections. But it was not crucial, once one understood the ideas and then the facts of events involving neutrons reacting with other nuclei.
This "dynamics of a more classical kind" did not require intuition for entanglement or high dimensional Hilbert spaces. But see von Neumann and the foundations of quantum statistical mechanics for examples of the latter.
