I’ve been working on my thesis (edit: this was my master’s thesis) over reading week, and I think I’ve finished my introduction to Landauer’s Principle. I ended up writing a fairly detailed derivation of the balance equation, and thus of Landauer’s bound, so I thought it might be useful to post it here.
Landauer's principle states that there is a minimal energetic cost for a state transformation ρi→ρf on a system S via the action of a thermal reservoir E at temperature (kBβ)⁻¹, where kB≈1.38×10⁻²³ joules per kelvin is Boltzmann's constant. In particular, if ΔSS is the change of entropy of the system S, and ΔQE is the change in energy of the reservoir E, then

ΔQE ≥ β⁻¹ΔSS.  (1)
This principle has generated interest since its inception in 1961; see Section 1 of [RW14] for a recent summary ([RW14]: D. Reeb and M. Wolf, An improved Landauer Principle with finite-size corrections, 2014 (v3)). First, the bound has allusions to practicality: perhaps the energy efficiency of our computers will be limited. For changing the state of a classical or quantum bit, however, the bound is at most
ΔQE ≥ kB⋅T log2 ≈ (9.6×10⁻²⁴ J/K)⋅T,
which is extremely small for reasonable temperatures T; yet, modern processors are within several orders of magnitude of this limit, as shown in Figure 1.
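To get a feel for the scale, here is the back-of-the-envelope computation of kB⋅T log2 at room temperature (T = 300 K is my illustrative choice):

```python
import math

# Landauer bound for changing one bit: ΔQ_E ≥ k_B · T · log 2.
kB = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value since 2019)
T = 300.0          # room temperature in kelvin, an illustrative choice

bound = kB * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {bound:.3e} J")  # ≈ 2.87e-21 J
```

For comparison, a modern transistor switching event dissipates on the order of attojoules (10⁻¹⁸ J), a few orders of magnitude above this figure.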
Figure 1: Energy cost of changing state for modern silicon transistors, compared to a theoretical minimum for classical bits encoded in electron charge at room temperature. Figure reproduced from Figure 1(a) of Energy dissipation and transport in nanoscale devices by Eric Pop (2010).
Moreover, in 1973 Bennett showed that any Turing machine program may be implemented in a reversible manner, so that ΔSS=0. Reversible computing is an area of considerable practical interest and continuing theoretical work.
More fundamentally, Landauer's bound is a direct relationship between energy and information (entropy). From now on, we will use natural units so that kB=ℏ=1.
In fact, Landauer's principle follows from the entropy balance equation

ΔSS + σ = βΔQE  (2)

where σ is the entropy production. We assume the system S is described by a finite dimensional Hilbert space HS, with self-adjoint Hamiltonian hS. The initial state of the system is given by a density matrix ρi (that is, a non-negative, trace-one operator on HS). Likewise, we assume the environment is described by a finite dimensional Hilbert space HE with self-adjoint Hamiltonian hE, and initial state
ξi = exp(−βhE) / tr(exp(−βhE))  (3)
the Gibbs state at temperature β⁻¹. (Gibbs states on E are invariant under the free dynamics generated by hE; in this finite dimensional context, they are the unique such states, and thus have the interpretation of thermal equilibrium states.) The system and environment start uncoupled, so the joint initial state is ρi⊗ξi. The evolution of the joint system is given by a unitary operator U∈B(HS⊗HE), leading to the final joint state Uρi⊗ξiU∗. We decouple the systems, yielding
ρf=trE(Uρi⊗ξiU∗),ξf=trS(Uρi⊗ξiU∗)
as the final states on the system and environment, respectively.
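This setup is easy to play with numerically. The following is a minimal sketch for a qubit system coupled to a qubit reservoir; the dimensions, β, hE, ρi, and the random unitary U are my illustrative choices, not from the derivation itself. The partial traces are taken by reshaping the joint density matrix and contracting with `einsum`:

```python
import numpy as np

rng = np.random.default_rng(0)
dS, dE = 2, 2      # qubit system, qubit environment (illustrative)
beta = 1.0
hE = np.diag([0.0, 1.0])                       # environment Hamiltonian

# Gibbs state (eq. 3): xi_i = exp(-beta * hE) / tr(exp(-beta * hE))
xi_i = np.diag(np.exp(-beta * np.diag(hE)))
xi_i /= np.trace(xi_i)

rho_i = np.diag([0.3, 0.7])                    # some initial system state

# A Haar-ish random joint unitary from the QR decomposition of a
# complex Gaussian matrix
M = rng.normal(size=(dS * dE, dS * dE)) + 1j * rng.normal(size=(dS * dE, dS * dE))
U, _ = np.linalg.qr(M)

joint = U @ np.kron(rho_i, xi_i) @ U.conj().T

# Partial traces: rho_f = tr_E(joint), xi_f = tr_S(joint).
# Reshape to indices (s, e, s', e'), then contract the matching pair.
J = joint.reshape(dS, dE, dS, dE)
rho_f = np.einsum('ikjk->ij', J)   # sum over e = e'
xi_f  = np.einsum('kikj->ij', J)   # sum over s = s'

# Both reduced states are unit trace, as they must be
print(np.isclose(np.trace(rho_f).real, 1.0),
      np.isclose(np.trace(xi_f).real, 1.0))  # True True
```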
We identify two quantities of interest during this process: ΔSS, the change of entropy of the system of interest, and ΔQE, the change of energy of the environment, defined (note the sign convention) as
ΔSS:=S(ρi)−S(ρf),ΔQE:=tr(hEξf)−tr(hEξi),
where S(ρ):=−tr(ρlogρ) is the von Neumann entropy. Recall that the relative entropy S(η∣ν)=tr(η(logη−logν)) of two faithful states η and ν satisfies S(η∣ν)≥0, with equality if and only if η=ν. See Sections 2.5–2.6 of Entropic Fluctuations in Quantum Statistical Mechanics. An Introduction by Jaksic et al. (2011) for a review of entropy functions in finite dimensional quantum mechanics.
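The non-negativity of the relative entropy (Klein's inequality) is easy to check numerically on randomly generated faithful states; this is a sketch with my own helper functions, computing the matrix logarithm through the Hermitian eigendecomposition:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_state(d):
    """A random faithful (full-rank) density matrix of dimension d."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T           # positive definite with probability 1
    return rho / np.trace(rho).real

def logm_herm(X):
    """Matrix logarithm of a positive definite Hermitian matrix."""
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.log(w)) @ V.conj().T

def rel_entropy(eta, nu):
    """S(eta | nu) = tr(eta (log eta - log nu))."""
    return np.trace(eta @ (logm_herm(eta) - logm_herm(nu))).real

eta, nu = random_state(2), random_state(2)
print(rel_entropy(eta, nu) >= 0)           # True: Klein's inequality
print(abs(rel_entropy(eta, eta)) < 1e-10)  # True: equality iff eta = nu
```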
With this function, we define the entropy production
σ:=S(Uρi⊗ξiU∗∣ρf⊗ξi).
By definition of relative entropy, the entropy production may be written
σ=tr(Uρi⊗ξiU∗log(Uρi⊗ξiU∗))−tr(Uρi⊗ξiU∗log(ρf⊗ξi)).
We then recognize the first term as an entropy, and expand the second term using the following claim.
If A and B are positive (and thus self-adjoint) on a finite dimensional Hilbert space, then
log(A⊗B)=log(A)⊗id+id⊗log(B).
If A,B have spectral decompositions A=∑iμiPi and B=∑jλjQj, then A⊗B=∑ij μiλj Pi⊗Qj. With this, log(A⊗B)=∑ij log(μiλj) Pi⊗Qj=∑ij (logμi+logλj) Pi⊗Qj=log(A)⊗id+id⊗log(B).
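The claim can also be verified numerically on random positive definite matrices; a quick sketch (the dimensions 2 and 3 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_pd(d):
    """A random symmetric positive definite d x d matrix."""
    M = rng.normal(size=(d, d))
    return M @ M.T + d * np.eye(d)

def logm_sym(X):
    """Matrix logarithm via the symmetric eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.log(w)) @ V.T

A, B = random_pd(2), random_pd(3)

# Claim: log(A ⊗ B) = log(A) ⊗ id + id ⊗ log(B)
lhs = logm_sym(np.kron(A, B))
rhs = np.kron(logm_sym(A), np.eye(3)) + np.kron(np.eye(2), logm_sym(B))
print(np.allclose(lhs, rhs))  # True
```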
Since entropy is invariant under a unitary transformation (as may be seen immediately from the spectral theorem: if ρ has spectral decomposition ρ=∑iμiPi, then S(ρ)=−∑iμilogμi=S(UρU∗), since eigenvalues are invariant under unitary transformations, i.e. changes of basis), we have S(Uρi⊗ξiU∗)=S(ρi⊗ξi). Furthermore, by definition of the partial trace,
tr(Uρi⊗ξiU∗(logρf⊗id)) = tr(trE(Uρi⊗ξiU∗)logρf),
which is simply tr(ρflogρf)=−S(ρf). Applying the same argument to the third term, −tr(Uρi⊗ξiU∗(id⊗logξi))=−tr(ξflogξi), we are left with
σ=−S(ρi⊗ξi)+S(ρf)−tr(ξflogξi).
But
S(ρi⊗ξi) = −tr(ρi⊗ξi log(ρi⊗ξi))
= −tr(ρi⊗ξi (logρi⊗id)) − tr(ρi⊗ξi (id⊗logξi))
= −tr((ρilogρi)⊗ξi) − tr(ρi⊗(ξilogξi))
= −tr(ρilogρi)tr(ξi) − tr(ξilogξi)tr(ρi)
= S(ρi) + S(ξi),
using that tr(ρi)=tr(ξi)=1. Then,
σ = −S(ρi) + S(ρf) − S(ξi) − tr(ξflogξi)
= −ΔSS − tr((ξf−ξi)logξi).
Additionally, using (eq. 3),
logξi = −βhE − log(tr(exp(−βhE))),
so
σ = −ΔSS + βtr((ξf−ξi)hE) + log(tr(exp(−βhE)))tr(ξf−ξi)
= −ΔSS + βΔQE,
using that tr(ξf−ξi)=1−1=0.
We thus have the balance equation (eq. 2). Landauer’s Principle (eq. 1) follows simply by noting that σ≥0, as it is a relative entropy.
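Putting the pieces together, we can check the balance equation end-to-end on a random process. This sketch repeats the qubit-qubit setup from before (again with my own illustrative choices of β, hE, ρi, and U), computes σ directly as a relative entropy, and compares it against −ΔSS + βΔQE:

```python
import numpy as np

rng = np.random.default_rng(3)

def logm_herm(X):
    """Matrix logarithm of a positive definite Hermitian matrix."""
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.log(w)) @ V.conj().T

def entropy(rho):
    """Von Neumann entropy S(rho) = -tr(rho log rho)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-15]
    return float(-(w * np.log(w)).sum())

beta = 0.7
hE = np.diag([0.0, 1.0])
xi_i = np.diag(np.exp(-beta * np.diag(hE)))
xi_i /= np.trace(xi_i)                         # Gibbs state (eq. 3)
rho_i = np.diag([0.25, 0.75])                  # initial system state

M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(M)                         # random joint unitary
joint = U @ np.kron(rho_i, xi_i) @ U.conj().T

J = joint.reshape(2, 2, 2, 2)
rho_f = np.einsum('ikjk->ij', J)               # tr_E
xi_f  = np.einsum('kikj->ij', J)               # tr_S

dS_S = entropy(rho_i) - entropy(rho_f)         # note the sign convention
dQ_E = np.trace(hE @ (xi_f - xi_i)).real

# sigma = S(U rho_i ⊗ xi_i U* | rho_f ⊗ xi_i), computed directly
target = np.kron(rho_f, xi_i)
sigma = np.trace(joint @ (logm_herm(joint) - logm_herm(target))).real

print(np.isclose(sigma, -dS_S + beta * dQ_E))  # True: the balance eq. (2)
print(sigma >= 0)                              # True: Landauer's bound (1)
```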
We may interpret (eq. 2) as a microscopic Clausius formulation of the Second Law of Thermodynamics [BHN+14] ([BHN+14]: F. Brandao et al, The second laws of quantum thermodynamics, 2013 (v4)). More specifically, we may interpret βΔQE = ∫ᵢᶠ dQE/T = ΔSEClausius as the Clausius entropy change of the environment. (Note: we are making an analogy to the Clausius formulation of the Second Law of Thermodynamics, not a formal relationship; the balance equation presented here and the Second Law belong to two different frameworks.)
Then, with a minus sign to account for our sign convention, we will interpret ΔSSClausius=−ΔSS, and the Second Law is
ΔSEClausius+ΔSSClausius=entropy production≥0.
In this language, then, σ serves as the entropy production, which gives it its name. The classical Second Law, however, is a statement about macroscopic quantities obtained from the behavior of ≳10²³ particles, while within the theory of quantum mechanics and our assumptions, the balance equation (eq. 2) is exact on a microscopic level.