Entropy is a function of the state of a thermodynamic system. It is a size-extensive[1] quantity, invariably denoted by S, with dimension energy divided by absolute temperature (SI unit: joule/K). Entropy has no analogous mechanical meaning, unlike volume, a similar size-extensive state parameter. Moreover, entropy cannot be measured directly; there is no such thing as an entropy meter, whereas state parameters such as volume and temperature are easily determined. Consequently entropy is one of the least understood concepts in physics.[2]
Entropy (as the extensive property mentioned above) has corresponding intensive (size-independent) properties for pure materials. A corresponding intensive property is specific entropy, which is entropy per mass of substance involved. Specific entropy is denoted by a lower case s, with dimension of energy per absolute temperature and mass [SI unit: joule/(K·kg)]. If a molecular mass or number of moles involved can be assigned, then another corresponding intensive property is molar entropy, which is entropy per mole of the compound involved, or alternatively specific entropy times molecular mass. There is no universally agreed upon symbol for molar properties, and molar entropy has been at times confusingly symbolized by S, as in extensive entropy. The dimensions of molar entropy are energy per absolute temperature and number of moles [SI unit: joule/(K·mole)].
The state variable "entropy" was introduced by Rudolf Clausius in 1865,[3] when he gave a mathematical formulation of the second law of thermodynamics.
The traditional way of introducing entropy is by means of a Carnot engine, an abstract engine conceived of by Sadi Carnot in 1824[4] as an idealization of a steam engine. Carnot's work foreshadowed the second law of thermodynamics. The "engineering" manner—by an engine—of introducing entropy will be discussed below. In this approach, entropy is the amount of heat (per degree kelvin) gained or lost by a thermodynamic system that makes a transition from one state to another. The second law states that the entropy of an isolated system increases in spontaneous (natural) processes leading from one state to another, whereas the first law states that the internal energy of the system is conserved.
In 1877 Ludwig Boltzmann[5] gave a definition of entropy in the context of the kinetic theory of gases, a branch of physics that developed into statistical thermodynamics. Boltzmann's definition of entropy was extended by John von Neumann[6] to a quantum statistical definition. The quantum statistical point of view, too, will be reviewed in the present article. In the statistical approach the entropy of an isolated (constant energy) system is S = kB logΩ, where kB is Boltzmann's constant and the function log stands for the natural (base e) logarithm. Ω is the number of different wave functions ("microstates") of the system belonging to the system's "macrostate" (thermodynamic state). The number Ω is the multiplicity of the macrostate; for an isolated system, where the macrostate is of definite energy, Ω is its degeneracy. For a system of about 10^23 particles, Ω is on the order of 10^(10^23), that is, the entropy is on the order of 10^23 × kB ≈ R, the molar gas constant.
Not satisfied with the engineering type of argument, the mathematician Constantin Carathéodory gave in 1909 a new axiomatic formulation of entropy and the second law of thermodynamics.[7] His theory was based on Pfaffian differential equations. His axiom replaced the earlier Kelvin-Planck formulation and the equivalent Clausius formulation of the second law and did not need Carnot engines. Carathéodory's work was taken up by Max Born,[8] and it is treated in a few monographs.[9][10][11] Since it requires more mathematical knowledge than the traditional approach based on Carnot engines, and since this mathematical knowledge is not needed by most students of thermodynamics, the traditional approach, which depends on some ingenious thought experiments, is still dominant in the majority of introductory works on thermodynamics.
The state (a point in state space) of a thermodynamic system is characterized by a number of variables, such as pressure p, temperature T, amount of substance n, volume V, etc. Any thermodynamic parameter can be seen as a function of an arbitrary independent set of other thermodynamic variables, hence the terms "property", "parameter", "variable" and "function" are used interchangeably. The number of independent thermodynamic variables of a system is equal to the number of energy contacts of the system with its surroundings.
An example of a reversible (quasi-static) energy contact is offered by the prototype thermodynamical system, a gas-filled cylinder with piston. Such a cylinder can perform work on its surroundings,

DW = p\,dV,
where dV stands for a small increment of the volume V of the cylinder, p is the pressure inside the cylinder, and DW stands for a small amount of work, not necessarily a differential of a function; such a differential is often referred to as inexact and indicated by a capital D instead of d.[11] Work by expansion is a form of energy contact between the cylinder and its surroundings. This process can be reversed: the volume of the cylinder is decreased, the gas is compressed, and the surroundings perform work DW = pdV < 0 on the cylinder.
When the inexact differential DW is divided by p, the quantity DW/p becomes obviously equal to the differential dV of the differentiable state function V. State functions depend only on the actual values of the thermodynamic parameters (they depend on a single point in state space; a state function is local in state space). A state function does not depend on the points on the path along which the state was reached (the history of the state). Mathematically this means that integration from point 1 to point 2 along path I in state space is equal to integration along a different path II,

\int_{1,\,\mathrm{I}}^{2} \frac{DW}{p} = \int_{1,\,\mathrm{II}}^{2} \frac{DW}{p} = V_2 - V_1.
The amount of work (divided by p) performed reversibly along path I is equal to the amount of work (divided by p) performed along path II. This condition is necessary and sufficient for DW/p to be the differential of a state function. So, although DW is not a differential, the quotient DW/p is one.
Reversible absorption of a small amount of heat DQ is another energy contact of a system with its surroundings; DQ is again not a differential of a certain function. In a completely analogous manner to DW/p, the following result can be shown for the heat DQ (divided by T) absorbed reversibly by the system along two different paths (along both paths the absorption is reversible):
\int_{1,\,\mathrm{I}}^{2} \frac{DQ}{T} = \int_{1,\,\mathrm{II}}^{2} \frac{DQ}{T}.    (1)
Hence the quantity dS defined by

dS \equiv \frac{DQ}{T}
is the differential of a state variable S, the entropy of the system. In the next subsection equation (1) will be proved from the Kelvin-Planck principle. Observe that this definition of entropy only fixes entropy differences:

S_2 - S_1 \equiv \int_1^2 \frac{DQ}{T}.
Note further that entropy has the dimension energy per degree temperature (joule per kelvin) and recalling the first law of thermodynamics (the differential dU of the internal energy satisfies dU = DQ − DW), it follows that

dU = T\,dS - p\,dV.
(For convenience only a single work term was considered here, namely DW = pdV, work done by the system.) The internal energy is an extensive quantity. The temperature T is an intensive property, independent of the size of the system. It follows that the entropy S is an extensive property. In that sense the entropy resembles the volume of the system. We reiterate that volume is a state function with a well-defined mechanical meaning, whereas entropy is introduced by analogy and is not easily visualized. Indeed, as is shown in the next subsection, it requires fairly elaborate reasoning to prove that S is a state function, i.e., that equation (1) holds.
Equation (1) gives the sufficient condition that the entropy S is a state function. The standard proof of equation (1), as given now, is physical, by means of an engine making Carnot cycles, and is based on the Kelvin-Planck formulation of the second law of thermodynamics.
Consider the figure. A system, consisting of an arbitrary closed system C (only heat goes in and out) and a reversible heat engine E, is coupled to a large heat reservoir R of constant temperature T0. The system C undergoes a cyclic state change 1-2-1. Since no work is performed on or by C, the first law gives DQ = dU_C for C, and since C traverses a cycle it follows that

\oint DQ = \oint dU_C = 0.
For the heat engine E it holds (by the definition of thermodynamic temperature) that

\frac{DQ_0}{T_0} = \frac{DQ}{T}.
Hence

Q_0 \equiv \oint DQ_0 = T_0 \oint \frac{DQ}{T}.
From the Kelvin-Planck principle it follows that W is necessarily less than or equal to zero, because there is only the single heat source R from which W is extracted. Invoking the first law of thermodynamics for the total cyclic process, we get

W = Q_0 = T_0 \oint \frac{DQ}{T} \le 0,
so that

\oint \frac{DQ}{T} \le 0.
Because the processes inside C and E are assumed reversible, all arrows can be reversed and in the very same way it is shown that

\oint \frac{DQ}{T} \ge 0,
so that equation (1) holds (with a slight change of notation, subscripts are transferred to the respective integral signs):

\oint \frac{DQ}{T} = 0 \quad\Longleftrightarrow\quad \int_{\mathrm{I},\,1}^{2} \frac{DQ}{T} = \int_{\mathrm{II},\,1}^{2} \frac{DQ}{T}.
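The path-independence just proved can be illustrated numerically (a sketch, not part of the proof): for one mole of a monatomic ideal gas, the first law and pV = RT give DQ = CV dT + (RT/V) dV along a reversible path, so around a closed rectangular cycle in the (T, V) plane the net heat ∮DQ is nonzero while ∮DQ/T vanishes.

```python
import math

R = 8.314      # molar gas constant, J/(mol K)
Cv = 1.5 * R   # heat capacity at constant volume, monatomic ideal gas

def leg_isochoric(T1, T2):
    """Constant volume: DQ = Cv dT, so DQ/T integrates to Cv ln(T2/T1)."""
    return Cv * (T2 - T1), Cv * math.log(T2 / T1)

def leg_isothermal(T, V1, V2):
    """Reversible isothermal expansion: DQ = p dV = (R T / V) dV."""
    return R * T * math.log(V2 / V1), R * math.log(V2 / V1)

def cycle(T1, T2, V1, V2):
    """Rectangular cycle (T1,V1) -> (T2,V1) -> (T2,V2) -> (T1,V2) -> (T1,V1)."""
    Q = dS = 0.0
    for q, ds in (leg_isochoric(T1, T2),
                  leg_isothermal(T2, V1, V2),
                  leg_isochoric(T2, T1),
                  leg_isothermal(T1, V2, V1)):
        Q += q
        dS += ds
    return Q, dS

Q, dS = cycle(300.0, 600.0, 1.0e-3, 2.0e-3)
print(Q)   # net heat absorbed over the cycle: nonzero (DQ is inexact)
print(dS)  # cyclic integral of DQ/T: zero up to rounding (dS is exact)
```

The temperatures and volumes are arbitrary illustrative values; any closed reversible cycle gives the same qualitative result.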
The definition of Gibbs free energy is based on entropy as follows:

G \equiv H - TS,
where all the thermodynamic properties except T are extensive and where

H = U + pV.
A corresponding equation with all intensive properties (i.e., per unit of mass) can be written as follows:

g = h - Ts,
where

h = u + pv.
The equation of state of one mole of an ideal gas is

pV = RT,    (E1)
where R is the molar gas constant, p the pressure, and V the volume of the gas. Note that the limit T → 0 implies V → 0: ideal-gas particles are of zero size.
The entropy of one mole of an ideal gas is a function of T and V and depends parametrically on the molar gas constant R and the molar heat capacity at constant volume, CV,

S(T, V) = S_0 + R\,\log\!\left(V\,T^{\,C_V/R}\right),
where S0 is a constant independent of T, V, and p. From statistical thermodynamics it is known that for an atomic ideal gas CV = 3R/2, so that the exponent of T becomes 3/2. For a diatomic ideal gas CV = 5R/2 and for an ideal gas of arbitrarily shaped molecules CV = 3R. In any case, for an ideal gas CV is constant, independent of T, V, or p.
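Since the constant S0 drops out of entropy differences, the formula can be checked with a small numerical sketch (the temperature, the volumes, and the monatomic value CV = 3R/2 are illustrative choices):

```python
import math

R = 8.314     # molar gas constant, J/(mol K)
Cv = 1.5 * R  # monatomic ideal gas

def S_minus_S0(T, V):
    """Molar ideal-gas entropy up to the additive constant S0:
    S - S0 = R ln(V T^(Cv/R)) = Cv ln T + R ln V."""
    return R * math.log(V * T ** (Cv / R))

# Reversible isothermal doubling of the volume at 298.15 K:
dS = S_minus_S0(298.15, 2.0) - S_minus_S0(298.15, 1.0)
print(dS)  # R ln 2, about 5.76 J/(mol K)

# The equivalent form Cv ln T + R ln V gives the same difference:
assert abs(dS - R * math.log(2.0)) < 1e-9
```

For an isothermal change only the volume term survives, so the entropy gain of doubling the volume is R ln 2 regardless of S0.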
The expression for the ideal gas entropy is derived easily by substituting the ideal gas law (E1) into the following general differential equation for the entropy as a function of T and V, valid for any thermodynamic system:

dS = \frac{C_V}{T}\,dT + \left(\frac{\partial p}{\partial T}\right)_V dV.    (E2)
Integration gives

S = C_V \log T + R \log V + S_0.
Write

C_V \log T = R \log T^{\,C_V/R},
and the result follows.
The proof of the differential equation (E2) follows by some typical classical thermodynamics calculus.
First, regard the internal energy as a function of T and V; its differential is

dU = \left(\frac{\partial U}{\partial T}\right)_V dT + \left(\frac{\partial U}{\partial V}\right)_T dV.

The definition of heat capacity and the first law (DQ = dU + pdV; for constant volume DQ = dU) give

C_V \equiv \left(\frac{DQ}{dT}\right)_V,

so that the heat capacity at constant volume is given by

C_V = \left(\frac{\partial U}{\partial T}\right)_V.

The first and second law combined (TdS = dU + pdV) gives

dS = \frac{C_V}{T}\,dT + \frac{1}{T}\left[\left(\frac{\partial U}{\partial V}\right)_T + p\right] dV.    (E3)

From

\left(\frac{\partial S}{\partial T}\right)_V = \frac{C_V}{T}

and

\left(\frac{\partial S}{\partial V}\right)_T = \frac{1}{T}\left[\left(\frac{\partial U}{\partial V}\right)_T + p\right]

and the equality of the mixed second derivatives,

\frac{\partial^2 S}{\partial V\,\partial T} = \frac{\partial^2 S}{\partial T\,\partial V},

follows

\left(\frac{\partial U}{\partial V}\right)_T = T\left(\frac{\partial p}{\partial T}\right)_V - p.

Substitute the very last equation into equation (E3), and the equation to be proved follows,

dS = \frac{C_V}{T}\,dT + \left(\frac{\partial p}{\partial T}\right)_V dV.    (E2)
In classical (phenomenological) thermodynamics it is not necessary to assume that matter consists of small particles (atoms or molecules). While this has the advantage of keeping classical thermodynamics transparent, not obscured by microscopic details, and universally valid, independent of the kind of molecules constituting the system, it has the disadvantage that it cannot predict the value of any parameters. For instance, the heat capacity of a monatomic ideal gas at constant volume CV is equal to 3R/2, where R is the molar gas constant. One needs a microscopic theory to find this simple result.
Before the 1920s the microscopic (molecular) theory of thermodynamics was based on classical (Newtonian) mechanics and on the kind of statistical arguments that were first introduced into physics by Maxwell and developed by Gibbs and Boltzmann. The branch of physics that tries to predict thermodynamic properties starting from molecular properties is known as statistical thermodynamics or statistical mechanics. Since the 1920s statistical thermodynamics has usually been based on quantum mechanics.
In this section it will be shown that the statistical mechanics expression for the entropy is

S = -k_B \operatorname{Tr}\big(\hat{\rho}\,\log\hat{\rho}\big),
where the density operator is given by

\hat{\rho} = \frac{e^{-\hat{H}/(k_B T)}}{\operatorname{Tr}\, e^{-\hat{H}/(k_B T)}}.
Further, kB is Boltzmann's constant, Ĥ is the quantum mechanical energy operator of the total system (the energies of all particles plus their interactions), and the trace (Tr) of an operator is the sum of its diagonal matrix elements.
It will also be shown under which circumstance the entropy may be given by Boltzmann's celebrated equation

S = k_B \log\Omega.
In his book,[6] John von Neumann introduced into quantum mechanics the density operator (called "statistical operator" by von Neumann) for a system of which the state is only partially known. He considered the situation that certain real numbers pm are known that correspond to a complete set of orthonormal quantum mechanical states |m⟩ (m = 0, 1, 2, …).[13] The quantity pm is the probability that state |m⟩ is occupied, or in other words, it is the percentage of systems in a (very large) ensemble of identical systems that are in the state |m⟩. As is usual for probabilities, they are normalized to unity,

\sum_{m=0}^{\infty} p_m = 1.
The averaged value of a property with quantum mechanical operator P̂, for a system described by the probabilities pm, is given by the ensemble average,

\langle\langle P \rangle\rangle \equiv \sum_{m=0}^{\infty} p_m\,\langle m|\hat{P}|m\rangle,
where ⟨m|P̂|m⟩ is the usual quantum mechanical expectation value of P̂ in the state |m⟩.
The expression for ⟨⟨P ⟩⟩ can be written as a trace of an operator product. First define the density operator:

\hat{\rho} \equiv \sum_{m=0}^{\infty} |m\rangle\, p_m\, \langle m|;
then it follows that

\langle\langle P \rangle\rangle = \operatorname{Tr}\big(\hat{\rho}\,\hat{P}\big).
Indeed,

\operatorname{Tr}\big(\hat{\rho}\,\hat{P}\big) = \sum_{n} \langle n|\hat{\rho}\,\hat{P}|n\rangle = \sum_{n,m} \langle n|m\rangle\, p_m\, \langle m|\hat{P}|n\rangle = \sum_{m} p_m\, \langle m|\hat{P}|m\rangle = \langle\langle P \rangle\rangle,
where ⟨ m | n ⟩ = δmn, the Kronecker delta.
A density operator has unit trace:

\operatorname{Tr}\hat{\rho} = \sum_{m=0}^{\infty} p_m = 1.
For a thermodynamic system of constant temperature (T), volume (V), and number of particles (N), one considers eigenstates of the energy operator Ĥ, the Hamiltonian of the total system,

\hat{H}\,|m\rangle = E_m\,|m\rangle.
Assume that pm is proportional to the Boltzmann factor, with the proportionality constant K determined by normalization,

p_m = K\, e^{-E_m/(k_B T)}, \qquad K \sum_{m=0}^{\infty} e^{-E_m/(k_B T)} = 1,
where kB is the Boltzmann constant. It is common to designate the partition function of the system of constant T, N, and V by Q,

Q \equiv \sum_{m=0}^{\infty} e^{-E_m/(k_B T)} = \frac{1}{K}.
Hence, using that

e^{-\hat{H}/(k_B T)}\,|m\rangle = e^{-E_m/(k_B T)}\,|m\rangle,

it is found that

\hat{\rho} = \sum_{m=0}^{\infty} |m\rangle\,\frac{e^{-E_m/(k_B T)}}{Q}\,\langle m| = \frac{e^{-\hat{H}/(k_B T)}}{Q},

where it is used that the set of states is complete and hence gives rise to the following resolution of the identity operator,

\sum_{m=0}^{\infty} |m\rangle\langle m| = \hat{1}.
In summary, the canonical ensemble[14] average of a property with quantum mechanical operator P̂ is given by

\langle\langle P \rangle\rangle = \operatorname{Tr}\big(\hat{\rho}\,\hat{P}\big) = \frac{\operatorname{Tr}\big(e^{-\hat{H}/(k_B T)}\,\hat{P}\big)}{Q}.
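For a toy Hamiltonian that is diagonal in the |m⟩ basis, this canonical recipe can be carried out explicitly. The following sketch (with a hypothetical two-level spectrum; the splitting and temperature are arbitrary choices) verifies that the density operator has unit trace and yields the ensemble-averaged energy:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def canonical(energies, T):
    """Probabilities p_m = exp(-E_m/(kB T))/Q and U = Tr(rho H) for a
    Hamiltonian that is diagonal with eigenvalues E_m."""
    boltz = [math.exp(-E / (kB * T)) for E in energies]
    Q = sum(boltz)                                 # partition function
    p = [b / Q for b in boltz]                     # Boltzmann probabilities
    U = sum(pm * E for pm, E in zip(p, energies))  # ensemble average of H
    return Q, p, U

# Hypothetical two-level system with a 1e-21 J splitting, at 300 K:
Q, p, U = canonical([0.0, 1.0e-21], 300.0)
print(sum(p))  # trace of the density operator: 1 up to rounding
print(U)       # average energy, strictly between the two levels
```

Because the Hamiltonian is diagonal here, every trace reduces to a sum over eigenvalues, which is exactly the sum-over-states form derived above.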
The quantum statistical expression for internal energy is

U \equiv \langle\langle E \rangle\rangle = \operatorname{Tr}\big(\hat{\rho}\,\hat{H}\big).
From

\frac{\partial Q}{\partial T} = \frac{1}{k_B T^2}\,\operatorname{Tr}\big(\hat{H}\, e^{-\hat{H}/(k_B T)}\big)

follows

\operatorname{Tr}\big(\hat{H}\, e^{-\hat{H}/(k_B T)}\big) = k_B T^2\,\frac{\partial Q}{\partial T}.
The quantum statistical expression for the internal energy U becomes

U = \frac{1}{Q}\,\operatorname{Tr}\big(\hat{H}\, e^{-\hat{H}/(k_B T)}\big) = k_B T^2\,\frac{\partial \log Q}{\partial T},
where it is used that a scalar may be taken out of the trace and that the density operator is of unit trace.
In classical thermodynamics the internal energy is related to the entropy S and the Helmholtz free energy A by

U = A + TS.
Define the entropy operator

\hat{S} \equiv -k_B \log\hat{\rho} = \frac{\hat{H}}{T} + k_B \log Q\;\hat{1},    (S1)

and accordingly

S \equiv \operatorname{Tr}\big(\hat{\rho}\,\hat{S}\big) = -k_B\operatorname{Tr}\big(\hat{\rho}\,\log\hat{\rho}\big),

and the Helmholtz free energy operator

\hat{A} \equiv -k_B T \log Q\;\hat{1}, \qquad A \equiv \operatorname{Tr}\big(\hat{\rho}\,\hat{A}\big) = -k_B T \log Q.

In summary,

A + TS = -k_B T \log Q + \operatorname{Tr}\big(\hat{\rho}\,\hat{H}\big) + k_B T \log Q = \operatorname{Tr}\big(\hat{\rho}\,\hat{H}\big),
which agrees with the quantum statistical expression for U, which in turn means that the definitions (S1) of the entropy operator and Helmholtz free energy operator are consistent.
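This consistency can be checked numerically for a hypothetical diagonal Hamiltonian (the three-level spectrum below is an arbitrary toy choice): computing S = −kB Tr(ρ̂ log ρ̂), A = −kB T log Q, and U = Tr(ρ̂ Ĥ) from the same Boltzmann probabilities reproduces U = A + TS to rounding error.

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0
energies = [0.0, 1.0e-21, 2.0e-21]  # hypothetical three-level spectrum, J

boltz = [math.exp(-E / (kB * T)) for E in energies]
Q = sum(boltz)                                   # partition function
p = [b / Q for b in boltz]                       # Boltzmann probabilities

U = sum(pm * E for pm, E in zip(p, energies))    # U = Tr(rho H)
S = -kB * sum(pm * math.log(pm) for pm in p)     # S = -kB Tr(rho log rho)
A = -kB * T * math.log(Q)                        # A = -kB T log Q

print(abs(U - (A + T * S)))  # U = A + T S holds up to rounding error
```

The identity holds exactly in the algebra; the printed residual is only floating-point noise, many orders of magnitude below U itself.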
Note that neither the entropy nor the free energy is given by an ordinary quantum mechanical operator; both depend on the temperature through the partition function Q. Furthermore, Q is defined as a trace:

Q = \operatorname{Tr}\, e^{-\hat{H}/(k_B T)},
and thus samples the whole (Hilbert) space containing the state vectors |m⟩. Almost all quantum mechanical operators that represent observable (physical) quantities have a classical (electromagnetic or mechanical) counterpart. Clearly the entropy operator lacks such a parallel definition, and this is probably the main reason why entropy is a concept that is difficult to comprehend.
Let us consider an isolated system (constant U, V, and N). Traces are taken only over states with energy U. Let there be Ω(U, V, N) of these states. This is in general a very large number; for instance, for one mole of a monatomic ideal gas consisting of N = NA ≈ 10^23 atoms (Avogadro's number) it holds that[15]

\Omega(U, V, N) \approx \left[\frac{V}{N}\left(\frac{4\pi m U}{3 N h^2}\right)^{3/2} e^{5/2}\right]^{N}.
Here m is the mass of an atom, h is Planck's constant, V is the volume of the vessel containing the gas, and e ≈ 2.7.
The sum in the partition function shrinks to a sum over the Ω states of energy U, hence

Q = \Omega\, e^{-U/(k_B T)}.
Likewise,

p_m = \frac{e^{-U/(k_B T)}}{Q} = \frac{1}{\Omega},
so that Boltzmann's celebrated equation follows[12]

S = -k_B \sum_{m=1}^{\Omega} \frac{1}{\Omega}\,\log\frac{1}{\Omega} = k_B \log\Omega.
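That the general statistical entropy collapses to Boltzmann's formula when all Ω accessible states are equally probable is easy to verify directly (a sketch; Ω = 1000 is an arbitrary toy value):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def statistical_entropy(p):
    """S = -kB sum_m p_m log p_m, for a diagonal density operator."""
    return -kB * sum(pm * math.log(pm) for pm in p)

Omega = 1000               # arbitrary toy number of degenerate states
p = [1.0 / Omega] * Omega  # every accessible state equally probable
print(statistical_entropy(p) / kB)  # equals log(Omega), about 6.91
```

With equal probabilities 1/Ω the sum reduces term by term to kB log Ω, exactly as in the derivation above.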
From the previous expression for Ω follows an expression for the entropy of a monatomic ideal gas as a function of T and V,

S = k_B \log\Omega = R\,\log\big(V\,T^{3/2}\big) + S_0, \qquad S_0 = R\left[\frac{3}{2}\log\frac{2\pi m k_B}{h^2} - \log N_A + \frac{5}{2}\right],

where the monatomic ideal-gas result U = (3/2) NA kB T has been substituted.
Recalling that NAkB ≡ R and CV = 3/2 R one sees that this is the formula encountered above [between Eqs. (E1) and (E2)], but this time with an explicit expression for S0.
Boltzmann's equation is derived as an average over an ensemble consisting of identical systems of constant energy, number of particles, and volume; such an ensemble is known as a microcanonical ensemble. However, it can be shown that energy fluctuations around the mean energy in a canonical ensemble (constant T) are extremely small, so that taking the trace over only the states of mean energy is a very good approximation. In other words, although Boltzmann's formula does not hold formally for a canonical ensemble, in practice it is a very good approximation, also for isothermal systems.
In common parlance the term entropy is used for lack of order and gradual decline into disorder. One can find in many introductory physics texts the statement that entropy is a measure for the degree of randomness in a system.
The origin of these statements is Boltzmann's 1877 equation S = kB logΩ that was discussed above. The third law of thermodynamics states that the entropy S goes to zero when T → 0; in statistical terms, the number of accessible states Ω then goes to unity. That is, if one interprets entropy as randomness, then at zero kelvin there is no disorder whatsoever: matter is in complete order. Clearly, this low-temperature limit supports the intuitive notion of entropy as a measure of chaos.
It was shown above that Ω gives the number of quantum states accessible to a system. It can be argued that the more quantum states are available to a system, the greater the complexity of the system. If one equates complexity with randomness, as is often done in this context, it confirms the notion of entropy as a measure of disorder. The second law of thermodynamics, which states that a spontaneous process in an isolated system strives toward maximum entropy, can be interpreted as the tendency of the universe to become more and more chaotic.
However, the view of entropy as disorder, as a measure of chaos, is disputed. For instance, Lambert[16] contends that entropy is a "measure for energy dispersal". If one reads "energy dispersal" as heat divided by temperature, this is true by the classical (phenomenological) definition of entropy. Lambert states that from a molecular point of view, entropy increases when more microstates become available to the system (i.e., Ω increases) and the energy is dispersed over the greater number of accessible microstates. This interpretation agrees with the discussion above. Lambert argues further that the view of entropy as disorder is "so misleading as actually to be a failure-prone crutch".
If one rejects completely the idea of entropy as randomness, one discards a convenient mnemonic device. Generations of physicists and chemists have remembered that a gas contains more entropy than a crystal "because a gas is more chaotic than a crystal". This is easier to remember than "because the gas has more microstates at its disposal and its energy is dispersed over this larger number of microstates", although the latter statement is the more correct one.
As just stated, the entropy of a mole of pure substance changes as follows:

S_{\mathrm{solid}} < S_{\mathrm{liquid}} < S_{\mathrm{gas}},
which agrees with our intuition that a gas is more chaotic than a liquid, which again is more chaotic than a solid.
As an illustration of this point, consider one mole of water (H2O) at a pressure of 1 bar (≈ 1 atmosphere). Experimentally, the enthalpy of fusion ΔHf is 6.01 kJ/mol and the enthalpy of vaporization ΔHv is 40.72 kJ/mol. Remember that enthalpy is heat added/extracted reversibly at constant pressure (in this case 1 bar) to achieve the change of aggregation state. Further, the change of aggregation state occurs at constant temperature, so that

\Delta S_f = \frac{\Delta H_f}{T_f}, \qquad \Delta S_v = \frac{\Delta H_v}{T_v}.
For water Tf = 0 °C = 273.15 K and Tv = 100 °C = 373.15 K. Hence

\Delta S_f = \frac{6010}{273.15} = 22.0\ \mathrm{J/(mol\,K)}, \qquad \Delta S_v = \frac{40720}{373.15} = 109.1\ \mathrm{J/(mol\,K)}.
Summarizing, in units of J/(mol K): a mole of liquid water contains 22.0 more entropy than a mole of ice (both at 0 °C), and a mole of steam at 100 °C contains 109.1 more entropy than a mole of liquid water at the boiling temperature.
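These two numbers follow directly from ΔS = ΔH/T with the experimental enthalpies quoted above; a minimal check:

```python
# Entropy changes of one mole of water from Delta S = Delta H / T,
# with the experimental enthalpies of fusion and vaporization at 1 bar.
dH_fus = 6010.0   # J/mol, enthalpy of fusion
dH_vap = 40720.0  # J/mol, enthalpy of vaporization
T_fus = 273.15    # K, melting point
T_vap = 373.15    # K, boiling point

dS_fus = dH_fus / T_fus
dS_vap = dH_vap / T_vap
print(round(dS_fus, 1))  # 22.0 J/(mol K)
print(round(dS_vap, 1))  # 109.1 J/(mol K)
```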