In thermodynamics, entropy is a measure of
the number of specific ways in which a thermodynamic
system may be arranged, commonly understood
as a measure of disorder. According to the
second law of thermodynamics the entropy of
an isolated system never decreases; such systems
spontaneously evolve towards thermodynamic
equilibrium, the configuration with maximum
entropy. Systems which are not isolated may
decrease in entropy. Since entropy is a state
function, the change in the entropy of a system
is the same for any process going from a given
initial state to a given final state, whether
the process is reversible or irreversible.
However irreversible processes increase the
combined entropy of the system and its environment.
The change in entropy was originally defined
for a thermodynamically reversible process as
ΔS = ∫ δQrev/T,
where the incremental reversible transfer of heat
into a closed system is divided by the uniform
thermodynamic temperature of that system. The above definition is sometimes
called the macroscopic definition of entropy
because it can be used without regard to any
microscopic picture of the contents of a system.
In thermodynamics, entropy has been found
to be more generally useful and it has several
other formulations. Entropy was discovered
when it was noticed to be a quantity that
behaves as a function of state, as a consequence
of the second law of thermodynamics. Entropy
is an extensive property, but the entropy
of a pure substance is usually given as an
intensive property — either specific entropy
or molar entropy.
The absolute entropy was defined later, using
either statistical mechanics or the third
law of thermodynamics.
In the modern microscopic interpretation of
entropy in statistical mechanics, entropy
is the amount of additional information needed
to specify the exact physical state of a system,
given its thermodynamic specification. Understanding
the role of thermodynamic entropy in various
processes requires understanding how and why
that information changes as the system evolves
from its initial condition. It is often said
that entropy is an expression of the disorder,
or randomness of a system, or of our lack
of information about it. The second law is
now often seen as an expression of the fundamental
postulate of statistical mechanics via the
modern definition of entropy. Entropy has
the dimension of energy divided by temperature,
which has a unit of joules per kelvin in the
International System of Units.
History
The analysis which led to the concept of entropy
began with the work of French mathematician
Lazare Carnot who in his 1803 paper Fundamental
Principles of Equilibrium and Movement proposed
that in any machine the accelerations and
shocks of the moving parts represent losses
of moment of activity. In other words, in
any natural process there exists an inherent
tendency towards the dissipation of useful
energy. Building on this work, in 1824 Lazare's
son Sadi Carnot published Reflections on the
Motive Power of Fire which posited that in
all heat-engines whenever "caloric", or what
is now known as heat, falls through a temperature
difference, work or motive power can be produced
from the actions of the "fall of caloric"
between a hot and cold body. He made the analogy
with that of how water falls in a water wheel.
This was an early insight into the second
law of thermodynamics. Carnot based his views
of heat partially on the early 18th century
"Newtonian hypothesis" that both heat and
light were types of indestructible forms of
matter, which are attracted and repelled by
other matter, and partially on the contemporary
views of Count Rumford who showed that heat
could be created by friction as when cannon
bores are machined. Carnot reasoned that if
the body of the working substance, such as
a body of steam, is returned to its original
state at the end of a complete engine cycle,
that "no change occurs in the condition of
the working body".
The first law of thermodynamics, formalized
based on the heat-friction experiments of
James Joule in 1843, deals with the concept
of energy, which is conserved in all processes;
the first law, however, is unable to quantify
the effects of friction and dissipation.
In the 1850s and 1860s, German physicist Rudolf
Clausius objected to the supposition that
no change occurs in the working body, and
gave this "change" a mathematical interpretation
by questioning the nature of the inherent
loss of usable heat when work is done, e.g.
heat produced by friction. Clausius described
entropy as the transformation-content, i.e.
dissipative energy use, of a thermodynamic
system or working body of chemical species
during a change of state. This was in contrast
to earlier views, based on the theories of
Isaac Newton, that heat was an indestructible
particle that had mass.
Later, scientists such as Ludwig Boltzmann,
Josiah Willard Gibbs, and James Clerk Maxwell
gave entropy a statistical basis. In 1877
Boltzmann visualized a probabilistic way to
measure the entropy of an ensemble of ideal
gas particles, in which he defined entropy
to be proportional to the logarithm of the
number of microstates such a gas could occupy.
Henceforth, the essential problem in statistical
thermodynamics has been, according to Erwin Schrödinger,
to determine the distribution of a given amount
of energy E over N identical
systems. Carathéodory linked entropy with
a mathematical definition of irreversibility,
in terms of trajectories and integrability.
Definitions and descriptions
There are two related definitions of entropy:
the thermodynamic definition and the statistical
mechanics definition. Historically, the classical
thermodynamics definition developed first,
and it has more recently been extended in
the area of non-equilibrium thermodynamics.
Entropy was defined from a classical thermodynamics
viewpoint, in which the details of the system's
constituents are not directly considered,
with their behavior only showing up in macroscopically
averaged properties, e.g. heat capacity. Later,
thermodynamic entropy was more generally defined
from a statistical thermodynamics viewpoint,
in which the detailed constituents — modeled
at first classically, e.g. Newtonian particles
constituting a gas, and later quantum-mechanically
— were explicitly considered.
Function of state
There are many thermodynamic properties that
are functions of state. This means that at
a particular thermodynamic state, these properties
have a certain value. Often, if two properties
of the system are specified, then the state is
determined and the other properties' values
are fixed. For instance, an ideal gas, at a
particular temperature and pressure, has a
particular volume according to the ideal gas
equation. As another instance, a pure substance
of single phase at a particular uniform temperature
and pressure is at not only a particular volume
but also at a particular entropy. That entropy
is a function of state is one reason it is
useful. In the Carnot cycle, the working fluid
returns to the same state at a particular
stage of the cycle, hence the line integral
of any state function, such as entropy, over
the cycle is zero.
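To make the closed-cycle property concrete, the following sketch (my own illustration, not part of the article; a monatomic ideal gas with arbitrary temperatures and volumes) steps around a reversible Carnot cycle using the standard ideal-gas entropy formulas and sums the entropy changes of the four legs:

```python
import math

# Illustrative ideal-gas Carnot cycle (monatomic gas, gamma = 5/3).
R = 8.314                       # J/(mol*K), ideal gas constant
n = 1.0                         # mol
gamma = 5.0 / 3.0
T_hot, T_cold = 500.0, 300.0    # K, illustrative reservoir temperatures
V1, V2 = 1.0e-3, 2.0e-3         # m^3, start/end of the isothermal expansion

# Volumes after the reversible adiabats, from T * V**(gamma - 1) = const.
V3 = V2 * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))
V4 = V1 * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))

dS_legs = [
    n * R * math.log(V2 / V1),   # isothermal expansion at T_hot
    0.0,                         # reversible adiabatic expansion (no heat)
    n * R * math.log(V4 / V3),   # isothermal compression at T_cold
    0.0,                         # reversible adiabatic compression
]

print("entropy change per leg:", dS_legs)
print(f"line integral of dS over the cycle: {sum(dS_legs):.3e} J/K")  # ~0
```

The sum vanishes (up to rounding) because entropy, as a state function, returns to its starting value whenever the system returns to its starting state.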
Reversible process
Entropy is defined for a reversible process
and for a system that, at all times, can be
treated as being at a uniform state and thus
at a uniform temperature. Reversibility is
an ideal that some real processes approximate
and that is often presented in study exercises.
For a reversible process, entropy behaves
as a conserved quantity and no change occurs
in total entropy. More specifically, total
entropy is conserved in a reversible process
and not conserved in an irreversible process.
One has to be careful about system boundaries.
For example, in the Carnot cycle, while the
heat flow from the hot reservoir to the cold
reservoir represents an increase in entropy,
the work output, if reversibly and perfectly
stored in some energy storage mechanism, represents
a decrease in entropy that could be used to
operate the heat engine in reverse and return
to the previous state, thus the total entropy
change is still zero at all times if the entire
process is reversible. Any process that does
not meet the requirements of a reversible
process must be treated as an irreversible
process, which is usually a complex task.
An irreversible process increases entropy.
Heat transfer situations require two or more
non-isolated systems in thermal contact. In
irreversible heat transfer, heat energy is
irreversibly transferred from the higher temperature
system to the lower temperature system, and
the combined entropy of the systems increases.
Each system, by definition, must have its
own absolute temperature applicable within
all areas in each respective system in order
to calculate the entropy transfer. Thus, when
a system at higher temperature TH transfers
heat dQ to a system of lower temperature TC,
the former loses entropy dQ/TH and the latter
gains entropy dQ/TC. The combined entropy
change is dQ/TC − dQ/TH, which is positive
because TH > TC, reflecting
an increase in the combined entropy. When
calculating entropy, the same requirement
of having an absolute temperature for each
system in thermal contact exchanging heat
also applies to the entropy change of an isolated
system having no thermal contact.
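As a numerical illustration of this bookkeeping (a sketch with arbitrary values, not from the article):

```python
T_hot, T_cold = 400.0, 300.0   # K, assumed uniform temperatures of the two systems
dQ = 100.0                     # J of heat passing irreversibly from hot to cold

dS_hot = -dQ / T_hot           # entropy lost by the hotter system
dS_cold = dQ / T_cold          # entropy gained by the colder system
print(dS_hot + dS_cold)        # dQ/T_cold - dQ/T_hot > 0 whenever T_hot > T_cold
```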
Carnot cycle
The concept of entropy arose from Rudolf Clausius's
study of the Carnot cycle. In a Carnot cycle,
heat QH is absorbed isothermally from a 'hot' reservoir
at the higher temperature TH, and heat QC is given up isothermally
to a 'cold' reservoir at a lower temperature
TC. According to Carnot's principle, work can
only be done when there is a temperature difference,
and the work should be some function of the
difference in temperature and the heat absorbed.
Carnot did not distinguish between QH and QC, since
he was working under the incorrect hypothesis
that caloric theory was valid, and hence heat
was conserved when, in fact, QH is greater than QC. Through the
efforts of Clausius and Kelvin, it is now
known that the maximum work that can be done
is the product of the Carnot efficiency and
the heat absorbed at the hot reservoir:
W = (1 − TC/TH) QH     (1)
In order to derive the Carnot efficiency, 1 − TC/TH, Kelvin
had to evaluate the ratio of the work done
to the heat absorbed in the isothermal expansion
with the help of the Carnot-Clapeyron equation
which contained an unknown function, known
as the Carnot function. The fact that the
Carnot function could be the temperature,
measured from zero, was suggested by Joule
in a letter to Kelvin, and this allowed Kelvin
to establish his absolute temperature scale.
It is also known that the work is the difference
between the heat absorbed at the hot reservoir
and the heat rejected at the cold one:
W = QH − QC     (2)
Since the latter
is valid over the entire cycle, this gave
Clausius the hint that at each stage of the
cycle, work and heat would not be equal, but
rather their difference would be a state function
that would vanish upon completion of the cycle.
The state function was called the internal
energy and it became the first law of thermodynamics.
Now equating (1) and (2) gives
QH/TH − QC/TC = 0.
If we allow Q to incorporate the algebraic sign,
this becomes a sum and implies that there
is a function of state which is conserved
over a complete cycle. Clausius called this
state function entropy. One can see that entropy
was discovered through mathematics rather
than through laboratory results. It is a mathematical
construct and has no easy physical analogy.
This makes the concept somewhat obscure or
abstract, akin to how the concept of energy
arose.
Then Clausius asked what would happen if there
were less work done than that predicted
by Carnot's principle. The right-hand side
of the first equation would be the upper bound
of the work, which would now be converted
into an inequality:
W < (1 − TC/TH) QH.
When the second equation
is used to express the work as a difference
in heats, we get
QH − QC < (1 − TC/TH) QH,
or
QC > (TC/TH) QH.
So more heat is given
off to the cold reservoir than in the Carnot
cycle. If we denote the entropies by Si = Qi/Ti for the
two states, then the above inequality can
be written as a decrease in the entropy, SH − SC < 0:
the entropy that leaves with the rejected heat exceeds
the entropy that entered with the absorbed heat. The
wasted heat implies that irreversible processes
must have prevented the cycle from carrying
out maximum work.
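The following sketch (illustrative numbers only, my own) checks both cases: a reversible engine delivering the full Carnot work satisfies QH/TH = QC/TC, while an engine delivering less work rejects extra heat, so QC/TC exceeds QH/TH:

```python
T_hot, T_cold = 600.0, 300.0      # K, reservoir temperatures (assumed)
Q_hot = 1000.0                    # J absorbed from the hot reservoir (assumed)

W_carnot = (1.0 - T_cold / T_hot) * Q_hot     # equation (1): maximum work
for W in (W_carnot, 0.8 * W_carnot):          # reversible vs. less-than-Carnot work
    Q_cold = Q_hot - W                        # equation (2): rejected heat
    print(f"W = {W:6.1f} J   QH/TH = {Q_hot/T_hot:.4f}   QC/TC = {Q_cold/T_cold:.4f}")
# Reversible case: the two ratios are equal.  Sub-Carnot case: QC/TC > QH/TH.
```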
Classical thermodynamics
The thermodynamic definition was developed
in the early 1850s by Rudolf Clausius and
essentially describes how to measure the entropy
of an isolated system in thermodynamic equilibrium.
Clausius created the term entropy in 1865
as an extensive thermodynamic variable that was
shown to be useful in characterizing the Carnot
cycle. Heat transfer along the isotherm steps
of the Carnot cycle was found to be proportional
to the temperature of a system. This relationship
was expressed in increments of entropy equal
to the ratio of incremental heat transfer
divided by temperature, which was found to
vary in the thermodynamic cycle but eventually
return to the same value at the end of every
cycle. Thus it was found to be a function
of state, specifically a thermodynamic state
of the system. Clausius wrote that he "intentionally
formed the word Entropy as similar as possible
to the word Energy", basing the term on the
Greek ἡ τροπή tropē, "transformation".
While Clausius based his definition on a reversible
process, there are also irreversible processes
that change entropy. Following the second
law of thermodynamics, entropy of an isolated
system always increases. The difference between
an isolated system and closed system is that
heat may not flow to and from an isolated
system, but heat flow to and from a closed
system is possible. Nevertheless, for both
closed and isolated systems, and indeed, also
in open systems, irreversible thermodynamics
processes may occur.
According to the Clausius equality, for a
reversible cyclic process:
∮ δQrev/T = 0.
This means the line integral ∫ δQrev/T is path independent.
So we can define a state function S called
entropy, which satisfies:
dS = δQrev/T
With this we can only obtain the difference
of entropy by integrating the above formula.
To obtain the absolute value, we need the
Third Law of Thermodynamics, which states
that S=0 at absolute zero for perfect crystals.
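As a rough numerical sketch of how an absolute entropy is built up from the third law (my own illustration using a toy Debye-like heat capacity; real tabulations integrate measured Cp(T) data), one integrates Cp/T upward from near absolute zero:

```python
# Toy heat-capacity model (illustrative only): Debye-like a*T^3 below T_D,
# then a constant value; real substances require measured Cp(T) data.
T_D, Cp_const = 20.0, 25.0              # K and J/(mol*K), assumed values
a = Cp_const / T_D**3                   # makes the two pieces match at T_D

def cp(T):
    return a * T**3 if T < T_D else Cp_const

# S(298.15 K) = integral from ~0 K of Cp(T)/T dT, taking S(0) = 0 (third law).
T_lo, T_hi, steps = 1e-3, 298.15, 200_000
dT = (T_hi - T_lo) / steps
S = sum(cp(T_lo + (i + 0.5) * dT) / (T_lo + (i + 0.5) * dT) * dT for i in range(steps))
print(f"absolute molar entropy of the toy solid at 298 K: {S:.1f} J/(mol*K)")
```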
From a macroscopic perspective, in classical
thermodynamics the entropy is interpreted
as a state function of a thermodynamic system:
that is, a property depending only on the
current state of the system, independent of
how that state came to be achieved. In any
process where the system gives up energy ΔE,
and its entropy falls by ΔS, a quantity at
least TR ΔS of that energy must be given
up to the system's surroundings as unusable
heat, where TR is the temperature of the coldest
accessible reservoir or heat sink external to the
system. Otherwise the process will not go forward.
In classical thermodynamics, the entropy of
a system is defined only if it is in thermodynamic
equilibrium.
Statistical mechanics
The statistical definition was developed by
Ludwig Boltzmann in the 1870s by analyzing
the statistical behavior of the microscopic
components of the system. Boltzmann showed
that this definition of entropy was equivalent
to the thermodynamic entropy to within a constant
factor, which has since been known as the Boltzmann
constant. In summary, the thermodynamic definition
of entropy provides the experimental definition
of entropy, while the statistical definition
of entropy extends the concept, providing
an explanation and a deeper understanding
of its nature.
The interpretation of entropy in statistical
mechanics is the measure of uncertainty, or
"mixed-up-ness" in the phrase of Gibbs, which
remains about a system after its observable
macroscopic properties, such as temperature,
pressure and volume, have been taken into
account. For a given set of macroscopic variables,
the entropy measures the degree to which the
probability of the system is spread out over
different possible microstates. In contrast
to the macrostate, which characterizes plainly
observable average quantities, a microstate
specifies all molecular details about the
system including the position and velocity
of every molecule. The more such states available
to the system with appreciable probability,
the greater the entropy. In statistical mechanics,
entropy is a measure of the number of ways
in which a system may be arranged, often taken
to be a measure of "disorder". This definition
describes the entropy as being proportional
to the natural logarithm of the number of
possible microscopic configurations of the
individual atoms and molecules of the system
which could give rise to the observed macroscopic
state of the system. The constant of proportionality
is the Boltzmann constant.
Specifically, entropy is a logarithmic measure
of the number of states with significant probability
of being occupied:
S = −kB Σi pi ln pi
where kB is the Boltzmann constant, equal
to 1.38065×10⁻²³ J K⁻¹. The summation
is over all the possible microstates of the
system, and pi is the probability that the
system is in the ith microstate. This definition
assumes that the basis set of states has been
picked so that there is no information on
their relative phases. In a different basis
set, the more general expression is
S = −kB Tr(ρ ln ρ)
where ρ is the density matrix and ln ρ is the matrix
logarithm. This density matrix formulation
is not needed in cases of thermal equilibrium
so long as the basis states are chosen to
be energy eigenstates. For most practical
purposes, this can be taken as the fundamental
definition of entropy since all other formulas
for S can be mathematically derived from it,
but not vice versa.
In what has been called the fundamental assumption
of statistical thermodynamics or the fundamental
postulate in statistical mechanics, the occupation
of any microstate is assumed to be equally
probable; this assumption is usually justified
for an isolated system in equilibrium. Then
the previous equation reduces to:
S = kB ln Ω
where Ω is the number of microstates.
In thermodynamics, such a system is one in
which the volume, number of molecules, and
internal energy are fixed.
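A short sketch (my own, not from the article) of the Gibbs formula S = −kB Σ pi ln pi, showing that for equal probabilities over Ω microstates it reduces to the Boltzmann form S = kB ln Ω:

```python
import math

k_B = 1.380649e-23   # J/K, Boltzmann constant

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln p_i) over microstates with p_i > 0."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0.0)

omega = 1_000_000                       # number of accessible microstates (illustrative)
uniform = [1.0 / omega] * omega         # fundamental postulate: equal occupation
print(gibbs_entropy(uniform))           # equals k_B * ln(omega)
print(k_B * math.log(omega))

skewed = [0.5, 0.25, 0.125, 0.125]      # a non-uniform distribution has lower entropy
print(gibbs_entropy(skewed), "<", k_B * math.log(len(skewed)))
```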
The most general interpretation of entropy
is as a measure of our uncertainty about a
system. The equilibrium state of a system
maximizes the entropy because we have lost
all information about the initial conditions
except for the conserved variables; maximizing
the entropy maximizes our ignorance about
the details of the system. This uncertainty
is not of the everyday subjective kind, but
rather the uncertainty inherent to the experimental
method and interpretative model.
The interpretative model has a central role
in determining entropy. The qualifier "for
a given set of macroscopic variables" above
has deep implications: if two observers use
different sets of macroscopic variables, they
will observe different entropies. For example,
if observer A uses the variables U, V and
W, and observer B uses U, V, W, X, then, by
changing X, observer B can cause an effect
that looks like a violation of the second
law of thermodynamics to observer A. In other
words: the set of macroscopic variables one
chooses must include everything that may change
in the experiment, otherwise one might see
decreasing entropy!
Entropy can be defined for any Markov process
with reversible dynamics and the detailed
balance property.
In Boltzmann's 1896 Lectures on Gas Theory,
he showed that this expression gives a measure
of entropy for systems of atoms and molecules
in the gas phase, thus providing a measure
for the entropy of classical thermodynamics.
Entropy of a system
Entropy is the above-mentioned unexpected
and, to some, obscure integral that arises
directly from the Carnot cycle. It is reversible
heat divided by temperature. It is, remarkably,
a function of state and it is fundamental
and very useful.
In a thermodynamic system, pressure, density,
and temperature tend to become uniform over
time because this equilibrium state has higher
probability than any other; see statistical
mechanics. As an example, for a glass of ice
water in air at room temperature, the difference
in temperature between a warm room and cold
glass of ice and water, begins to be equalized
as portions of the thermal energy from the
warm surroundings spread to the cooler system
of ice and water. Over time the temperature
of the glass and its contents and the temperature
of the room become equal. The entropy of the
room has decreased as some of its energy has
been dispersed to the ice and water. However,
as calculated in the example, the entropy
of the system of ice and water has increased
more than the entropy of the surrounding room
has decreased. In an isolated system such
as the room and ice water taken together,
the dispersal of energy from warmer to cooler
always results in a net increase in entropy.
Thus, when the "universe" of the room and
ice water system has reached a temperature
equilibrium, the entropy change from the initial
state is at a maximum. The entropy of the
thermodynamic system is a measure of how far
the equalization has progressed.
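To put rough numbers on the ice-water example (the temperatures and the amount of heat below are assumed for illustration; the article gives no specific figures):

```python
T_room, T_ice_water = 293.15, 273.15   # K, assumed temperatures of room and glass
Q = 1000.0                             # J of heat flowing from the room into the glass

dS_room = -Q / T_room                  # the room's entropy decreases
dS_glass = Q / T_ice_water             # the ice water's entropy increases by more
print(f"room:  {dS_room:+.3f} J/K")
print(f"glass: {dS_glass:+.3f} J/K")
print(f"net:   {dS_room + dS_glass:+.3f} J/K   (positive, as the second law requires)")
```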
Thermodynamic entropy is a non-conserved state
function that is of great importance in the
sciences of physics and chemistry. Historically,
the concept of entropy evolved in order to
explain why some processes occur spontaneously
while their time reversals do not; systems
tend to progress in the direction of increasing
entropy. For isolated systems, entropy never
decreases. This fact has several important
consequences in science: first, it prohibits
"perpetual motion" machines; and second, it
implies the arrow of entropy has the same
direction as the arrow of time. Increases
in entropy correspond to irreversible changes
in a system, because some energy is expended
as waste heat, limiting the amount of work
a system can do.
Unlike many other functions of state, entropy
cannot be directly observed but must be calculated.
Entropy can be calculated for a substance
as the standard molar entropy from absolute
zero or as a difference in entropy from some
other reference state which is defined as
zero entropy. Entropy has the dimension of
energy divided by temperature, which has a
unit of joules per kelvin in the International
System of Units. While these are the same
units as heat capacity, the two concepts are
distinct. Entropy is not a conserved quantity:
for example, in an isolated system with non-uniform
temperature, heat might irreversibly flow
and the temperature become more uniform such
that entropy increases. The second law of
thermodynamics states that a closed system
has entropy which may increase or otherwise
remain constant. Chemical reactions cause
changes in entropy and entropy plays an important
role in determining in which direction a chemical
reaction spontaneously proceeds.
One dictionary definition of entropy is that
it is "a measure of thermal energy per unit
temperature that is not available for useful
work". For instance, a substance at uniform
temperature is at maximum entropy and cannot
drive a heat engine. A substance at non-uniform
temperature is at a lower entropy and some
of the thermal energy can drive a heat engine.
A special case of entropy increase, the entropy
of mixing, occurs when two or more different
substances are mixed. If the substances are
at the same temperature and pressure, there
will be no net exchange of heat or work – the
entropy change will be entirely due to the
mixing of the different substances. At a statistical
mechanical level, this results due to the
change in available volume per particle with
mixing.
Second law of thermodynamics
The second law of thermodynamics states that
in general the total entropy of any system
will not decrease other than by increasing
the entropy of some other system. Hence, in
a system isolated from its environment, the
entropy of that system will tend not to decrease.
It follows that heat will not flow from a
colder body to a hotter body without the application
of work to the colder body. Secondly, it is
impossible for any device operating on a cycle
to produce net work from a single temperature
reservoir; the production of net work requires
flow of heat from a hotter reservoir to a
colder reservoir, or a single expanding reservoir
undergoing adiabatic cooling, which performs
adiabatic work. As a result, there is no possibility
of a perpetual motion system. It follows that
a reduction in the increase of entropy in
a specified process, such as a chemical reaction,
means that it is energetically more efficient.
It follows from the second law of thermodynamics
that the entropy of a system that is not isolated
may decrease. An air conditioner, for example,
may cool the air in a room, thus reducing
the entropy of the air of that system. The
heat expelled from the room, which the air
conditioner transports and discharges to the
outside air, will always make a bigger contribution
to the entropy of the environment than will
the decrease of the entropy of the air of
that system. Thus, the total of entropy of
the room plus the entropy of the environment
increases, in agreement with the second law
of thermodynamics.
In mechanics, the second law in conjunction
with the fundamental thermodynamic relation
places limits on a system's ability to do
useful work. The entropy change of a system
at temperature T absorbing an infinitesimal
amount of heat δq in a reversible way is
given by δq/T. More explicitly, an amount of energy
TR S is not available to do useful work,
where TR is the temperature of the coldest
accessible reservoir or heat sink external
to the system. For further discussion, see
Exergy.
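As a small illustration of that statement (the numbers are assumed), the minimum energy that must be discarded as heat, rather than delivered as work, when an entropy ΔS is transferred to the coldest accessible reservoir at TR is TR ΔS:

```python
T_R = 298.15        # K, temperature of the coldest accessible reservoir (assumed)
dS = 2.5            # J/K, entropy that must be exported during the process (assumed)

unavailable = T_R * dS    # minimum energy that must leave as heat rather than work
print(f"at least {unavailable:.0f} J of the energy cannot be converted to useful work")
```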
Statistical mechanics demonstrates that entropy
is governed by probability, thus allowing
for a decrease in disorder even in an isolated
system. Although this is possible, such an
event has a small probability of occurring,
making it unlikely.
Applications
The fundamental thermodynamic relation
The entropy of a system depends on its internal
energy and the external parameters, such as
the volume. In the thermodynamic limit this
fact leads to an equation relating the change
in the internal energy to changes in the entropy
and the external parameters. This relation
is known as the fundamental thermodynamic
relation. If the volume is the only external
parameter, this relation is:
dU = T dS − P dV
Since the internal energy is fixed when one
specifies the entropy and the volume, this
relation is valid even if the change from
one state of thermal equilibrium to another
with infinitesimally larger entropy and volume
happens in a non-quasistatic way.
The fundamental thermodynamic relation implies
many thermodynamic identities that are valid
in general, independent of the microscopic
details of the system. Important examples
are the Maxwell relations and the relations
between heat capacities.
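A minimal numerical check of the relation (my own sketch, using a monatomic ideal gas and an arbitrary small step between two nearby equilibrium states):

```python
import math

R = 8.314                         # J/(mol*K)
n = 1.0                           # mol
Cv = 1.5 * R                      # monatomic ideal gas, constant-volume heat capacity

# Two nearby equilibrium states (illustrative values).
T1, V1 = 300.0, 1.0e-3            # K, m^3
T2, V2 = 300.3, 1.001e-3

dU = n * Cv * (T2 - T1)                                    # U = (3/2) n R T for this gas
dS = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
T_mid = 0.5 * (T1 + T2)
P_mid = n * R * T_mid / (0.5 * (V1 + V2))                  # ideal gas law at the midpoint
print(dU, T_mid * dS - P_mid * (V2 - V1))
```

The two printed values agree to first order in the size of the step, as expected for a differential relation.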
Entropy in chemical thermodynamics
Thermodynamic entropy is central in chemical
thermodynamics, enabling changes to be quantified
and the outcome of reactions predicted. The
second law of thermodynamics states that entropy
in an isolated system – the combination
of a subsystem under study and its surroundings
– increases during all spontaneous chemical
and physical processes. The Clausius equation
of δqrev/T = ΔS introduces the measurement
of entropy change, ΔS. Entropy change describes
the direction and quantifies the magnitude
of simple changes such as heat transfer between
systems – always from hotter to cooler spontaneously.
The thermodynamic entropy therefore has the
dimension of energy divided by temperature,
and the unit joule per kelvin in the International
System of Units.
Thermodynamic entropy is an extensive property,
meaning that it scales with the size or extent
of a system. In many processes it is useful
to specify the entropy as an intensive property
independent of the size, as a specific entropy
characteristic of the type of system studied.
Specific entropy may be expressed relative
to a unit of mass, typically the kilogram.
Alternatively, in chemistry, it is also referred
to one mole of substance, in which case it
is called the molar entropy with a unit of
J mol⁻¹ K⁻¹.
Thus, when one mole of substance at about
0 K is warmed by its surroundings to 298 K,
the sum of the incremental values of qrev/T
constitutes each element's or compound's standard
molar entropy, an indicator of the amount
of energy stored by a substance at 298 K. Entropy
change also measures the mixing of substances
as a summation of their relative quantities
in the final mixture.
Entropy is equally essential in predicting
the extent and direction of complex chemical
reactions. For such applications, ΔS must
be incorporated in an expression that includes
both the system and its surroundings, ΔSuniverse
= ΔSsurroundings + ΔSsystem. This expression
becomes, via some steps, the Gibbs free energy
equation for reactants and products in the
system: ΔG [the Gibbs free energy change
of the system] = ΔH [the enthalpy change]
− T ΔS [the entropy change].
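A short sketch (with invented, purely illustrative reaction data) of using ΔG = ΔH − T ΔS to judge spontaneity at several temperatures:

```python
# Illustrative reaction data (not from the article): an endothermic reaction
# that is driven by a large positive entropy change.
dH = 50_000.0     # J/mol, enthalpy change of the system
dS = 200.0        # J/(mol*K), entropy change of the system
for T in (200.0, 298.15, 400.0):                 # K
    dG = dH - T * dS                             # Gibbs free energy change
    print(f"T = {T:6.1f} K   dG = {dG:9.1f} J/mol   spontaneous: {dG < 0}")
```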
Entropy balance equation for open systems
In chemical engineering, the principles of
thermodynamics are commonly applied to "open
systems", i.e. those in which heat, work,
and mass flow across the system boundary.
Flows of both heat and work, i.e. Q̇ (the rate of heat flow) and P(dV/dt),
across the system boundaries, in general cause
changes in the entropy of the system. Transfer
as heat entails entropy transfer Q̇/T, where T is
the absolute thermodynamic temperature of
the system at the point of the heat flow.
If there are mass flows across the system
boundaries, they will also influence the total
entropy of the system. This account, in terms
of heat and work, is valid only for cases
in which the work and heat transfers are by
paths physically distinct from the paths of
entry and exit of matter from the system.
To derive a generalized entropy balance equation,
we start with the general balance equation
for the change in any extensive quantity Θ
in a thermodynamic system, a quantity that
may be either conserved, such as energy, or
non-conserved, such as entropy. The basic
generic balance expression states that dΘ/dt,
i.e. the rate of change of Θ in the system,
equals the rate at which Θ enters the system
at the boundaries, minus the rate at which
Θ leaves the system across the system boundaries,
plus the rate at which Θ is generated within
the system. For an open thermodynamic system
in which heat and work are transferred by
paths separate from the paths for transfer
of matter, using this generic balance equation,
with respect to the rate of change with time
of the extensive quantity entropy S, the entropy
balance equation is:
dS/dt = Σk Ṁk Ŝk + Q̇/T + Ṡgen
where
Σk Ṁk Ŝk = the net rate of entropy flow due to the
flows of mass into and out of the system (with Ŝ the entropy per unit mass),
Q̇/T = the rate of entropy flow due to the flow
of heat across the system boundary, and
Ṡgen = the rate of entropy production within the
system. This entropy production arises from
processes within the system, including chemical
reactions, internal matter diffusion, internal
heat transfer, and frictional effects such
as viscosity occurring within the system from
mechanical work transfer to or from the system.
Note, also, that if there are multiple heat
flows, the term Q̇/T will be replaced by Σj Q̇j/Tj, where
Q̇j is the heat flow and Tj is the temperature at
the jth heat flow port into the system.
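A minimal steady-state sketch of this balance (all stream data assumed for illustration): at steady state dS/dt = 0, so the entropy production rate follows from the mass-flow and heat-flow terms:

```python
# Steady-state entropy balance for a simple open system (all numbers assumed):
# a single stream is heated as it flows through, while heat crosses the boundary at T_b.
m_dot = 2.0                   # kg/s, mass flow (inlet = outlet at steady state)
s_in, s_out = 1.200, 1.350    # kJ/(kg*K), specific entropies of the inlet/outlet streams
Q_dot = 80.0                  # kJ/s of heat entering across the boundary
T_b = 350.0                   # K, boundary temperature where the heat crosses

# 0 = m_dot*(s_in - s_out) + Q_dot/T_b + S_gen   =>   solve for the production term
S_gen = m_dot * (s_out - s_in) - Q_dot / T_b
print(f"entropy production rate: {S_gen:.3f} kJ/(K*s)   (>= 0 by the second law)")
```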
Entropy and other forms of energy beyond work
The fundamental equation of thermodynamics
for a system containing n constituent species,
with the i-th species having Ni particles
is, with additional terms:
dU = T dS − P dV + Σi μi dNi + φ dQ + v dp
where U is internal energy, T is temperature, P
is pressure, V is volume, μi and Ni are the chemical
potential and number of molecules of the i-th chemical species,
φ and Q are electric potential and charge, and v
and p are velocity and momentum.
Solving for the change in entropy we get:
dS = (1/T)(dU + P dV − Σi μi dNi − φ dQ − v dp)
For any change in entropy there is a corresponding
change in internal energy of T dS, unless the other
terms compensate. In theory, the entropy of a system
can be changed without changing its energy, if the
remaining terms, such as −P dV, exactly cancel the
T dS contribution. In practice this is hard to arrange:
for example, one can attempt to keep the volume constant,
but work is still done on the system, and work changes
the energy. Other potentials,
such as the gravitational potential, can also
be taken into account.
Entropy change formulas for simple processes
For certain simple transformations in systems
of constant composition, the entropy changes
are given by simple formulas.
Isothermal expansion or compression of an
ideal gas
For the expansion of an ideal gas from an
initial volume V1 and pressure P1 to a final volume
V2 and pressure P2 at any constant temperature,
the change in entropy is given by:
ΔS = n R ln(V2/V1) = −n R ln(P2/P1)
Here n is the number of moles of gas and R is
the ideal gas constant. These equations also
apply for expansion into a finite vacuum or
a throttling process, where the temperature,
internal energy and enthalpy for an ideal
gas remain constant.
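A one-line check of the formula (one mole of ideal gas doubling its volume isothermally; the numbers are illustrative):

```python
import math

n, R = 1.0, 8.314                     # mol, J/(mol*K)
V1, V2 = 0.010, 0.020                 # m^3: the gas doubles its volume isothermally
print(n * R * math.log(V2 / V1))      # about +5.76 J/K, independent of the temperature
```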
Cooling and heating
For heating of any system at constant pressure
from an initial temperature T1 to a final temperature
T2, the entropy change is
ΔS = n CP ln(T2/T1),
provided that the constant-pressure molar
heat capacity CP is constant and that no phase
transition occurs in this temperature interval.
Similarly at constant volume, the entropy
change is
ΔS = n Cv ln(T2/T1),
where the constant-volume heat capacity Cv
is constant and there is no phase change.
At low temperatures near absolute zero, heat
capacities of solids quickly drop off to near
zero, so the assumption of constant heat capacity
does not apply.
Since entropy is a state function, the entropy
change of any process in which temperature
and volume both vary is the same as for a
path divided into two steps: heating at constant
volume and expansion at constant temperature.
For an ideal gas, the total entropy change
is
ΔS = n Cv ln(T2/T1) + n R ln(V2/V1).
Similarly if the temperature and pressure
of an ideal gas both vary,
ΔS = n CP ln(T2/T1) − n R ln(P2/P1).
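Because entropy is a state function, the (T, V) and (T, P) expressions must give the same answer for the same pair of end states. A small sketch (my own, for a monatomic ideal gas with arbitrary end states) confirms this:

```python
import math

n, R = 1.0, 8.314                 # mol, J/(mol*K)
Cv = 1.5 * R                      # monatomic ideal gas
Cp = Cv + R
T1, V1 = 300.0, 0.010             # initial state (K, m^3), illustrative
T2, V2 = 450.0, 0.025             # final state

P1 = n * R * T1 / V1              # ideal gas law gives the pressures
P2 = n * R * T2 / V2

dS_TV = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
dS_TP = n * Cp * math.log(T2 / T1) - n * R * math.log(P2 / P1)
print(dS_TV, dS_TP)               # equal: entropy change depends only on the end states
```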
Phase transitions
Reversible phase transitions occur at constant
temperature and pressure. The reversible heat
is the enthalpy change for the transition,
and the entropy change is the enthalpy change
divided by the thermodynamic temperature.
For fusion of a solid to a liquid at the melting
point Tm, the entropy of fusion is
ΔSfus = ΔHfus/Tm.
Similarly for vaporization of a liquid to
a gas at the boiling point Tb, the entropy
of vaporization is
ΔSvap = ΔHvap/Tb.
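For example, melting one mole of ice at its melting point (using the commonly tabulated ΔHfus ≈ 6.01 kJ/mol) gives an entropy of fusion of roughly 22 J/(mol·K):

```python
dH_fus = 6010.0      # J/mol, enthalpy of fusion of ice (approximate tabulated value)
T_m = 273.15         # K, melting point of ice
print(dH_fus / T_m)  # entropy of fusion, about 22 J/(mol*K)
```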
Approaches to understanding entropy
Because entropy is a fundamental aspect of thermodynamics
and physics, several different approaches
to it beyond those of Clausius and Boltzmann
are valid.
Standard textbook definitions
The following is a list of additional definitions
of entropy from a collection of textbooks:
a measure of energy dispersal at a specific
temperature.
a measure of disorder in the universe or of
the availability of the energy in a system
to do work.
a measure of a system's thermal energy per
unit temperature that is unavailable for doing
useful work.
In Boltzmann's definition, entropy is a measure
of the number of possible microscopic states
of a system in thermodynamic equilibrium.
Consistent with the Boltzmann definition,
the second law of thermodynamics needs to
be re-worded so as to state that entropy increases
over time, though the underlying principle
remains the same.
Order and disorder
Entropy has often been loosely associated
with the amount of order, disorder, and/or
chaos in a thermodynamic system. The traditional
qualitative description of entropy is that
it refers to changes in the status quo of
the system and is a measure of "molecular
disorder" and the amount of wasted energy
in a dynamical energy transformation from
one state or form to another. In this direction,
several recent authors have derived exact
entropy formulas to account for and measure
disorder and order in atomic and molecular
assemblies. One of the simpler entropy order/disorder
formulas is that derived in 1984 by thermodynamic
physicist Peter Landsberg, based on a combination
of thermodynamics and information theory arguments.
He argues that when constraints operate on
a system, such that it is prevented from entering
one or more of its possible or permitted states,
as contrasted with its forbidden states, the
measure of the total amount of "disorder"
in the system is given by:
Disorder = CD/CI.
Similarly, the total amount of "order" in
the system is given by:
Order = 1 − CO/CI.
In which CD is the "disorder" capacity of
the system, which is the entropy of the parts
contained in the permitted ensemble, CI is
the "information" capacity of the system,
an expression similar to Shannon's channel
capacity, and CO is the "order" capacity of
the system.
Energy dispersal
The concept of entropy can be described qualitatively
as a measure of energy dispersal at a specific
temperature. Similar terms have been in use
from early in the history of classical thermodynamics,
and with the development of statistical thermodynamics
and quantum theory, entropy changes have been
described in terms of the mixing or "spreading"
of the total energy of each constituent of
a system over its particular quantized energy
levels.
Ambiguities in the terms disorder and chaos,
which usually have meanings directly opposed
to equilibrium, contribute to widespread confusion
and hamper comprehension of entropy for most
students. As the second law of thermodynamics
shows, in an isolated system internal portions
at different temperatures will tend to adjust
to a single uniform temperature and thus produce
equilibrium. A recently developed educational
approach avoids ambiguous terms and describes
such spreading out of energy as dispersal,
which leads to loss of the differentials required
for work even though the total energy remains
constant in accordance with the first law
of thermodynamics. Physical chemist Peter
Atkins, for example, who previously wrote
of dispersal leading to a disordered state,
now writes that "spontaneous changes are always
accompanied by a dispersal of energy".
Relating entropy to energy usefulness
Following on from the above, it is possible
to regard entropy as an indicator or measure
of the effectiveness or usefulness of a particular
quantity of energy. This is because energy
supplied at a high temperature tends to be
more useful than the same amount of energy
available at room temperature. Mixing a hot
parcel of a fluid with a cold one produces
a parcel of intermediate temperature, in which
the overall increase in entropy represents
a "loss" which can never be replaced.
Thus, the fact that the entropy of the universe
is steadily increasing, means that its total
energy is becoming less useful: eventually,
this will lead to the "heat death of the Universe".
Entropy and adiabatic accessibility
A definition of entropy based entirely on
the relation of adiabatic accessibility between
equilibrium states was given by E. H. Lieb and
J. Yngvason in 1999. This approach has several
predecessors, including the pioneering work
of Constantin Carathéodory from 1909 and
the monograph by R. Giles from 1964. In the
setting of Lieb and Yngvason one starts by
picking, for a unit amount of the substance
under consideration, two reference states
X0 and X1 such that the latter is adiabatically
accessible from the former but not vice versa.
Defining the entropies of the reference states
to be 0 and 1 respectively, the entropy of
a state X is defined as the largest number λ such
that X is adiabatically accessible from a composite
state consisting of an amount λ in the state X1
and a complementary amount, (1 − λ), in the state
X0. A simple but important result within this
setting is that entropy is uniquely determined,
apart from a choice of unit and an additive
constant for each chemical element, by the
following properties: It is monotonic with
respect to the relation of adiabatic accessibility,
additive on composite systems, and extensive
under scaling.
Entropy in quantum mechanics
In quantum statistical mechanics, the concept
of entropy was developed by John von Neumann
and is generally referred to as "von Neumann
entropy",
where ρ is the density matrix and Tr is the
trace operator.
This upholds the correspondence principle,
because in the classical limit, when the phases
between the basis states used for the classical
probabilities are purely random, this expression
is equivalent to the familiar classical definition
of entropy,
S = −kB Σi pi ln pi,
i.e. in such a basis the density matrix is
diagonal.
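A small numpy sketch (my own, not from the article) that computes the von Neumann entropy from the eigenvalues of a density matrix and checks two limiting cases, a diagonal ρ and a pure state:

```python
import numpy as np

k_B = 1.380649e-23   # J/K

def von_neumann_entropy(rho):
    """S = -k_B * Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-15]              # 0 * ln 0 is taken as 0
    return -k_B * float(np.sum(evals * np.log(evals)))

# A diagonal density matrix: the classical case.
p = np.array([0.5, 0.3, 0.2])
print(von_neumann_entropy(np.diag(p)))
print(-k_B * float(np.sum(p * np.log(p))))    # same value, the classical formula

# A pure state has zero entropy.
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)
print(von_neumann_entropy(np.outer(psi, psi)))  # ~0
```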
Von Neumann established a rigorous mathematical
framework for quantum mechanics with his work
Mathematische Grundlagen der Quantenmechanik.
He provided in this work a theory of measurement,
where the usual notion of wave function collapse
is described as an irreversible process. Using
this concept, in conjunction with the density
matrix he extended the classical concept of
entropy into the quantum domain.
Information theory
When viewed in terms of information theory,
the entropy state function is simply the amount
of information that would be needed to specify
the full microstate of the system. This is
left unspecified by the macroscopic description.
In information theory, entropy is the measure
of the amount of information that is missing
before reception and is sometimes referred
to as Shannon entropy. Shannon entropy is
a broad and general concept which finds applications
in information theory as well as thermodynamics.
It was originally devised by Claude Shannon
in 1948 to study the amount of information
in a transmitted message. The definition of
the information entropy is, however, quite
general, and is expressed in terms of a discrete
set of probabilities pi so that
H = −Σi pi log pi.
In the case of transmitted messages, these
probabilities were the probabilities that
a particular message was actually transmitted,
and the entropy of the message system was
a measure of the average amount of information
in a message. For the case of equal probabilities,
the Shannon entropy is just the number of
yes/no questions needed to determine the content
of the message.
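A quick sketch (my own) of the Shannon entropy in bits: for 2^k equally likely messages it equals k, the number of yes/no questions needed to identify the message:

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum(p * log2 p), in bits (average number of yes/no questions)."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

print(shannon_entropy_bits([1 / 8] * 8))                 # 3 bits for 8 equally likely messages
print(shannon_entropy_bits([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits for a skewed source
```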
The question of the link between information
entropy and thermodynamic entropy is a debated
topic. While most authors argue that there
is a link between the two, a few argue that
they have nothing to do with each other.
The expressions for the two entropies are
similar. The information entropy H for equal
probabilities pi = p = 1/n is
H = k ln(1/p) = k ln n,
where k is a constant which determines the
units of entropy. There are many ways of demonstrating
the equivalence of "information entropy" and
"physics entropy", that is, the equivalence
of "Shannon entropy" and "Boltzmann entropy".
Nevertheless, some authors argue for dropping
the word entropy for the H function of information
theory and using Shannon's other term "uncertainty"
instead.
Interdisciplinary applications of entropy
Although the concept of entropy was originally
a thermodynamic construct, it has been adapted
in other fields of study, including information
theory, psychodynamics, thermoeconomics/ecological
economics, and evolution.
Thermodynamic and statistical mechanics concepts
Entropy unit – a non-S.I. unit of thermodynamic
entropy, usually denoted "e.u." and equal
to one calorie per kelvin per mole, or 4.184
joules per kelvin per mole.
Gibbs entropy – the usual statistical mechanical
entropy of a thermodynamic system.
Boltzmann entropy – a type of Gibbs entropy,
which neglects internal statistical correlations
in the overall particle distribution.
Tsallis entropy – a generalization of the
standard Boltzmann-Gibbs entropy.
Standard molar entropy – the entropy
content of one mole of substance, under conditions
of standard temperature and pressure.
Residual entropy – the entropy present after
a substance is cooled arbitrarily close to
absolute zero.
Entropy of mixing – the change in the entropy
when two different chemical substances or
components are mixed.
Loop entropy – the entropy lost upon
bringing together two residues of a polymer
within a prescribed distance.
Conformational entropy – the entropy
associated with the physical arrangement of
a polymer chain that assumes a compact or
globular state in solution.
Entropic force – a microscopic force or
reaction tendency related to system organization
changes, molecular frictional considerations,
and statistical variations.
Free entropy – an entropic thermodynamic
potential analogous to the free energy.
Entropic explosion – an explosion in which
the reactants undergo a large change in volume
without releasing a large amount of heat.
Entropy change – a change in entropy dS
between two equilibrium states is given by
the heat transferred dQrev divided by the
absolute temperature T of the system in this
interval.
Sackur-Tetrode entropy – the entropy of
a monatomic classical ideal gas determined
via quantum considerations.
The arrow of time
Entropy is the only quantity in the physical
sciences that seems to imply a particular
direction of progress, sometimes called an
arrow of time. As time progresses, the second
law of thermodynamics states that the entropy
of an isolated system never decreases. Hence,
from this perspective, entropy measurement
is thought of as a kind of clock.
Cosmology
Since a finite universe is an isolated system,
the Second Law of Thermodynamics states that
its total entropy is constantly increasing.
It has been speculated, since the 19th century,
that the universe is fated to a heat death
in which all the energy ends up as a homogeneous
distribution of thermal energy, so that no
more work can be extracted from any source.
If the universe can be considered to have
generally increasing entropy, then – as
Sir Roger Penrose has pointed out – gravity
plays an important role in the increase because
gravity causes dispersed matter to accumulate
into stars, which collapse eventually into
black holes. The entropy of a black hole is
proportional to the surface area of the black
hole's event horizon. Jacob Bekenstein and
Stephen Hawking have shown that black holes
have the maximum possible entropy of any object
of equal size. This makes them likely end
points of all entropy-increasing processes,
if they are totally effective matter and energy
traps. Hawking has, however, recently changed
his stance on this aspect, in a paper which
largely redefined the event horizons of black
holes and suggested that the escape of energy
from black holes might be possible due to
quantum activity.
The role of entropy in cosmology has remained a
controversial subject since the time of Ludwig
Boltzmann. Recent work has cast some doubt
on the heat death hypothesis and the applicability
of any simple thermodynamic model to the universe
in general. Although entropy does increase
in the model of an expanding universe, the
maximum possible entropy rises much more rapidly,
moving the universe further from the heat
death with time, not closer. This results
in an "entropy gap" pushing the system further
away from the posited heat death equilibrium.
Other complicating factors, such as the energy
density of the vacuum and macroscopic quantum
effects, are difficult to reconcile with thermodynamical
models, making any predictions of large-scale
thermodynamics extremely difficult.
The entropy gap is widely believed to have
been originally opened up by the early rapid
exponential expansion of the universe.
Further reading
Atkins, Peter; Julio De Paula. Physical Chemistry,
8th ed. Oxford University Press. ISBN 0-19-870072-5. 
Baierlein, Ralph. Thermal Physics. Cambridge
University Press. ISBN 0-521-65838-1. 
Ben-Naim, Arieh. Entropy Demystified. World
Scientific. ISBN 981-270-055-2. 
Callen, Herbert B. Thermodynamics and an
Introduction to Thermostatistics, 2nd Ed.
John Wiley and Sons. ISBN 0-471-86256-8. 
Chang, Raymond. Chemistry, 6th Ed. New York:
McGraw Hill. ISBN 0-07-115221-0. 
Cutnell, John D.; Johnson, Kenneth J. Physics,
4th ed. John Wiley and Sons, Inc. ISBN 0-471-19113-2. 
Dugdale, J. S. Entropy and its Physical Meaning.
Taylor and Francis; CRC. ISBN 0-7484-0569-0. 
Fermi, Enrico. Thermodynamics. Prentice Hall.
ISBN 0-486-60361-X. 
Goldstein, Martin; Goldstein, Inge F. The Refrigerator
and the Universe. Harvard University Press.
ISBN 0-674-75325-9. 
Gyftopoulos, E.P.; G.P. Beretta. Thermodynamics.
Foundations and Applications. Dover. ISBN 0-486-43932-1. 
Haddad, Wassim M.; Chellaboina, VijaySekhar;
Nersesov, Sergey G.. Thermodynamics – 
A Dynamical Systems Approach. Princeton University
Press. ISBN 0-691-12327-6. 
Kroemer, Herbert; Charles Kittel. Thermal
Physics. W. H. Freeman Company. ISBN 0-7167-1088-9. 
Lambert, Frank L.; entropysite.oxy.edu
Penrose, Roger. The Road to Reality: A Complete
Guide to the Laws of the Universe. New York:
A. A. Knopf. ISBN 0-679-45443-8. 
Reif, F. Fundamentals of Statistical and Thermal Physics. McGraw-Hill. ISBN 0-07-051800-9.
Schroeder, Daniel V. Introduction to Thermal
Physics. New York: Addison Wesley Longman.
ISBN 0-201-38027-7. 
Serway, Raymond A. Physics for Scientists
and Engineers. Saunders Golden Sunburst Series.
ISBN 0-03-096026-6. 
Spirax-Sarco Limited, Entropy – A Basic
Understanding A primer on entropy tables for
steam engineering
von Baeyer, Hans Christian. Maxwell's Demon:
Why Warmth Disperses and Time Passes. Random
House. ISBN 0-679-43342-2. 
Entropy for beginners – a wikibook
An Intuitive Guide to the Concept of Entropy
Arising in Various Sectors of Science – a
wikibook
External links
Entropy and the Second Law of Thermodynamics
- an A-level physics lecture with detailed
derivation of entropy based on Carnot cycle
Khan Academy: entropy lectures, part of Chemistry
playlist
Proof: S is a valid state variable
Thermodynamic Entropy Definition Clarification
Reconciling Thermodynamic and State Definitions
of Entropy
Entropy Intuition
More on Entropy
The Second Law of Thermodynamics and Entropy
- Yale OYC lecture, part of Fundamentals of
Physics I
Entropy and the Clausius inequality MIT OCW
lecture, part of 5.60 Thermodynamics & Kinetics,
Spring 2008
The Discovery of Entropy by Adam Shulman.
Hour-long video, January 2013.
Moriarty, Philip; Merrifield, Michael. "S
Entropy". Sixty Symbols. Brady Haran for the
University of Nottingham. 
