Introduction to entropy


Entropy is an important concept in the branch of physics known as thermodynamics. It is a variable that describes the state of a system made of smaller components. Entropy is often used to describe a volume of matter composed of many molecules, but it can also be applied to a digital message composed of bits, or even the cattle on a ranch or a room full of people.
Entropy can be described as the number of possible configurations of a system's components that are consistent with the state of the system as a whole. Consider a billiards table with 15 balls on it. If we broadly observe that the balls are all lined up along one edge of the table, there is a certain finite number of combinations of the individual balls' locations that would be consistent with that broad observation. However, if we broadly observe that the balls are spread out across the table, there is a much larger number of combinations of the individual balls' locations that would be consistent with that broad observation. We say that the spread-out arrangement has high entropy, compared to the lined-up arrangement's low entropy.
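To make this counting concrete, here is a minimal sketch of the idea under a deliberately crude assumption: the table is divided into a grid of discrete cells (40 cells along one edge, 40 × 80 cells overall, numbers chosen purely for illustration) and the balls are treated as indistinguishable.

    from math import comb

    BALLS = 15
    EDGE_CELLS = 40          # assumed: discrete positions along one edge
    TABLE_CELLS = 40 * 80    # assumed: discrete positions over the whole table

    # Number of ways to place 15 indistinguishable balls in the available cells
    w_lined_up = comb(EDGE_CELLS, BALLS)
    w_spread_out = comb(TABLE_CELLS, BALLS)

    print(f"arrangements along one edge:   {w_lined_up:.3e}")
    print(f"arrangements anywhere:         {w_spread_out:.3e}")
    print(f"ratio (spread-out / lined-up): {w_spread_out / w_lined_up:.3e}")

The broad observation "spread out" is compatible with enormously more detailed arrangements than "lined up along one edge", which is exactly what it means for it to have higher entropy.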
The idea of "irreversibility" is central to the understanding of entropy. If we watch a movie of billiard balls moving around, it might be easy to distinguish between the movie running forward versus in running in reverse. If we see that the balls start in the lined-up arrangement and move toward being spread out, this intuitively looks like developments we see in everyday life: a puff of smoke going from a high concentration to diffusing into the air, or a car crashing and disintegrating into many pieces spread out on the road. However, if we see that the balls start out in the random arrangement and then move toward being aligned along one edge, something looks wrong. We know the movie must be running in reverse, because, like smoke coming out of the air and concentrating itself, or pieces of metal coming together to form an operating car, it's extremely unlikely for spread-out billiard balls to spontaneously line up along one edge of the table. Therefore, the movement of the balls from a low-entropy arrangement to a high-entropy arrangement is described as an irreversible process.
All real physical processes involving systems in everyday life, with many atoms or molecules, are irreversible. For an irreversible process in an isolated system, the variable known as entropy never decreases. The statement that the entropy of an isolated system never decreases is known as the second law of thermodynamics.
Classical thermodynamics is a physical theory which describes a "system" in terms of the thermodynamic variables of the system or its parts. Some thermodynamic variables are familiar: temperature, pressure, volume. Entropy is a thermodynamic variable which is less familiar and not as easily understood. A "system" is any region of space containing matter and energy: a cup of coffee, a glass of ice water, an automobile, an egg. Thermodynamic variables do not give a "complete" picture of the system. Thermodynamics makes no assumptions about the microscopic nature of a system; it neither describes nor takes into account the positions and velocities of the individual atoms and molecules which make up the system. Thermodynamics deals with matter in a macroscopic sense; it would be valid even if the atomic theory of matter were wrong. This is an important quality, because it means that reasoning based on thermodynamics is unlikely to require alteration as new facts about atomic structure and atomic interactions are found. The essence of thermodynamics is embodied in the four laws of thermodynamics.
Unfortunately, thermodynamics provides little insight into what is happening at a microscopic level. Statistical mechanics is a physical theory which explains thermodynamics in microscopic terms. It explains thermodynamics in terms of the possible detailed microscopic situations the system may be in when the thermodynamic variables of the system are known. These are known as "microstates" whereas the description of the system in thermodynamic terms specifies the "macrostate" of the system. Many different microstates can yield the same macrostate. It is important to understand that statistical mechanics does not define temperature, pressure, entropy, etc. They are already defined by thermodynamics. Statistical mechanics serves to explain thermodynamics in terms of microscopic behavior of the atoms and molecules in the system.
In statistical mechanics, the entropy of a system is described as a measure of how many different microstates there are that could give rise to the macrostate that the system is in. The entropy of the system is given by Ludwig Boltzmann's famous equation:

S = k_B ln W

where S is the entropy of the macrostate, k_B is Boltzmann's constant, and W is the total number of possible microstates that might yield the macrostate. The concept of irreversibility stems from the idea that if you have a system in an "unlikely" macrostate, it will soon move to the "most likely" macrostate and the entropy S will increase. A glass of warm water with an ice cube in it is unlikely to just happen; it must have been recently created, and the system will move to a more likely macrostate in which the ice cube is partially or entirely melted and the water is cooled. Statistical mechanics shows that the number of microstates which give ice and warm water is much smaller than the number of microstates that give the reduced ice mass and cooler water.
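The relation is easy to evaluate numerically. The sketch below is purely illustrative: the microstate counts are invented, and because W for a macroscopic system is astronomically large, the function takes ln W rather than W itself.

    K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

    def boltzmann_entropy(ln_w):
        # S = k_B * ln(W); ln(W) is passed directly because W itself would overflow
        return K_B * ln_w

    # Made-up numbers: the likely macrostate has e**(1e20) times as many microstates
    ln_w_unlikely = 1.0e22          # ln W for the "unlikely" macrostate (e.g. ice in warm water)
    ln_w_likely = 1.0e22 + 1.0e20   # ln W for the "most likely" macrostate (melted, cooler water)

    delta_s = boltzmann_entropy(ln_w_likely) - boltzmann_entropy(ln_w_unlikely)
    print(f"entropy increase: {delta_s:.3e} J/K")  # positive, so the change is irreversible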

Explanation

The concept of thermodynamic entropy arises from the second law of thermodynamics. This law of entropy increase quantifies the reduction in the capacity of a system for change or determines whether a thermodynamic process may occur. For example, heat always flows from a region of higher temperature to one with lower temperature until temperature becomes uniform.
Entropy is calculated in two ways. The first, which applies when only heat transfer causes a change in entropy, is the change in entropy of a system containing a sub-system that exchanges heat with its surroundings. It is based on the macroscopic relationship between the heat flow into the sub-system and the temperature at which it occurs, summed over the boundary of that sub-system. The second calculates the absolute entropy of a system from the microscopic behaviour of its individual particles. It is based on the natural logarithm of the number of microstates possible in a particular macrostate (a quantity called the thermodynamic probability); roughly, this gives the probability of the system's being in that state. In this sense it effectively defines entropy independently of its effects due to changes which may involve heat, mechanical, electrical, or chemical energies, and it also covers logical states such as information.
Following the formalism of Clausius, the first calculation can be mathematically stated as:

ΔS = Q/T

where ΔS is the increase or decrease in entropy, Q is the heat added to the system or subtracted from it, and T is the absolute temperature. The equality applies when the change is reversible: Clausius established this proportional relationship between entropy change and heat flow for reversible processes, in which heat energy can be transformed into work and work back into heat through a cyclical process. If the temperature is allowed to vary, the equation must be integrated over the temperature path. This calculation of entropy change does not allow the determination of absolute values, only differences. In this context, the second law of thermodynamics may be stated: for heat transferred over any valid process, for any system, whether isolated or not,

ΔS ≥ Q/T

with the equality holding only when the process is reversible.
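As a minimal numerical sketch (with illustrative values only), the relation can be applied directly: for heat exchanged at a single temperature the entropy change is just Q divided by T, and for heat passing irreversibly from a hotter body to a colder one the two terms sum to a positive total, as the second law requires.

    def entropy_change(q_joules, t_kelvin):
        # Clausius relation: Delta-S = Q / T for heat Q transferred at absolute temperature T
        return q_joules / t_kelvin

    # Illustrative value: 1000 J of heat absorbed reversibly at 300 K
    print(f"{entropy_change(1000.0, 300.0):.2f} J/K")   # ~3.33 J/K

    # The same 1000 J leaving a 350 K body and entering a 300 K body:
    total = entropy_change(-1000.0, 350.0) + entropy_change(1000.0, 300.0)
    print(f"net entropy change for the irreversible transfer: {total:.2f} J/K")  # ~+0.48 J/K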
According to the first law of thermodynamics, which deals with the conservation of energy, the loss of heat will result in a decrease in the internal energy of the thermodynamic system. Thermodynamic entropy provides a comparative measure of the amount of decrease in internal energy and the corresponding increase in internal energy of the surroundings at a given temperature. A simple and more concrete visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of a spontaneous process: how much energy has flowed or how widely it has become spread out at a specific temperature.
The second calculation defines entropy in absolute terms and comes from statistical mechanics. The entropy of a particular macrostate is defined to be Boltzmann's constant times the natural logarithm of the number of microstates corresponding to that macrostate, or mathematically

S = k_B ln W

where the variables are defined as before.
The macrostate of a system is what we know about the system, for example the temperature, pressure, and volume of a gas in a box. For each set of values of temperature, pressure, and volume there are many arrangements of molecules which result in those values. The number of arrangements of molecules which could result in the same values for temperature, pressure and volume is the number of microstates.
The concept of entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. Information entropy takes the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.

Example of increasing entropy

Ice melting provides an example in which entropy increases in a small system, a thermodynamic 'universe' consisting of the surroundings plus the glass container, ice, and water, which has been allowed to reach thermodynamic equilibrium at the melting temperature of ice. In this system, some heat δQ from the warmer surroundings at 298 K transfers to the cooler system of ice and water at its constant temperature of 273 K, the melting temperature of ice. The entropy of the ice-and-water system, which is δQ/T, increases by δQ/273 K. The heat δQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. ΔH for ice fusion.
It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298 K is larger than 273 K, and therefore the ratio (entropy change) of δQ/298 K for the surroundings is smaller than the ratio of δQ/273 K for the ice-and-water system. This is always true in spontaneous events in a thermodynamic system and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.
As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the δQ/T over the continuous range, “at many increments”, from the initially cool to the finally warm water can be found by calculus. The entire miniature ‘universe’, i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that ‘universe’ than when the glass of ice and water was introduced and became a 'system' within it.
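Putting rough numbers on this example makes the bookkeeping explicit. The sketch below assumes textbook values (about 334 J/g for the enthalpy of fusion of ice and 18 g of melting ice), chosen purely for illustration.

    # Entropy bookkeeping for the melting-ice example, with illustrative values
    HEAT_OF_FUSION_J_PER_G = 334.0   # approximate enthalpy of fusion of ice
    MASS_G = 18.0                    # about one mole of ice

    q = HEAT_OF_FUSION_J_PER_G * MASS_G   # heat drawn from the room, in joules

    ds_ice_water = q / 273.0    # entropy gained by the ice-and-water system at 273 K
    ds_room = -q / 298.0        # entropy lost by the surroundings at 298 K

    print(f"ice and water: +{ds_ice_water:.1f} J/K")
    print(f"room:          {ds_room:.1f} J/K")
    print(f"net change:    {ds_ice_water + ds_room:+.1f} J/K")  # positive, as the second law requires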

Origins and uses

Originally, entropy was named to describe the "waste heat," or more accurately, energy loss, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word "disorder" was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. Descriptions of thermodynamic entropy on the microscopic level are found in statistical thermodynamics and statistical mechanics.
For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the "motional" energy of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe entropy as energy dispersal. Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.
The mathematics developed in statistical thermodynamics was found to be applicable in other disciplines. In particular, information sciences developed the concept of information entropy, which lacks the Boltzmann constant inherent in thermodynamic entropy.

Heat and entropy

At a microscopic level, the kinetic energy of molecules is responsible for the temperature of a substance or a system. “Heat” is the kinetic energy of molecules being transferred: when motional energy is transferred from hotter surroundings to a cooler system, faster-moving molecules in the surroundings collide with the walls of the system, which transfers some of their energy to the molecules of the system and makes them move faster.
The important overall principle is that “Energy of all types changes from being localized to becoming dispersed or spread out, if not hindered from doing so. Entropy is the quantitative measure of that kind of a spontaneous process: how much energy has been transferred/T or how widely it has become spread out at a specific temperature.”

Classical calculation of entropy

When entropy was first defined and used in 1865, the very existence of atoms was still controversial and there was no concept that temperature was due to the motional energy of molecules, or that “heat” was actually the transferring of that motional molecular energy from one place to another. Entropy change, ΔS, was described in macroscopic terms that could be directly measured, such as volume, temperature, or pressure. However, today the classical equation of entropy, ΔS = Q/T, can be explained, part by part, in modern terms describing how molecules are responsible for what is happening:
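As a concrete illustration of that classical calculation (with assumed, illustrative values: 0.1 kg of liquid water and a constant specific heat of roughly 4186 J/(kg·K)), the entropy change of water warmed from 273 K to 298 K can be found either in closed form or by summing δQ/T over many small increments, as described above.

    import math

    MASS_KG = 0.1
    SPECIFIC_HEAT = 4186.0   # J/(kg*K), approximate value for liquid water

    # Closed form: integrating dS = dQ/T = m*c*dT/T gives m*c*ln(T_final/T_initial)
    ds_exact = MASS_KG * SPECIFIC_HEAT * math.log(298.0 / 273.0)

    # The same integral approximated "at many increments"
    ds_numeric = 0.0
    steps = 100_000
    dt = (298.0 - 273.0) / steps
    t = 273.0
    for _ in range(steps):
        dq = MASS_KG * SPECIFIC_HEAT * dt   # small parcel of heat added at temperature t
        ds_numeric += dq / (t + dt / 2)     # midpoint temperature of this increment
        t += dt

    print(f"exact:   {ds_exact:.3f} J/K")   # about 36.7 J/K
    print(f"numeric: {ds_numeric:.3f} J/K") # agrees closely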