Entropy and the 2nd & 3rd Laws of Thermodynamics
Topics: Spontaneous Chemical Reactions | Entropy as a Measure of Disorder | Entropy and the Second Law of Thermodynamics | The Third Law of Thermodynamics | Standard-State Entropies of Reaction | Enthalpy of Reaction vs. Entropy of Reaction Calculations
The first law of thermodynamics suggests that we can't get something for nothing. It allows us to build an apparatus that does work, but it places important restrictions on that apparatus. It says that we have to be willing to pay a price in terms of a loss of either heat or internal energy for any work we ask the system to do. It also puts a limit on the amount of work we can get for a given investment of either heat or internal energy.
The first law allows us to convert heat into work, or work into heat. It also allows us to change the internal energy of a system by transferring either heat or work between the system and its surroundings. But it doesn't tell us whether one of these changes is easier to achieve than another. Our experience, however, tells us that there is a preferred direction to many natural processes. We aren't surprised when a cup of coffee gradually loses heat to its surroundings as it cools, for example, or when the ice in a glass of lemonade absorbs heat as it melts. But we would be surprised if a cup of coffee suddenly grew hotter until it boiled, or if the water in a glass of lemonade froze on a hot summer day, even though neither process violates the first law of thermodynamics.
Similarly, we aren't surprised to see a piece of zinc metal dissolve in a strong acid to give bubbles of hydrogen gas.
Zn(s) + 2 H+(aq) → Zn2+(aq) + H2(g)
But if we saw a film in which H2 bubbles formed on the surface of a solution and then sank through the solution until they disappeared, while a strip of zinc metal formed in the middle of the solution, we would conclude that the film was being run backward.
Many chemical and physical processes are reversible and yet tend to proceed in a direction in which they are said to be spontaneous. This raises an obvious question: What makes a reaction spontaneous? What drives the reaction in one direction and not the other?
So many spontaneous reactions are exothermic that it is tempting to assume that one of the driving forces that determines whether a reaction is spontaneous is a tendency to give off energy. The following are all examples of spontaneous chemical reactions that are exothermic.
2 Al(s) + 3 Br2(l) → 2 AlBr3(s)      ΔH° = -511 kJ/mol AlBr3
2 H2(g) + O2(g) → 2 H2O(g)           ΔH° = -241.82 kJ/mol H2O
P4(s) + 5 O2(g) → P4O10(s)           ΔH° = -2984 kJ/mol P4O10
There are also spontaneous reactions, however, that absorb energy from their surroundings. At 100°C, water boils spontaneously even though the process is endothermic.
H2O(l) → H2O(g)      ΔH° = 40.88 kJ/mol
Ammonium nitrate dissolves spontaneously in water, even though energy is absorbed when this reaction takes place.
NH4NO3(s) → NH4+(aq) + NO3-(aq)      ΔH° = 28.05 kJ/mol
Thus, the tendency of a spontaneous reaction to give off energy can't be the only driving force behind a chemical reaction. There must be another factor that helps determine whether a reaction is spontaneous. This factor, known as entropy, is a measure of the disorder of the system.
Perhaps the best way to understand entropy as a driving force in nature is to conduct a simple experiment with a new deck of cards. Open the deck, remove the jokers, and then turn the deck so that you can read the cards. The top card will be the ace of spades, followed by the two, three, and four of spades, and so on. Now divide the cards in half, shuffle the deck, and note that the deck becomes more disordered. The more often the deck is shuffled, the more disordered it becomes. What makes a deck of cards become more disordered when shuffled?
In 1877 Ludwig Boltzmann provided a basis for answering this question when he introduced the concept of the entropy of a system as a measure of the amount of disorder in the system. A deck of cards fresh from the manufacturer is perfectly ordered and the entropy of this system is zero. When the deck is shuffled, the entropy of the system increases as the deck becomes more disordered.
There are 8.066 x 10^67 different ways of organizing a deck of cards. The probability of obtaining any particular sequence of cards when the deck is shuffled is therefore 1 part in 8.066 x 10^67. In theory, it is possible to shuffle a deck of cards until the cards fall into perfect order. But it isn't very likely!
Boltzmann proposed the following equation to describe the relationship between entropy and the amount of disorder in a system.
S = k ln W
In this equation, S is the entropy of the system, k is a proportionality constant equal to the ideal gas constant divided by Avogadro's constant, ln represents a logarithm to the base e, and W is the number of equivalent ways of describing the state of the system. According to this equation, the entropy of a system increases as the number of equivalent ways of describing the state of the system increases.
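As a quick numerical sketch of this equation (using approximate values of the ideal gas constant and Avogadro's constant), the Boltzmann entropy of a fully shuffled deck can be computed directly from S = k ln W with W = 52!:

```python
import math

# Boltzmann's constant: the ideal gas constant divided by Avogadro's constant
R = 8.314          # J/(mol-K)
N_A = 6.022e23     # 1/mol
k = R / N_A        # ~1.38e-23 J/K

# W for a full deck: 52! distinguishable orderings (~8.066 x 10^67)
ln_W = math.lgamma(53)     # ln(52!), computed without overflowing
S = k * ln_W               # entropy of the fully shuffled deck

print(f"ln W = {ln_W:.2f}")    # ~156.4
print(f"S = {S:.3e} J/K")      # ~2.2e-21 J/K -- tiny on a laboratory scale
```

The result is minuscule because a single deck contains only 52 "particles"; for a mole of molecules, W is astronomically larger and the entropies reach the J/mol-K values tabulated later in this section.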
The relationship between the number of equivalent ways of describing a system and the amount of disorder in the system can be demonstrated with another analogy based on a deck of cards. There are 2,598,960 different hands that could be dealt in a game of five-card poker. More than half of these hands are essentially worthless. Winning hands are much rarer. Only 3,744 combinations correspond to a "full house," for example. The table below gives the number of equivalent combinations of cards for each category of poker hand, which is the value of W for this category. As the hand becomes more disordered, the value of W increases, and the hand becomes intrinsically less valuable.
Number of Equivalent Combinations for Various Types of Poker Hands
Hand                                                    W       ln W
Royal flush (A, K, Q, J, 10 in one suit)                4       1.39
Straight flush (five cards in sequence in one suit)     36      3.58
Four of a kind                                          624     6.44
Full house (three of a kind plus a pair)                3,744   8.23
Flush (five cards in the same suit)                     5,108   8.54
Straight (five cards in sequence)                       10,200  9.23
Three of a kind                                         54,912  10.91
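The W and ln W columns above can be checked directly; a short Python sketch (the W values are taken from the table, and the total number of hands from the text):

```python
import math

# (hand, number of equivalent five-card combinations W) from the table above
hands = [
    ("Royal flush", 4),
    ("Straight flush", 36),
    ("Four of a kind", 624),
    ("Full house", 3744),
    ("Flush", 5108),
    ("Straight", 10200),
    ("Three of a kind", 54912),
]

total = math.comb(52, 5)       # every possible five-card hand: 2,598,960
print(f"total hands: {total}")

for name, W in hands:
    # ln W grows as the hand becomes more common (more disordered)
    print(f"{name:16s} W = {W:6d}   ln W = {math.log(W):.2f}")
```

Running this reproduces the ln W column: the rarer (more "ordered") the hand, the smaller both W and ln W.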
The second law of thermodynamics describes the relationship between entropy and the spontaneity of natural processes.
Second Law: In an isolated system, natural processes are spontaneous when they lead to an increase in disorder, or entropy.
This statement is restricted to isolated systems to avoid having to worry about whether the reaction is exothermic or endothermic. By definition, neither heat nor work can be transferred between an isolated system and its surroundings.
We can apply the second law of thermodynamics to chemical reactions by noting that the entropy of a system is a state function that is directly proportional to the disorder of the system.
ΔSsys > 0 implies that the system becomes more disordered during the reaction. ΔSsys < 0 implies that the system becomes less disordered during the reaction.
For an isolated system, any process that leads to an increase in the disorder of the system will be spontaneous. The following generalizations can help us decide when a chemical reaction leads to an increase in the disorder of the system.
Solids have a much more regular structure than liquids. Liquids are therefore more disordered than solids.
The particles in a gas are in a state of constant, random motion. Gases are therefore more disordered than the corresponding liquids.
Any process that increases the number of particles in the system increases the amount of disorder.
Practice Problem 2:

Which of the following processes will lead to an increase in the entropy of the system?

(a) N2(g) + 3 H2(g) → 2 NH3(g)
(b) H2O(l) → H2O(g)
(c) CaCO3(s) → CaO(s) + CO2(g)
(d) NH4NO3(s) + H2O(l) → NH4+(aq) + NO3-(aq)
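The generalizations above suggest a rough heuristic, not a full entropy calculation: when a reaction involves gases, the sign of the change in the number of moles of gas usually decides the sign of ΔS. A Python sketch applying it to reactions like those in the practice problem (the gas-mole counts are read off the balanced equations):

```python
# Each reaction maps to (moles of gas on the reactant side, on the product side).
reactions = {
    "(a) N2(g) + 3 H2(g) -> 2 NH3(g)":      (4, 2),
    "(b) H2O(l) -> H2O(g)":                 (0, 1),
    "(c) CaCO3(s) -> CaO(s) + CO2(g)":      (0, 1),
    "(d) NH4NO3(s) -> NH4+(aq) + NO3-(aq)": (0, 0),  # no gases; dissolution
}

for name, (n_react, n_prod) in reactions.items():
    dn = n_prod - n_react   # change in moles of gas
    sign = "+" if dn > 0 else "-" if dn < 0 else "?"
    print(f"{name}: delta n(gas) = {dn:+d} -> predicted sign of delta-S: {sign}")
```

The "?" case shows the heuristic's limit: reaction (d) produces no gas, yet dissolving a crystalline solid still increases the number of independent particles and hence the disorder of the system.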
The sign of ΔH for a chemical reaction affects the direction in which the reaction occurs.
Spontaneous reactions often, but not always, give off energy.
The sign of ΔS for a reaction can also determine the direction of the reaction.
In an isolated system, chemical reactions occur in the direction that leads to an increase in the disorder of the system.
In order to decide whether a reaction is spontaneous, it is therefore important to consider the effect of changes in both enthalpy and entropy that occur during the reaction.
Practice Problem 3:

Use the Lewis structures of NO2 and N2O4 and the stoichiometry of the following reaction to decide whether ΔH and ΔS favor the reactants or products of this reaction:

2 NO2(g) → N2O4(g)
The third law of thermodynamics defines absolute zero on the entropy scale.
Third law: The entropy of a perfect crystal is zero when the temperature of the crystal is equal to absolute zero (0 K).
The crystal must be perfect, or else there will be some inherent disorder. It also must be at 0 K; otherwise there will be thermal motion within the crystal, which leads to disorder.
As the crystal warms to temperatures above 0 K, the particles in the crystal start to move, generating some disorder. The entropy of the crystal gradually increases with temperature as the average kinetic energy of the particles increases. At the melting point, the entropy of the system increases abruptly as the compound is transformed into a liquid, which is not as well ordered as the solid. The entropy of the liquid gradually increases as the liquid becomes warmer because of the increase in the vibrational, rotational, and translational motion of the particles. At the boiling point, there is another abrupt increase in the entropy of the substance as it is transformed into a random, chaotic gas.
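The gradual and abrupt entropy changes described above can be made quantitative with two standard formulas: ΔS = Cp ln(T2/T1) for heating at (assumed) constant heat capacity, and ΔS = ΔH/T for a phase transition. A Python sketch for water, using the 40.88 kJ/mol enthalpy of vaporization quoted earlier and an assumed molar heat capacity of about 75.3 J/mol-K for the liquid:

```python
import math

Cp = 75.3                    # J/(mol-K), assumed constant for liquid water
T1, T2 = 298.15, 373.15      # K: warm liquid water from 25 C to its boiling point

# Gradual rise: dS = Cp dT / T integrates to Cp ln(T2/T1)
dS_heating = Cp * math.log(T2 / T1)     # ~16.9 J/(mol-K)

# Abrupt jump at the phase transition: delta-S = delta-H / T
dH_vap = 40880.0                        # J/mol, from the text
dS_boiling = dH_vap / T2                # ~109.6 J/(mol-K)

print(f"heating 25 C -> 100 C: {dS_heating:.1f} J/(mol-K)")
print(f"boiling at 100 C:      {dS_boiling:.1f} J/(mol-K)")
```

The comparison makes the plot's shape concrete: the entropy gained in the vertical jump at the boiling point is several times larger than everything gained by warming the liquid over a 75-degree range.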
The table below provides an example of the difference between the entropy of a substance in the solid, liquid, and gaseous phases.
The Entropy of Solid, Liquid, and Gaseous Forms of Sulfur Trioxide
Note that the units of entropy are joules per mole kelvin (J/mol-K). A plot of the entropy of this system versus temperature is shown in the figure below.
Because entropy is a state function, the change in the entropy of the system that accompanies any process can be calculated by subtracting the initial value of the entropy of the system from the final value.
ΔS = Sfinal - Sinitial
ΔS for a chemical reaction is therefore equal to the difference between the sum of the entropies of the products and the sum of the entropies of the reactants of the reaction.

ΔS = ΣS(products) - ΣS(reactants)
When this difference is measured under standard-state conditions, the result is the standard-state entropy of reaction, ΔS°.

ΔS° = ΣS°(products) - ΣS°(reactants)
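As a minimal worked example of this equation, consider vaporizing aluminum, Al(s) → Al(g), using the absolute entropies listed in the aluminum data table later in this section:

```python
# Absolute entropies S (J/(mol-K)) from the aluminum data table in this section
S_standard = {"Al(s)": 28.33, "Al(g)": 164.54}

# Al(s) -> Al(g): delta-S = sum S(products) - sum S(reactants)
dS = S_standard["Al(g)"] - S_standard["Al(s)"]
print(f"delta-S = {dS:.2f} J/(mol-K)")   # positive, as expected for solid -> gas
```

The large positive result (about +136 J/mol-K) is exactly what the earlier generalizations predict for converting an ordered solid into a disordered gas.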
By convention, the standard state for thermodynamic measurements is characterized by the following conditions.
All solutions have concentrations of 1 M.
All gases have partial pressures of 0.1 MPa (0.9869 atm).
Although standard-state entropies can be measured at any temperature, they are often measured at 25°C.
Practice Problem 4:

Calculate the standard-state entropy of reaction for the following reactions and explain the sign of ΔS° for each reaction.

(a) Hg(l) → Hg(g)
(b) 2 NO2(g) → N2O4(g)
(c) N2(g) + O2(g) → 2 NO(g)
At first glance, tables of thermodynamic data seem inconsistent. Consider the data in the table below, for example.
Thermodynamic Data for Aluminum and Its Compounds
Substance    ΔHf° (kJ/mol)    S° (J/mol-K)
Al(s)              0              28.33
Al(g)            326.4           164.54
Al2O3(s)       -1675.7            50.92
AlCl3(s)        -704.2           110.67
The enthalpy data in this table are given in terms of the standard-state enthalpy of formation of each substance, ΔHf°. This quantity is the heat given off or absorbed when the substance is made from its elements in their most thermodynamically stable state at 0.1 MPa. The enthalpy of formation of AlCl3, for example, is the heat given off in the following reaction.
2 Al(s) + 3 Cl2(g) → 2 AlCl3(s)      ΔHf° = -704.2 kJ/mol AlCl3
The enthalpy data in this table are therefore relative numbers, which compare each compound with its elements.
Enthalpy data are listed as relative measurements because there is no absolute zero on the enthalpy scale. All we can measure is the heat given off or absorbed by a reaction. Thus, all we can determine is the difference between the enthalpies of the reactants and the products of a reaction. We therefore define the enthalpy of formation of the elements in their most thermodynamically stable states as zero and report all compounds as either more or less stable than their elements.
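A short sketch of how these relative enthalpies are used: the enthalpy of a reaction follows from ΔH° = ΣΔHf°(products) - ΣΔHf°(reactants), with the elements in their stable states contributing zero by convention. Applying this to the AlCl3 formation reaction above, using the table's values:

```python
# Standard-state enthalpies of formation (kJ/mol) from the aluminum data table;
# elements in their most stable states are zero by convention.
dHf = {"Al(s)": 0.0, "Cl2(g)": 0.0, "AlCl3(s)": -704.2}

# 2 Al(s) + 3 Cl2(g) -> 2 AlCl3(s)
# delta-H(reaction) = sum dHf(products) - sum dHf(reactants)
dH_rxn = 2 * dHf["AlCl3(s)"] - (2 * dHf["Al(s)"] + 3 * dHf["Cl2(g)"])
print(f"delta-H = {dH_rxn:.1f} kJ")   # -1408.4 kJ for the 2 mol of AlCl3 formed
```

Because ΔHf° is quoted per mole of AlCl3, the reaction as balanced (which produces 2 mol) releases twice that amount of heat.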
Entropy data are different. The third law defines absolute zero on the entropy scale. As a result, the absolute entropy of any element or compound can be measured by comparing it with a perfect crystal at absolute zero. The entropy data are therefore given as absolute numbers, S°, not entropies of formation, Sf°.
AlCl3(s)      S° = 110.67 J/mol-K