Science
Disorder Metaphor Reading Reflection

Question Description

Write a reading reflection on the attached papers on entropy discussing the disorder metaphor. It should address one or more of the reading reflection prompts. Make sure it is focused and coherent (there is a unifying theme to your reflection). The reading reflection should be around 250 words long.

Reading reflection prompts: What is the author's purpose in writing this article? What can you take from it that can be applied in your own classroom? How does the article affect your own physics content knowledge and pedagogical knowledge?

Unformatted Attachment Preview

Disorder—A Cracked Crutch for Supporting Entropy Discussions
Frank L. Lambert† (†Professor Emeritus, Occidental College, Los Angeles, CA 90041)
2834 Lewis Dr., La Verne, CA 91750; flambert@att.net

This article decries the use of "disorder" in teaching beginning students about thermodynamic entropy. It is cautionary rather than proscriptive about "disorder" being used warily as a device for assessing entropy change in advanced work or among professionals.1

Overview

To help students visualize an increase in entropy, many elementary chemistry texts use artists' before-and-after drawings of groups of "orderly" molecules that become "disorderly". This has been an anachronism ever since the ideas of quantized energy levels were introduced in elementary chemistry. "Orderly–disorderly" seems to be an easy visual support, but it can be so grievously misleading as to be characterized as a failure-prone crutch rather than a truly reliable, sturdy aid.2

After mentioning the origin of this visual device in the late 1800s and listing some errors in its use in modern texts, I will build on a recent article by Daniel F. Styer. It succinctly summarizes objections from statistical mechanics to characterizing higher entropy conditions as disorderly (1). Then, after citing many failures of "disorder" as a criterion for evaluating entropy—all educationally unsettling, a few serious—I urge the abandonment of order–disorder in introducing entropy to beginning students. Although it seems plausible, it is vague and potentially misleading, a non-fundamental description that does not point toward calculation or elaboration in elementary chemistry, and an anachronism since the introduction of portions of quantum mechanics in first-year textbooks.3

Entropy's nature is better taught by first describing entropy's dependence on the dispersion of energy (in classical thermodynamics) and the distribution of energy among a large number of molecular motions relatable to quantized states, microstates (in molecular thermodynamics).4 Increased amounts of energy dispersed among molecules result in increased entropy that can be interpreted as molecular occupancy of more microstates. (High-level first-year texts could go further to a page or so of molecular thermodynamic entropy as described by the Boltzmann equation.)

The History and Use of "Disorder" to Characterize Entropy

As is well known, in 1865 Clausius gave the name "entropy" to a unique quotient for the process of a reversible change in thermal energy divided by the absolute temperature (2). He could properly focus only on the behavior of chemical systems as macro units because in that era there was considerable doubt even about the reality of atoms. Thus, the behavior of molecules or molecular groups within a macro system was totally a matter of conjecture (as Rankine unfortunately demonstrated in postulating "molecular vortices") (3). Later in the 19th century, but still prior to the development of quantum mechanics, the greater "disorder" of a gas at high temperature compared to its distribution of velocities at a lower temperature was chosen by Boltzmann to describe its higher entropy (4). However, "disorder" was a crutch; that is, it was a contrived support for visualization rather than a fundamental physical or theoretical cause for a higher entropy value.
Others followed Boltzmann's lead; Helmholtz in 1882 called entropy "Unordnung" (disorder) (5), and Gibbs Americanized that description with "entropy as mixed-up-ness", a phrase found posthumously in his writings (6) and subsequently used by many authors. Most general chemistry texts today still lean on this conceptual crutch of order–disorder, either slightly, with a few examples, or as a major support that too often fails by leading to extreme statements and overextrapolation.

In the past century, the most egregious errors of associating entropy with disorder occurred simply because disorder is a common language word with nonscientific connotations. Whatever Boltzmann meant by it, there is no evidence that he used disorder in any sense other than strict application to molecular energetics. But over the years, popular authors have learned that scientists talked about entropy in terms of disorder, and thereby entropy has become a code word for the "scientific" interpretation of everything disorderly, from drunken parties to dysfunctional personal relationships,5 and even the decline of society.6

Of course, chemistry instructors and authors would disclaim any responsibility for such absurdities. They would insist that they never have so misapplied entropy, that they used disorder only as a visual or conceptual aid for their students in understanding the spontaneous behavior of atoms and molecules, entropy-increasing events. But it was not a social scientist or a novelist—it was a chemist—who discussed entropy in his textbook with "things move spontaneously [toward] chaos or disorder".7 Another wrote, "Desktops illustrate the principle [of] a spontaneous tendency toward disorder in the universe".7 It is nonsense to describe the "spontaneous behavior" of macro objects in this way, as though things like sheets of paper, immobile as they are, behaved like molecules; the objects' actual movement is non-spontaneous and is due to external agents such as people, wind, and earthquakes. That error has been adequately dismissed (7). The important point here is that this kind of mistake is fundamentally due to a focus on disorder rather than on the correct cause of entropy change, energy flow toward dispersal. Such a misdirected focus leads to the kind of hyperbole one might expect from a science-disadvantaged writer, "Entropy must therefore be a measure of chaos", but this quote is from an internationally distinguished chemist and author.7,8

Entropy is not disorder. Entropy is not a measure of disorder or chaos. Entropy is not a driving force. Energy's diffusion, dissipation, or dispersion in a final state compared to an initial state is the driving force in chemistry. Entropy is the index of that dispersal within a system and between the system and its surroundings.4

In thermodynamics, entropy change is a quotient that measures the quantity of the unidirectional flow of thermal energy: dS ≥ dq/T. An appropriate paraphrase would be "entropy change measures energy's dispersion at a stated temperature". This concept of energy dispersal is not limited to thermal energy transfer between system and surroundings. It includes redistribution of the same amount of energy in a system—for example, when a gas is allowed to expand into a vacuum container, resulting in a larger volume.
In such a process, where dq is zero, the total energy of the system has become diffused over a larger volume, and thus an increase in entropy is predictable. (Some call this an increase in configurational entropy.) From a molecular viewpoint, the entropy of a system depends on the number of distinct microscopic quantum states, microstates, that are consistent with the system's macroscopic state. (The expansion of a gas into an evacuated chamber mentioned above is found, by quantum mechanics, to be an increase in entropy that is due to more microstates being accessible because the spacing of energy levels decreases in the larger volume.)

The general statement about entropy in molecular thermodynamics can be: "Entropy measures the dispersal of energy among molecules in microstates. An entropy increase in a system involves energy dispersal among more microstates in the system's final state than in its initial state." It is the basic sentence to describe entropy increase in gas expansion, mixing, crystalline substances dissolving, phase changes, and the host of other phenomena now inadequately described by "disorder" increase. In the next section the molecular basis for thermodynamics is briefly stated. Following it are ten examples to illustrate the confusion that can be engendered by using "disorder" as a crutch to describe entropy in chemical systems.

The Molecular Basis of Thermodynamics

The four paragraphs to follow include a paraphrase of Styer's article "Insight into Entropy" in the American Journal of Physics (1).9 In statistical mechanics, many microstates usually correspond to any single macrostate. (That number is taken to be one for a perfect crystal at absolute zero.) A macrostate is measured by its temperature, volume, and number of molecules; a group of molecules in microstates ("molecular configurations", a microcanonical ensemble) by their energy, volume, and number of molecules. In a microcanonical ensemble the entropy is found simply by counting: one counts the number W of microstates that correspond to the given macrostate10 and computes the entropy of that macrostate by Boltzmann's relationship, S = k_B ln W, where k_B is Boltzmann's constant.11

Clearly, S is high for a macrostate when many microstates correspond to that macrostate, whereas it is low when few microstates correspond to the macrostate. In other words, the entropy of a macrostate measures the number of ways in which a system can be different microscopically (i.e., molecules be very different in their energetic distribution) and yet still be a member of the same macroscopic state. To put it mildly, considerable skill and wise interpretation are required to translate this verbal definition into quantitative expressions for specific situations. (Styer's article describes some conditions for such evaluations and calculations.) Nevertheless, the straightforward and thoroughly established conclusion is that the entropy of a chemical system is a function of the multiplicity of molecular energetics. From this, it is equally straightforward that an increase in entropy is due to an increase in the number of microstates in the final macrostate. This modern description of a specifiable increase in the number of microstates (or better, groups of microstates) contrasts greatly with any common definition of disorder, even though disorder was the best Boltzmann could envision in his time for the increase in gas velocity distribution.
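To make Boltzmann's counting concrete, here is a minimal sketch in Python (an editorial addition, not part of Lambert's article). It evaluates S = k_B ln W for a toy "Einstein solid" in which q identical energy quanta are shared among N oscillators; the standard combinatorial result W = C(q + N - 1, q) counts the microstates. Spreading the same energy over more oscillators gives more microstates and therefore a higher entropy, exactly the dependence described above.

    from math import comb, log

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def einstein_solid_entropy(n_oscillators, n_quanta):
        """S = k_B ln W for n_quanta energy quanta shared among n_oscillators.

        W = C(q + N - 1, q) is the number of distinct microstates of this
        toy model; the macrostate is fixed by N and the total energy.
        """
        W = comb(n_quanta + n_oscillators - 1, n_quanta)
        return k_B * log(W)

    # The same 20 quanta dispersed among more oscillators -> more microstates
    # -> higher entropy, in line with the article's description.
    print(einstein_solid_entropy(10, 20))    # smaller S
    print(einstein_solid_entropy(100, 20))   # larger S

The particular numbers are arbitrary; the point is only that counting accessible microstates, not judging "orderliness", is what fixes S.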
There is no need today to confuse students with 19th-century ad hoc ideas of disorder or randomness and from these to create pictures illustrating "molecular disorder". Any valid depiction of a spontaneous entropy change must be related to energy dispersal on a macro scale or to an increase in the number of accessible microstates on a molecular scale.

Examples of "Disorder" as a Broken Crutch for Supporting Illustrations of Entropy

1. Entropy Change in a Metastable Solid–Liquid Mixture (1)

This example, a trivial non-issue to chemists who see phenomena from a molecular standpoint and always in terms of system plus surroundings, can be confusing to naive adults or beginning chemistry students who have heard that "entropy is disorder". It is mentioned only to illustrate the danger of using the common language word disorder. An ordinary glass bowl containing water that has cracked ice floating in it portrays macro disorder, irregular pieces of a solid and a liquid. Yet the spontaneous change in the bowl contents is toward an apparent order: in a few hours there will be only a homogeneous transparent liquid. Of course, the dispersal of energy from the warmer room surroundings to the ice in the system is the cause of its melting. However, to the types of individuals mentioned, who have little knowledge of molecular behavior and no habit pattern of evaluating possible energy interchange between a system and its surroundings, this ordinary life experience can be an obstacle to understanding. It will be especially so if disorder as visible non-homogeneity or mixed-up-ness is fixed in their thinking as a sign of spontaneity and entropy increase. Thus, in some cases, with some groups of people, this weak crutch can be more harmful than helpful.

A comparable dilemma (to those who have heard only that "entropy is disorder" and that it spontaneously increases over time) is presented when a vegetable oil is shaken with water to make a disorderly emulsion of oil in water (8b). However (in the absence of an emulsifier), this metastable mixture will soon separate into two "orderly" layers. Order to disorder? Disorder to order? These are not fundamental criteria or driving forces. It is the chemical and thermodynamic properties of oil and of water that determine such phase separation. The following examples constitute significantly greater challenges than do the foregoing to the continued use of disorder in teaching about entropy.

2. Expansion of a Gas into a Vacuum (9)

When this spontaneous process is portrayed in texts with little dots representing molecules as in Figure 1, the use of disorder as an explanation to students for an entropy increase becomes either laughable or an exercise in tortuous rationalization. Today's students may instantly visualize a disorderly mob crowded into a group before downtown police lines. How is it that the mob becomes more disorderly if its individuals spread all over the city? Professors who respond with their definition must realize that they are particularizing a common word that has multiple meanings and even more implications. As was well stated, "We cannot therefore always say that entropy is a measure of disorder without at times so broadening the definition of 'disorder' as to make the statement true by [our] definition only" (10).

[Figure 1. Expansion of a gas into a vacuum. Figure 2. More disorderly?]
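For readers who want the number behind Figure 1, here is a short check (an editorial addition using the standard ideal-gas result ΔS = nR ln(V2/V1), which the article itself does not quote; the assumption that the evacuated flask has the same volume as the filled one is mine):

    from math import log

    R = 8.314          # gas constant, J/(K mol)
    V1, V2 = 1.0, 2.0  # relative volumes; assumes the evacuated bulb equals the filled one

    # Isothermal free expansion of an ideal gas: q = 0 and T is unchanged,
    # yet the entropy rises because the same energy is spread over a larger
    # volume (equivalently, over more closely spaced translational microstates).
    dS = R * log(V2 / V1)   # per mole of gas
    print(f"dS = {dS:.2f} J/(K mol) > 0")   # about +5.76 J/(K mol)

Nothing about "order" enters the calculation; only the volume ratio, that is, how widely the unchanged energy is dispersed, matters.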
Furthermore, the naive student who has been led to focus on disorder increase as an indicator of entropy increase and is told that ∆S is positive in Figure 1 could easily be confused in several ways. For example, there has been no change in the number of particles (or the temperature or q), so the student may conclude that entropy increase is intensive (besides the Clausius equation's being "erroneous", with a q = 0). The molecules are more spread out, so entropy increase looks as if it is related to a decrease in concentration. Disorder as a criterion of entropy change in this example is even worse than a double-edged sword. How much clearer it is to say simply that if molecules can move in a larger volume, this allows them to disperse their original energy more widely in that larger volume and thus their entropy increases. Alternatively, from a molecular viewpoint, in the larger volume there are more closely spaced—and therefore more accessible—microstates for translation without any change in temperature.

In texts or classes where the quantum mechanical behavior of a particle in a box has been treated, the expansion of a gas with N particles can be described in terms of microenergetics. Far simpler for other classes is the example of a particle of mass m in a one-dimensional box of length L (where n is an integer, the quantum number, and h is Planck's constant): E = n²h²/(8mL²). If L is increased, the possible energies of the single particle get closer together. As a consequence, if there were many molecules rather than one, the density of the states available to them would increase with increasing L. This result holds true in three dimensions: the microstates become closer together, more accessible to molecules within a given range of energy.

3. Doubling the Amount of a Gas or Liquid, in Terms of Disorder

Does any text that uses disorder in describing entropy change dare to put dots representing ideal gas molecules in a square, call that molecular representation disorderly, attach it to another similar square while eliminating the barrier lines, and call the result more disorderly, as in Figure 2? Certainly the density of the dots is unchanged in the new rectangle, so how is the picture more "disorderly"? In the preceding example, if the instructor used a diagram involving molecular-dot arrangements, an implication any student could draw was that entropy change was like a chemical concentration change; entropy was therefore an intensive property. However, in this example, the disorder description of entropy must be changed to the opposite, to be extensive! With just these two simple examples, the crutch of disorder for categorizing entropy to beginning students can be seen to be broken—not just weak. (Generally, as in this example, entropy is extensive. However, its additivity is not true for all systems [11a].)

4. Monatomic Gases: Massive versus Light Atoms (1, but with Helium Atoms)

Helium atoms move much more rapidly than do atoms of krypton at the same temperature. Therefore, any student who has been told about disorder and entropy would predict immediately that a mole of helium would have a higher entropy than a mole of krypton because the helium atoms are so much more wildly ricocheting around in their container. That of course is wrong. Again, disorder proves to be a broken crutch to support deductions about entropy. Helium has a standard-state entropy of 126 J K⁻¹ mol⁻¹, whereas krypton has the greater S°, 164 J K⁻¹ mol⁻¹.
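The two standard-state values just quoted can be reproduced from the atomic masses alone. The sketch below is an editorial addition, not part of Lambert's article: it evaluates the Sackur–Tetrode expression for the translational entropy of a monatomic ideal gas, assuming the quoted S° values refer to 298.15 K and 1 bar.

    from math import log, pi

    h = 6.62607015e-34      # Planck constant, J s
    k_B = 1.380649e-23      # Boltzmann constant, J/K
    N_A = 6.02214076e23     # Avogadro constant, 1/mol
    R = k_B * N_A           # gas constant, about 8.314 J/(K mol)
    u = 1.66053907e-27      # atomic mass unit, kg

    def sackur_tetrode(mass_u, T=298.15, p=1.0e5):
        """Molar translational entropy of a monatomic ideal gas, J/(K mol).

        S = R * ( ln( (k_B*T/p) * (2*pi*m*k_B*T/h**2)**1.5 ) + 5/2 )
        """
        m = mass_u * u
        v_per_particle = k_B * T / p                      # V/N at pressure p
        q_trans = (2 * pi * m * k_B * T / h**2) ** 1.5    # 1 / (thermal wavelength)^3
        return R * (log(v_per_particle * q_trans) + 2.5)

    print(f"He: {sackur_tetrode(4.0026):.0f} J/(K mol)")   # ~126, as quoted above
    print(f"Kr: {sackur_tetrode(83.798):.0f} J/(K mol)")   # ~164, as quoted above

The heavier atom wins solely because its larger mass packs the translational energy levels more densely, the same point the next paragraph makes qualitatively.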
The molecular thermodynamic explanation is not obvious, but it fits with energetic considerations, whereas "disorder" does not. The heavier krypton actually does move more slowly than helium. However, krypton's greater mass, and greater range of momenta, results in closer spacing of energy levels and thus more microstates for dispersing energy than in helium.

5. The Crystallization of Supercooled Water, a Metastable System

NOTE: In this example and the one that follows, students are confused about associating entropy with order arising in a system only if they fail to consider what is happening in the surroundings (and that this includes the solution in which a crystalline solid is precipitating, prior to any transfer to the environment). Thus they should be repeatedly reminded to think about any observation as part of the whole, the system plus its surroundings. When orderly crystals form spontaneously in these two examples, focusing on entropy change as energy dispersal to or from a system and its surroundings is clearly a superior view to one that depends on a superficiality like disorder in the system (even plus the surroundings). Example 7 is introduced only as a visual illustration of the failure of order–disorder as a reliable indicator of entropy change in a complex system.

Students who believe that spontaneous processes always yield greater disorder could be somewhat surprised when shown a demonstration of supercooled liquid water at many degrees below 0 °C. The students have been taught that liquid water is disorderly compared to solid ice. When a seed of ice or a speck of dust is added, crystallization of some of the liquid is immediate. Orderly solid ice has spontaneously formed from the disorderly liquid. Of course, thermal energy is evolved in the process of this thermodynamically metastable state changing to one that is stable. Energy is dispersed from the crystals, as they form, to the solution, and thus the final temperature of the crystals of ice and liquid water is higher than originally. This, the instructor ordinarily would point out as a system–surroundings energy transfer. However, the dramatic visible result of this spontaneous process is in conflict with what the student has learned about the trend toward disorder as a test of spontaneity. Such a picture might not take a th ...
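To close the quantitative thread running through examples 2 and 4, here is a small editorial sketch (not part of the article; the particle masses and box lengths are arbitrary choices of mine) showing how the particle-in-a-box levels E = n²h²/(8mL²) crowd together when either the box length L or the mass m is increased, which is exactly why both expansion and heavier atoms mean more accessible microstates.

    h = 6.62607015e-34   # Planck constant, J s
    u = 1.66053907e-27   # atomic mass unit, kg

    def level_spacing(mass_kg, box_length_m):
        """Gap E2 - E1 for a particle in a 1-D box, E_n = n**2 * h**2 / (8*m*L**2)."""
        def E(n):
            return n**2 * h**2 / (8 * mass_kg * box_length_m**2)
        return E(2) - E(1)

    # Larger box (gas expansion) or heavier atom (He -> Kr): the levels get
    # closer, so more microstates fall within the same thermal energy range.
    print(level_spacing(4.003 * u, 1e-8))    # He in a 10 nm box
    print(level_spacing(4.003 * u, 2e-8))    # same atom, box doubled: gap / 4
    print(level_spacing(83.80 * u, 1e-8))    # Kr in the 10 nm box: gap / ~21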

Final Answer

The answer has been edited slightly; I have updated the following answer and marked it as the final one. Please use this answer instead. Thank you.

Running head: READING REFLECTION

Reading Reflection
Course’s Name
Student’s Name
Professor’s Name
Institution
Due Date


Reading Reflection
The author's purposes in writing his article were to provide a better and more accurate visualization of the increase in entropy and to point out that the definition of entropy in many textbooks is incorrect, and these purposes were started by decry...
