
Thermodynamics: Building Blocks

Quantum Basis


Quantum Mechanics governs the microscopic behavior of particles and atoms and their interactions. The results of Classical Mechanics are true only because they are the statistical averages over the quantum behavior underlying the system.

Similarly, we can gain a better understanding of Thermodynamics and the explanations for the macroscopic behavior of systems by first understanding how systems behave on the microscopic, quantum level.

Before we look at the fundamental assumption of Thermodynamics, we must first define a few terms that are crucial to understanding what it says. The term closed system refers to a system that maintains a constant number of particles, constant energy, and constant volume, and that is free from any change in external influences, such as an oscillating magnetic field. A quantum state is the smallest collection of information about a system that specifies it completely. For example, in classical mechanics one need only specify the position and momentum of a particle to fully describe its behavior for all time; this collection of data defines the state of the system.

The Fundamental Assumption

The Fundamental Assumption states that any closed system has an equal probability to be in any of its possible quantum states.

The Fundamental Assumption is quite simple--there is no reason that a system would prefer a given state over any other state, provided both are possible. However, the statement is powerful in that we can now count the states available to a system and subsequently make statements about the probability of being in a particular state. We will investigate this application through a quantum model of spins.

Binary Systems

Let us suppose that we have a system consisting of N magnets, each of which is localized at its own site. Each magnet has a magnetic moment of magnitude m; think of each magnet as a vector of magnitude m. We won't focus on the details of the electromagnetism here but on the statistics that govern the system.

Calling the system binary means that each magnet can be oriented either in the "up" position or the "down" position, and no other. If a magnet is in the down position, we say that its magnetic moment is -m; if up, it is +m. The magnets do not interact with each other; that is, the orientation of a magnet's neighbors does not influence its own. A sample collection of such magnets is shown in the figure below.

Figure: Binary System of Magnets

Magnetic moments add together just as vectors do. Therefore we can ask: how many ways are there to have a total magnetic moment M = Nm? Such a state would require all of the magnets to be in the up position, so there is only one way to achieve it. How many ways are there to have a total magnetic moment of M = (N - 2)m? Such a state requires exactly one magnet to be in the down position; since there are N magnets to choose from, there are N such ways.
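To make this counting concrete, here is a minimal sketch (not from the original article) that enumerates every configuration of a small binary system by brute force and checks the two counts above; the moment magnitude m is set to 1 for simplicity.

```python
# Brute-force enumeration of a small binary spin system (illustrative sketch).
from itertools import product

N = 4          # number of magnets (small enough to enumerate all 2**N states)
m = 1          # magnetic moment magnitude

# Each state is a tuple of +m / -m values, one per site.
states = list(product([+m, -m], repeat=N))

# Count states by their total magnetic moment M.
counts = {}
for state in states:
    M = sum(state)
    counts[M] = counts.get(M, 0) + 1

print(counts.get(N * m, 0))        # 1 -> only one way to get M = N*m
print(counts.get((N - 2) * m, 0))  # 4 -> N ways to get M = (N - 2)*m
```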

Letting C represent the up position and D represent the down, we can use a shorthand notation for representing all of the possible states of the system:

$$(C + D)^N$$

Using a binomial expansion, and writing in summation notation, we can write:

$$(C + D)^N = \sum_{i=0}^{N} \binom{N}{i}\, C^{N-i} D^{i}$$

The Multiplicity Function

Usually we are interested not in writing out a general form for all states, but in one particular state. As we saw above, there can be multiple states with the same number of spins in the up position. Let N_up be the number of magnets in the "up" state and N_down be the number in the "down" state (so that N = N_up + N_down). We refer to the number of states sharing the same values of N and N_up as the multiplicity function, g(N, N_up). For our system, g(N, N_up) is given by the corresponding coefficient in the preceding sum:

$$g(N, N_{\text{up}}) = \binom{N}{N_{\text{up}}} = \frac{N!}{N_{\text{up}}!\, N_{\text{down}}!}$$

Notice that for very large and very small values of N_up, g is small, but for N_up = N_down, g is a maximum.
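The following sketch (not from the original article) evaluates the multiplicity function with Python's built-in binomial coefficient and shows the behavior just described: g is 1 at the extremes and peaks in the middle.

```python
# Multiplicity function g(N, N_up) = N! / (N_up! * N_down!) via math.comb.
from math import comb

def multiplicity(N, N_up):
    """Number of states of N binary spins with exactly N_up spins up."""
    return comb(N, N_up)

N = 10
for N_up in range(N + 1):
    print(N_up, multiplicity(N, N_up))
# Output is 1 at N_up = 0 and N_up = N, and largest (252) at N_up = N_down = 5.
```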

Probability

You may be wondering what this has to do with thermodynamics. In large systems, we can't know exactly what each particle is going to do for all time. To describe just one state of the gas molecules in a room, for example, we would need to specify over $10^{26}$ variables, namely the position and velocity of each molecule, and that ignores internal degrees of freedom such as rotation! Therefore when we look at large systems, we need to look at probability and ask which states and configurations are most likely.

Consider a system with g possible states. Let s be a label that indicates the state of a given system. Then let P(s) be the probability that the system is in state s , given by:

$$P(s) = \frac{1}{g}$$

Therefore, if there are 100 possible states, the probability of finding the system in any particular one of them is P(s) = 1/100 = 0.01.

Notice that the probability that the system is in some state, any state, must be one, since we are certain to find the system in one of the g states at any time. Therefore, we write:

$$\sum_{s} P(s) = 1$$

Average Value

A key application of probability in thermodynamics is determining the average value of a property. Let A represent any property of a system; A could be the energy, the number of particles, etc. We use the notation $\langle A \rangle$ to denote the average value of A. Then we can write:

$$\langle A \rangle = \sum_{s} A(s)\, P(s)$$
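As an illustration (not from the original article), the sketch below computes the average total moment $\langle M \rangle$ of the binary spin model using the uniform probability P(s) = 1/g that follows from the Fundamental Assumption, again with m = 1.

```python
# Average total moment <M> = sum_s M(s) P(s) under uniform probabilities.
from itertools import product

N, m = 6, 1
states = list(product([+m, -m], repeat=N))
g = len(states)                 # total number of states, 2**N
P = 1.0 / g                     # uniform probability of each state

avg_M = sum(sum(state) * P for state in states)
print(avg_M)   # 0.0 -- up and down are equally likely, so the average moment vanishes
```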

Ensembles

One helpful way to make sense of average values and probabilities is in terms of an ensemble of systems. If there are g possible states, imagine g identical systems, each an exact copy of the original. (Note that the systems being identical does NOT mean that they are all in the same state at the same time.) Then to say that A takes on an average value of $\langle A \rangle$ is to say that adding up the values of A in each of the systems in the ensemble, and dividing by the number of such systems, yields $\langle A \rangle$.
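One way to picture this numerically (an illustrative sketch, not from the original article) is to build a large ensemble of copies, assign each copy one of the possible states uniformly at random, and average a property over the copies; for a large ensemble the result approaches the exact $\langle M \rangle$ computed above.

```python
# Ensemble picture: average the total moment over many uniformly sampled copies.
import random
from itertools import product

N, m = 6, 1
states = list(product([+m, -m], repeat=N))

ensemble_size = 100_000
ensemble = [random.choice(states) for _ in range(ensemble_size)]

ensemble_avg = sum(sum(member) for member in ensemble) / ensemble_size
print(ensemble_avg)   # close to the exact value <M> = 0 for a large ensemble
```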
