In 1859, the physicist Gustav Kirchhoff introduced an interesting
problem into the world of physics: the question of blackbody radiation. A
"blackbody" is basically a black box that absorbs all the radiation
that is directed toward it. The amount of energy that it emits
is independent of the size or shape of the box; it depends only
on temperature.
For decades, physicists worked to figure out the relationship between
the temperature of the blackbody and the distribution of the emitted
energy along the electromagnetic spectrum. This was of particular
interest to theorists because finding the relationship could yield
valuable physical constants that could then be applied to other
physics concerns. However, there was a more concrete and technical
reason to search for a formula relating energy to temperature.
Such an equation could be used as a standard for rating the strength
of electric lamps.
For this reason, the imperial bureau of standards, the
Physikalisch-Technische Reichsanstalt, took a special interest in
finding the formula. And, in 1896, a young German physicist working there,
Wilhelm Wien, seemed to have stumbled onto an equation that worked.
With the knowledge of the spectral distribution of the energy at
one temperature, Wien's equation would produce the distribution
for any other temperature. It was an experimentally accurate theory,
but Wien had no explanation for why his equation worked;
he knew only that it did.
Meanwhile, Planck was hired to take Kirchhoff's old job
at the University of Berlin. Planck spent much of the 1890s studying
problems of chemical thermodynamics,
specifically entropy. His work in this field led him to the puzzle
of blackbody radiation, and he set himself the goal of finding
a workable theory that would yield Wien's equation.
But just as Planck thought he'd found the answer, a series
of experiments proved that Wien's equation was actually incorrect. Rather
than assuming his theory was correct and hoping the empirical data
would eventually prove him right, Planck chose to trust the experimental
results: Wien's theory was wrong, which meant Planck's was, too.
So, in 1900, Planck was forced to start all over again.
At this point, Planck took a revolutionary step, although
he didn't realize it at the time. Unable to get the numbers to
work any other way, he made a bold assumption: Planck posited that
energy was emitted by the black box in tiny, finite packets. This
was an unprecedented move, as it had always been assumed that energy came
in an unbroken, continuous wave, not in a series of discrete energy
packets. But the assumption led Planck to an equation that worked,
the equation that would make him famous: E = hν.
In this equation, E stands for the energy of a single packet of
emitted radiation, ν is the frequency of the light, and h is a
constant that came to be known as "Planck's constant." If Planck
was right, then energy could only be emitted in certain units: whole-number
multiples of hν. Planck called these units "quanta," from the Latin for "how much."
This equation challenged everything that had been previously thought about
energy. But no one, not even Planck, realized this at the time.
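To get a feel for the scale of these quanta, here is a small back-of-the-envelope calculation in Python. It is only an illustrative sketch, and it uses the modern value of Planck's constant rather than the figure Planck obtained in 1900.

    # Energy of one quantum of green light, using E = h * nu (modern constants)
    h = 6.626e-34   # Planck's constant, in joule-seconds
    nu = 5.6e14     # frequency of green light (about 540 nm), in hertz
    E = h * nu
    print(E)        # roughly 3.7e-19 joules per quantum

On Planck's assumption, light of that frequency could only carry energy in whole-number multiples of this minuscule amount.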
Planck's equation worked, and by 1908, everyone in the
field had accepted it, but even the best physicists of the time
failed to see its implications. Like Planck, they considered the
quantum assumption to be nothing more than a convenience, a mathematical abstraction
with no consequences for the real world.
Despite this oversight, Planck's work was impressive enough
to draw the attention and admiration of his peers. The new equation would,
in itself, have been enough to make Planck's career. Planck's theory
also yielded two new universal constants: h, which relates the energy of
a quantum to its frequency, and K, which relates energy to temperature.
Planck called K "Boltzmann's constant," a gesture
of appreciation to Ludwig Boltzmann, whose theories had led Planck
to his own grand solution. In 1900, the value of h meant little
to physicists, but K meant a great deal.
Knowing that such a constant as K existed,
physicists had already written down a relation for a standard unit
of gas: its pressure, multiplied by its volume, equals LKT. In this
equation, L stands for the number of molecules in a standard unit of
gas and T stands for the absolute temperature of the gas. They knew
that the number of molecules and the temperature of a gas were directly
related to the pressure it exerted, but they couldn't put numbers to
the relation, since the values of both L and K were a mystery.
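In modern notation, this relation is the ideal-gas law for one mole of gas. The short sketch below simply checks the arithmetic in Python with today's values for L and K, the very numbers that were still missing in 1900 (Boltzmann's constant is written as a lowercase k, as is now conventional).

    # Ideal-gas relation: pressure * volume = L * k * T for one mole of gas
    k = 1.381e-23   # Boltzmann's constant (the text's K), in joules per kelvin
    L = 6.022e23    # number of molecules in one mole (the text's L)
    T = 273.15      # absolute temperature, in kelvin (0 degrees Celsius)
    V = 0.0224      # volume of one mole at this temperature and 1 atm, in cubic meters
    p = L * k * T / V
    print(p)        # about 1.0e5 pascals, i.e., roughly one atmosphere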
Thanks to Planck, physicists could finally derive a value
for L. And knowing L eventually
led to even more discoveries, including a theoretical confirmation
of the charge of a single electron. This was one of the earliest
connections physicists were able to make between electrodynamics
and atomic theory, and bridging the gap between these two fields
had been one of Planck's highest goals.
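That chain of reasoning can be sketched numerically. The figures below are modern values standing in for the ones Planck actually computed in 1900, so this is an illustration of the logic rather than a reconstruction of his numbers: the measured gas constant R for one mole fixes L once K is known, and the charge carried by one mole of ions (Faraday's constant) then fixes the charge of a single electron.

    # From Boltzmann's constant to the charge of the electron (modern stand-in values)
    R = 8.314       # gas constant for one mole, in joules per kelvin
    k = 1.381e-23   # Boltzmann's constant, from the blackbody fit
    F = 96485.0     # Faraday's constant: charge carried by one mole of ions, in coulombs
    L = R / k       # number of molecules in a mole: about 6.0e23
    e = F / L       # charge of a single electron: about 1.6e-19 coulombs
    print(L, e)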
He wasn't the only one with this goal. As the impact of
Planck's work grew and grew, his peers sat up and took notice.
In 1908, Planck was nominated for the Nobel Prize in Physics for
the discovery of his two constants and the E = hν
formula itself. But Planck's nomination was voted down, not because
his work wasn't significant enough, but because someone had finally
grasped just how far its implications reached. It was pointed
out to the Nobel committee that Planck's equation implied that
energy did not come in a continuum, and, horrified by the thought,
the committee declined to award Planck the prize. Instead, the
1908 Nobel Prize went to Gabriel Lippmann, for his work in the new
field of color photography.
Though he lost the prize in 1908 for being too revolutionary, more
than ten years later, Planck would finally win his Nobel, not in spite
of the revolution his theory was about to cause, but because of it.