In the mid-nineteenth century, Scottish physicist James
Clerk Maxwell discovered that electric and magnetic forces are unified
in the electromagnetic field. Maxwell calculated that electromagnetic
waves always travel at a single fixed speed: the speed of light.
Visible light is just another kind of electromagnetic
wave, and stationary light cannot exist.

This formulation troubled young Albert Einstein. What
happens, he wondered, if we chase after a beam of light at light
speed? After a decade of contemplating Maxwell’s definitions of
light and motion, in June 1905 Einstein found a way of understanding
how the world appears to observers who are moving relative to each other.
He concluded that a moving observer’s clock runs more slowly
than a stationary observer’s; this effect is called *time
dilation*. Likewise, a moving object is shortened along its
direction of motion, an effect called the *Lorentz contraction*. This answer, formulated
when Einstein was twenty-six years old, upended all traditional
understandings of space and time.
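Both effects follow from a single quantity, the Lorentz factor. Here is a minimal numerical sketch; the 0.8c speed and the sample clock and rod values are illustrative choices of mine, not figures from the text:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v):
    """Lorentz factor 1/sqrt(1 - v^2/c^2); grows without bound as v approaches c."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

gamma = lorentz_gamma(0.8 * C)      # exactly 5/3 at 80% of light speed
traveler_seconds = 10.0             # time elapsed on the moving clock
stationary_seconds = gamma * traveler_seconds  # time the stationary observer measures
rod_at_rest = 1.0                   # meters, measured at rest
rod_moving = rod_at_rest / gamma    # contracted length seen by the stationary observer
print(gamma, stationary_seconds, rod_moving)
```

The same factor stretches durations (time dilation) and shrinks lengths (Lorentz contraction).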

The discrepancy between the moving observer and the stationary observer underlies Einstein’s theory of special relativity, which says that if you want to measure speed accurately, you must always specify who is doing the measuring. Why? Because, as Einstein showed, motion is always relative. There is no such thing as an absolute frame of reference for objects moving through space. Force-free motion has meaning only when compared to other motions (general relativity would later extend the same claim to accelerated motion). At the crux of relativity is the idea that observers in relative motion need not agree on which events are simultaneous.

With a series of helpful examples, Greene shows that relativity
is a difficult concept to understand on an intuitive level. People
must give up the notion that all observers, regardless of their
state of motion, can see things simultaneously. According to the
special theory of relativity, events that are simultaneous for one
observer need not be simultaneous for another, depending on both
observers’ states of motion. Relativity hinges on a complete symmetry between observers.

One important exception to relativity is the constancy
of the speed of light. Light travels at 670 million miles an hour
(186,000 miles per second) *no matter what*. The
importance of this discovery cannot be overemphasized. It answered
Einstein’s adolescent question: no matter how fast you chase after
a light beam, it will still retreat at light speed. The discovery
of this constant led to a complete overhaul of physicists’ understanding
of the universe and, in time, to the undoing of Newtonian mechanics.
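The two figures quoted above are the same speed in different units; a one-line arithmetic check:

```python
miles_per_second = 186_000                  # speed of light, as quoted in the text
miles_per_hour = miles_per_second * 3_600   # 3,600 seconds in an hour
print(miles_per_hour)                       # 669,600,000 -- about 670 million mph
```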

Time is measured by clocks, but because motion influences the passage of time, a “universal clock” cannot exist. Time passes more slowly for an individual in motion than it passes for an individual at rest. This principle applies not just to ticking clocks, but also to human activity and the aging of the body. Muons moving at high speed disintegrate more slowly than those at low speed, but (and here is the paradox) both particles experience exactly the same quantity of life. To understand this concept, think of a person who lives for 500 years and reads ten times more slowly than a person who lives for fifty years. Although the slow reader lives much longer than the fast reader, both read exactly the same number of books.
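The muon example can be made concrete. A minimal sketch, assuming the textbook value of about 2.2 microseconds for a muon’s mean lifetime at rest (a standard figure, not taken from this summary):

```python
import math

C = 299_792_458.0            # speed of light, m/s
MUON_LIFETIME_REST = 2.2e-6  # mean lifetime of a muon at rest, seconds (standard value)

def dilated_lifetime(v):
    """Lab-frame lifetime of a muon moving at speed v, via time dilation."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * MUON_LIFETIME_REST

slow = dilated_lifetime(0.10 * C)  # barely dilated
fast = dilated_lifetime(0.99 * C)  # roughly seven times longer, as seen from the lab
print(slow, fast)
```

In its own frame each muon still lives 2.2 microseconds; only the lab’s bookkeeping changes, just as the slow reader finishes the same number of books.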

Einstein’s famous equation, *E* = *mc*²,
demonstrated that energy (*E*) is equivalent to mass
(*m*) multiplied by the speed of light squared. His
special theory of relativity showed that space and time, rather
than being separate and autonomous, are in fact entwined and mutually
dependent, or relative. The faster something moves, the more energy
it gains; the more energy something has, the more massive it grows.
Greene uses the expression “convertible currencies” to show that
energy and mass, like dollars and euros, can be converted into
one another. But unlike money, the “exchange rate” between
energy and mass is fixed: it is the speed of light squared (*c*²).
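As a small illustration of that fixed exchange rate (the one-kilogram example is my own, not Greene’s):

```python
C = 299_792_458.0  # speed of light, m/s

def rest_energy(mass_kg):
    """E = m * c**2: the energy equivalent of a mass, in joules."""
    return mass_kg * C ** 2

e = rest_energy(1.0)  # one kilogram of mass
print(e)              # roughly 9e16 joules
```

Because *c*² is so enormous, a tiny amount of mass corresponds to a vast amount of energy.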

Einstein exposed the shortcomings of our intuitions about motion and transformed our understanding of space and time. But resolving the conflict between our intuition about motion and the constancy of the speed of light was only the first of Einstein’s problems. His conclusion that nothing can outrun light stood in direct contradiction to Isaac Newton’s long-accepted universal theory of gravity. It took Einstein another decade to come up with his general theory of relativity, which showed how space and time warp to create gravity.

In the seventeenth century, Newton modernized methods of scientific research by rigorously applying mathematical principles to the physical world. Newton considered gravity the “great equalizer,” arguing that everything in the physical universe exerts an attractive gravitational force on everything else. He wrote equations showing that the gravitational force between two objects is directly proportional to the product of their masses and inversely proportional to the square of the distance between them.
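Newton’s inverse-square law is easy to state in code. A minimal sketch; the sample masses and distances are arbitrary values of mine, chosen only to show that doubling the separation cuts the force to a quarter:

```python
G = 6.674e-11  # gravitational constant, N*m^2/kg^2

def gravitational_force(m1, m2, r):
    """Newton's law: force proportional to the product of the masses,
    inversely proportional to the square of the distance."""
    return G * m1 * m2 / r ** 2

f_near = gravitational_force(5.0, 10.0, 2.0)  # two small masses, 2 m apart
f_far = gravitational_force(5.0, 10.0, 4.0)   # same masses, twice as far apart
print(f_near / f_far)                         # 4.0: doubling distance quarters the force
```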

Early on, Einstein understood that this Newtonian law
of gravity was inconsistent with special relativity, which hinges
on the constancy of the speed of light. If no information can be
transmitted instantaneously because nothing travels faster than the
speed of light, then something had to be wrong with Newton’s conception
of gravity as an *instantaneous* effect.
Newton’s law directly contradicted this foundational principle of
special relativity.

Einstein saw that for all the brilliance of Newton’s theories and mathematical proofs concerning how objects behave under gravity, Newton had failed to explain what gravity was. Newton understood gravity’s effects, but not its inner workings; he himself suspected that gravity must be conveyed by some agent rather than being a bare force acting at a distance. Einstein suggested that gravity was not, in fact, a force, but a distortion of space that guides objects such as planets into orbital paths around the sun.

How do objects experiencing accelerated motion,
Einstein asked, complicate our understanding of gravity? Gravity
is mysterious, but accelerated motion isn’t. Einstein made his first
breakthrough on this subject in 1912, when he first established
the ways in which gravity and accelerated motion resemble each other.
If accelerated motion warps space and time (as special relativity showed),
then gravity might perform exactly the same function. Einstein found
that it is impossible to distinguish between uniformly accelerated
motion and gravity; he called this discovery the *equivalence
principle*. (To understand this principle, think of standing
in an elevator that is accelerating upward. The force you would
feel on your feet would be virtually indistinguishable from gravity.)

General relativity puts all possible observational vantage points on equal footing. The connection between accelerated motion and gravity is what led Einstein to an understanding of general relativity. Einstein realized that since no discernible difference exists between accelerated motion and gravity, all observers, regardless of their state of motion, can claim that they are at rest and that the world is moving past them. If a person is in an elevator accelerating upward, for example, she can say that she is stationary and that a force of gravity is pulling her downward.

Matter, Einstein declared, is what creates curves in spacetime. As a thin membrane would be distorted by the bulk of a bowling ball, the fabric of space is distorted by the presence of a massive object like the sun. The shape of that distortion determines the earth’s motion and much else besides. This is how Einstein isolated the mechanism by which gravity is transmitted: he showed that space, rather than being a passive background for the universe’s movements, responds to the objects within it. Both time and space are warped by the objects moving within them. Einstein equated this warping with gravity. At the time, this theory was extremely radical.

Einstein’s theory of general relativity predicts that the sun will warp the space and time surrounding it, and that this warping will alter the path of starlight passing nearby. In 1919, Sir Arthur Eddington tested Einstein’s prediction during a solar eclipse. Eddington’s methods were later called into question, but at the time, it was believed that he had confirmed Einstein’s prediction. Einstein, the onetime Swiss patent clerk, had arrived at his hour of glory.

Karl Schwarzschild, studying Einstein’s equations, predicted the existence of black holes: compressed stars with all-consuming gravitational fields. Objects can escape a black hole’s rapacity if they stay at a safe distance from its event horizon, but matter that strays too close will fall in. Nothing can escape the black hole, not even light; hence its name. Evidence suggests that there is a black hole at the center of the Milky Way galaxy that is 2.5 million times more massive than the sun, and many scientists believe much larger ones exist.

General relativity also has some bearing on the origin of the universe. Working with the geometry of nineteenth-century mathematician Georg Bernhard Riemann, Einstein found that his equations implied a universe that was growing larger. Disturbed by this implication, Einstein returned to his equations and added a cosmological constant, which restored the illusion of a spatially static universe. Twelve years later, however, American astronomer Edwin Hubble showed decisively that the universe was in fact expanding. Einstein called his imposition of the cosmological constant the biggest blunder of his life.

The universe, ever-expanding, began as a point (or something like it) in which all matter was compressed with incredible density. Then a cosmic fireball, known as the big bang, exploded. From that event, the universe as we know it evolved.

But before we can embrace the huge complexity and significance of general relativity, we must confront the stumbling block that Greene describes as the central conflict of modern physics: the fact that general relativity is incompatible with quantum mechanics. This incompatibility prevents physicists from truly understanding what occurred at the instant of the big bang. It also points to a defect in our formulation of nature’s inner workings.

Before explaining exactly how general relativity is inconsistent
with quantum mechanics, Greene first introduces the intricacies
of quantum mechanics. He describes in great detail the astonishing
qualities that the universe exhibits when it is studied at the atomic
and subatomic levels—so astonishing, in fact, that physicists still
haven’t made sense of them. At the beginning of the twentieth century,
German physicist Max Planck first began to lay
out a conceptual framework to describe how the universe operated
in the microscopic realm. By 1928, most of the mathematical equations
for quantum mechanics had been laid out, but to this day very few
scientists fully grasp why quantum mechanics works. Many basic concepts
in our everyday world lose all meaning on microscopic scales, and
quantum physics is even more difficult to understand than general
relativity. Niels Bohr, one of the pioneers of quantum physics,
once said that if you don’t get dizzy when thinking about quantum
mechanics, then you haven’t really understood it.

Greene reviews the first paradox of quantum mechanics: according to classical physics, the total energy inside a hot oven at any given temperature should be infinite. So why doesn’t matter radiate infinite energy all the time? Because, Greene explains, energy comes in specific denominations, or “lumps”; fractions aren’t allowed. Only waves with a whole number of peaks and troughs are allowed, and classically each of the allowed waves, regardless of wavelength (which is defined as the distance between the wave’s successive peaks or troughs), would carry the same amount of energy.

A wave’s minimum energy is proportional to its frequency,
which means that long-wavelength radiation has less energy per lump than
short-wavelength radiation. Waves whose minimum lump exceeds the energy
they are supposed to contribute can make no contribution at all.
Planck’s constant (which Greene writes as an “h-bar”) is the proportionality factor
between the frequency of a wave and the minimal amount of energy
it can have: in everyday units, the h-bar comes to about a billionth
of a billionth of a billionth, which means that the energy lumps
involved are extremely tiny.
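In standard notation, the minimum energy lump of a wave of frequency f is E = h × f. A small numerical sketch; the red and blue frequencies are standard approximate values, not figures from the text:

```python
H = 6.626e-34  # Planck's constant, joule-seconds (standard value)

def min_energy(frequency_hz):
    """Smallest energy lump a wave of this frequency can carry: E = h * f."""
    return H * frequency_hz

red = min_energy(4.3e14)   # red light, roughly 430 THz
blue = min_energy(7.5e14)  # blue light, roughly 750 THz
print(red, blue)           # blue photons carry more energy per lump than red
```

The lumps are on the order of 10⁻¹⁹ joules, which is why the graininess of energy is invisible in everyday life.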

At the turn of the twentieth century, Planck’s calculations showed that this lumpiness prevented the possibility of infinite total energy. This strange discovery—or, more accurately, educated guess—precipitated the collapse of classical physics.

Einstein worked very hard to incorporate Planck’s lump
description of energy into a new description of light. A light beam,
Einstein declared, should be conceived of as a packet, or stream,
of light particles, which are also known as photons. Einstein then
demonstrated that Planck’s description of energy lumps reflects
a basic feature of electromagnetic waves: they are made up of photons
that are actually little packets of light, which came to be known
as *quanta*. By introducing photons, Einstein—the
scientist who toppled Newton’s theory of gravity—revived Newton’s
long-since-discredited particle model of light. In the early nineteenth
century, English physicist Thomas Young had disproved Newton’s hypothesis
by showing an interference pattern, which suggested that light had
wave properties. Later, scientists found that the interference pattern
persists even if the photons pass through one at a time. The photon
picture also explained the photoelectric effect: it is the color
of the light, and not its intensity, that determines whether or not the
effect occurs.

Einstein’s particle model of light differed from Newton’s
in one key respect: Einstein argued that photons were particles *and* had wavelike
features. The intuition that something must be either a wave or
a particle is incorrect. Light has both wavelike and particle-like
properties.

In 1923, Prince Louis de Broglie studied Einstein’s theory of wave-particle duality of light and proposed that all matter has this dual quality. Several years later, Clinton Davisson and Lester Germer proved experimentally that electrons—normally thought to be straightforward particles—also exhibit interference phenomena, which again suggests the existence of waves. Davisson and Germer’s experiment corroborated de Broglie’s suggestion by showing that all matter has a wavelike character and exhibits the same curious duality that light does.

Erwin Schrödinger suggested that waves were really “smeared” electrons.
In 1926, the German physicist Max Born built on Schrödinger’s idea
and in the process introduced one of the most bizarre aspects of
quantum theory, asserting that electrons and matter in general must
be considered in terms of *probability*. If matter
is composed of waves, then it can only be described in terms of
probability. Probability waves came to be known as wave functions.

If we follow Born’s theory to its logical conclusion, we see that quantum mechanics can never predict the exact outcomes of experiments; scientists can only perform the same trials over and over again until arriving at a set of laws. Einstein thought this conclusion was too random and vague to accept, so he dismissed it with one of his most famous lines: “God does not play dice with the universe.” Einstein decided that Born’s probability thesis indicated a defect in human understanding.

In subsequent years, experiments have invalidated Einstein’s
skepticism, but to this day, scientists argue about what all this
randomness means. In the years following World War II, Richard Feynman clarified
the probabilistic core of quantum mechanics. He believed that attempts
to localize an electron perturb it and change the direction of its
movement and, consequently, the outcome of the experiment. Revisiting
Thomas Young’s nineteenth-century double-slit experiment, which
had initially established the wave nature of light, Feynman challenged
the basic classical assumption that each electron goes through either
the right or the left slit. Feynman declared instead that each electron
that reaches the phosphorescent screen goes through *both* slits,
traveling along every possible path simultaneously. Feynman knew
that, from a logical standpoint, his suggestion would strike many
doubters as absurd, but he himself was able to embrace the chaos
and absurdity of nature. (Feynman’s idea, we will see, was an important
precursor to string theory.)

Feynman’s conclusion was quite odd—and it is another reason quantum
mechanics remains so difficult to grasp on a visceral level. Only
the *uncertainty principle*, which German physicist
Werner Heisenberg discovered in 1927, supplies an intuitive toehold. Greene
thinks that the uncertainty principle is the single weirdest—and
most evocative—feature of quantum mechanics, so it’s worth describing
in some detail.

The uncertainty principle states that the more precisely a particle’s position is known, the less precisely its momentum is known, and vice versa: it is impossible to know both the position and the velocity of a particle simultaneously with perfect accuracy. In broader terms, the act of precisely measuring one of a pair of complementary magnitudes (position and momentum, say, or energy and duration) inevitably blurs the other. It is therefore impossible ever to know all of a particle’s features with absolute precision.
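The position–momentum trade-off has a standard quantitative form, Δx·Δp ≥ ħ/2. Here is a sketch of what it implies for an electron pinned down to atomic size; the constants are standard values, and the scenario is my own illustration, not one from the text:

```python
HBAR = 1.0546e-34          # reduced Planck constant, J*s (standard value)
ELECTRON_MASS = 9.109e-31  # kg (standard value)

def min_momentum_spread(delta_x):
    """Heisenberg bound: delta_p >= hbar / (2 * delta_x)."""
    return HBAR / (2.0 * delta_x)

# Confine an electron to atomic size (~1e-10 m): its momentum becomes
# uncertain by at least dp, i.e. a velocity spread of hundreds of km/s.
dp = min_momentum_spread(1e-10)
dv = dp / ELECTRON_MASS
print(dp, dv)
```

The tighter the confinement, the wilder the motion, which is the quantitative root of the “quantum claustrophobia” discussed below.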

An effect known as *quantum tunneling* springs
from the uncertainty principle. Quantum tunneling allows a particle
lacking the requisite energy to overcome a barrier to borrow energy,
as long as the energy is swiftly restored to its original source.

In extreme conditions, when things are either extremely massive or extremely minuscule—for example, near the center of a black hole (huge), or the entire universe at the moment of the big bang (tiny)—physicists must draw upon both general relativity and quantum mechanics for explanations. At such drastic scales, neither theory is adequate by itself. For this reason, physicists are working to develop a quantum mechanical version of general relativity.

Heisenberg’s *uncertainty principle* marked
a great revolution in the history of physics. The uncertainty principle
describes the universe as more and more frenzied when examined on
smaller and smaller distance scales and shorter and shorter time scales.
The principle doesn’t apply only in experimental conditions—that
is to say, it isn’t merely an artifact of physicists tampering with nature
by trying to make measurements, as Feynman emphasized. The uncertainty
principle is intrinsic to nature and always in
action, even in the most serene conditions imaginable.

*Quantum claustrophobia* occurs even in
seemingly empty regions of space. On a microscopic level, there
is always a tremendous amount of activity, and it becomes increasingly
agitated as distance and time scales shrink. True emptiness
does not exist anywhere in the universe.

Three highly successful theories form the *standard
model* of particle physics. The only trouble with the standard
model is that it conspicuously excludes gravity from its framework.

The Schrödinger wave equation was approximate from the outset and breaks down in extreme microscopic regimes. Originally, Schrödinger tried to incorporate special relativity into his conception of quantum mechanics, but he couldn’t make the pieces fit, so he simply left it out. Physicists soon understood, however, that no quantum mechanical framework could be correct without some consideration of special relativity. Because it didn’t consider special relativity, Schrödinger’s approach ignored the malleability and constant motion of all matter.

*Quantum electrodynamics* was developed
to incorporate special relativity into quantum mechanics. Quantum
electrodynamics is an early example of what came to be known as
a *relativistic quantum field theory*: relativistic
because it includes special relativity; quantum because it takes
into account probability and uncertainty; and field theory because
it merges quantum principles into the classical conception of a
force field (Maxwell’s electromagnetic field).

Quantum electrodynamics has proven extremely successful
in predicting natural phenomena. Toichiro Kinoshita has used quantum
electrodynamics to calculate extremely detailed properties of electrons,
which have been verified to an accuracy of better than one part
in a billion. Following the model of quantum electrodynamics, physicists
have tried to develop analogous frameworks for understanding the
strong (*quantum chromodynamics*), the weak (*quantum
electroweak theory*), and the gravitational forces.

Sheldon Glashow, Abdus Salam, and Steven Weinberg formulated
the quantum electroweak theory to unite the weak and the electromagnetic
forces into a common form at high temperatures. At lower temperatures,
the electromagnetic and weak forces crystallize in a different manner
from their high-temperature form. This process, called *symmetry-breaking*,
will become important as Greene’s descriptions of string theory
become more nuanced.

In the standard model, messenger particles carry the various
bundles of forces (the smallest bundles of the strong force are
called *gluons*; the bundles for the weak force are
called *weak gauge bosons*, known
as W and Z). Photons, gluons, and weak gauge bosons are the microscopic
transmission mechanisms, called *messenger particles*.

Strong, weak, and electromagnetic forces resemble each
other because they are all connected by symmetries: two red quarks,
for example, will interact in exactly the same way as
two green quarks. The universe exhibits *strong
force symmetry*, meaning that the physics is completely unaffected
by such shifts in the quarks’ color charges. The strong force symmetry
is an example of a *gauge symmetry*.

But what about gravity? Once again, gravity is the enforcer of
symmetry in this scenario, ensuring the equal validity of all frames
of reference. Physicists call gravity’s hypothesized messenger particle the *graviton*,
though they have yet to observe it experimentally. But in order to
integrate quantum mechanics into general relativity, physicists must
arrive at a quantum field theory of the gravitational force. The standard
model in its current form does not do this.

Everything in the universe, including the gravitational
field and so-called “empty space,” experiences *quantum fluctuations*.
If the gravitational field is the same thing as the shape of space,
quantum jitters mean that the shape of space fluctuates randomly.
These undulations become more pronounced as the spatial focus narrows. John
Wheeler came up with the term *quantum foam* to
describe the turbulence that ultramicroscopic examination reveals.
The smooth spatial geometry demanded by Einstein’s theory of general
relativity ceases to exist on short-distance scales: the quantum
jitters are just too violent, tearing the very fabric of space with
agitated, irregular movements.

It is the presence of quantum foam that stands in the
way of a theory unifying general relativity with quantum mechanics.
As with most problems of quantum mechanics, these undulations are
not observable in day-to-day experience; the universe appears calm
and predictable. The obstacle only emerges at the *Planck length*,
which is a millionth of a billionth of a billionth of a billionth
of a centimeter (10⁻³³ cm). But however trifling this scale
may seem, quantum foam poses an immense problem. In fact, it creates
the central crisis of modern physics. It is clear that Einstein’s
depiction of space and time as smooth was just an approximation;
the real framework can only emerge at the infinitesimal scale of
the quantum jitters. It is this scale that superstring theory attempts
to explain.
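The Planck length can itself be computed from the three constants at play in this conflict: ħ (quantum mechanics), G (gravity), and c (relativity). A sketch using standard values for the constants:

```python
import math

HBAR = 1.0546e-34  # reduced Planck constant, J*s (standard value)
G = 6.674e-11      # gravitational constant, N*m^2/kg^2 (standard value)
C = 2.9979e8       # speed of light, m/s (standard value)

# Planck length: the scale at which quantum jitters overwhelm the
# smooth spacetime geometry of general relativity.
planck_length_m = math.sqrt(HBAR * G / C ** 3)
planck_length_cm = planck_length_m * 100
print(planck_length_cm)  # on the order of 10^-33 cm, as quoted above
```

That all three constants appear in one formula is a hint of why this scale is exactly where the two theories collide.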
