Drop a glass and it will smash to the floor. Push a wagon and it will roll along. Walk to a
wall and you can’t walk through it. There are very basic laws of physics going
on all around us that we instinctively grasp: gravity makes things fall to the
ground, pushing something makes it move, two things can’t occupy the same place
at the same time.
At the turn of the twentieth century, scientists thought that basic rules like these should apply to everything in nature. But then they began to study the world of the ultra-small. Atoms, electrons, light waves: none of these things followed the normal rules. As physicists like Niels Bohr and Albert Einstein began to study these particles, they discovered new laws of physics that were downright quirky. These were the laws of quantum mechanics, and they got their
name from the work of Max Planck.
In 1900, Max Planck was a physicist in Berlin studying a puzzle called the “ultraviolet catastrophe.” The problem was that the accepted laws of physics predicted that a heated box which absorbs all the light falling on it (an idealized object known as a “black body”) should pour out an infinite amount of ultraviolet radiation. In real life no such thing happened: the box radiated ordinary colors (red, blue, white), just as heated metal does, and nothing about it was infinite. It didn’t make sense. These were laws of physics that perfectly described how light behaved outside of the box, so why didn’t they accurately describe this black-body scenario?
Planck tried a mathematical trick. He assumed that light wasn’t really a continuous wave, as everyone supposed, but instead could exist only in specific amounts, or “quanta,” of energy. Planck didn’t really believe this was true of light; in fact, he later referred to this mathematical gimmick as “an act of desperation.” But with this adjustment the equations worked, accurately describing the box’s radiation.
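In modern notation, Planck’s assumption is usually summarized in one short formula: light of frequency ν carries energy only in lumps of size

\[
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \text{joule-seconds},
\]

where h is the new constant Planck introduced, now called Planck’s constant. The lumps are so tiny that in everyday life light still looks perfectly smooth and continuous.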
It took a while for everyone to agree on what this meant, but eventually Albert Einstein interpreted Planck’s equations to mean that light can be thought of as discrete particles, just like electrons or protons. In 1926, the Berkeley chemist Gilbert Lewis gave them the name photons.
This idea that particles could carry energy only in lumps of certain sizes moved into other areas of physics as well. Over the next decade, Niels Bohr pulled it into his description of how an atom worked. He said that electrons traveling around a nucleus couldn’t have arbitrarily small or arbitrarily large amounts of energy; they could only occupy certain discrete, allowed energy levels, with nothing in between.
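For the simplest atom, hydrogen, Bohr’s 1913 model makes this ladder of allowed energies concrete. In its standard form, the electron’s energy on the n-th rung is

\[
E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots,
\]

so the electron can sit at level n = 1 or n = 2 but never anywhere in between, and it jumps between rungs by absorbing or emitting a photon carrying exactly the energy difference.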
Eventually scientists realized this explained why some materials are conductors of electricity and some aren’t: the allowed energy levels of a material’s electrons determine how freely those electrons can move, so atoms with different electron energy levels conduct electricity differently. This understanding was crucial to building a transistor, since the semiconductor crystal at its core is made by mixing materials with varying conductivities.
Here’s one of the quirky things about quantum mechanics: just because an electron or a photon can be thought of as a particle doesn’t mean it can’t still be thought of as a wave as well. In fact, in a lot of experiments light acts much more like a wave than like a particle.
This wave nature produces some interesting effects. For example, if an electron traveling around a nucleus behaves like a wave, then its position at any one time becomes fuzzy. Instead of sitting at a single definite point, the electron is smeared out in space. This smearing means that electrons don’t always travel quite the way one would expect. Unlike water flowing in one direction through a hose, electrons traveling along as electrical current can sometimes follow weird paths, especially if they’re moving near the surface of a material. Moreover, electrons acting like waves can sometimes burrow right through a barrier they shouldn’t be able to cross, an effect known as quantum tunneling. Understanding this odd behavior of electrons was necessary as scientists tried to control how current flowed through the first transistors.
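In the standard textbook approximation, this tunneling can even be made quantitative: for a simple rectangular barrier of width d and height V₀ facing an electron of mass m and energy E below V₀, the chance of getting through falls off roughly exponentially,

\[
T \approx e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar},
\]

which is why tunneling only matters across the extremely thin layers found inside devices like transistors.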
Scientists interpret quantum mechanics to mean that a tiny piece of
material like a photon or electron is both a particle and a wave. It can be
either, depending on how one looks at it or what kind of an experiment one is
doing. In fact, it might be more accurate to say that photons and electrons are neither a particle nor a wave; they’re undefined up until the very moment someone looks at them or performs an experiment, thus forcing them to be either a particle or a wave.
This comes with other side effects: namely, that a number of a particle’s qualities aren’t well-defined. For example, Werner Heisenberg formulated what is called the uncertainty principle. It states that if a researcher wants to measure both the speed and the position of a particle, he can’t do both very accurately. If he measures the speed carefully, then he can’t measure the position nearly as well. This isn’t just a matter of lacking good enough measurement tools; it’s more fundamental than that. If the speed is well-established, then there simply does not exist a well-established position (the electron is smeared out like a wave), and vice versa.
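Written the way physicists usually state it, in terms of momentum p (mass times velocity) rather than speed, the principle is the inequality

\[
\Delta x \, \Delta p \geq \frac{\hbar}{2},
\]

where Δx and Δp are the uncertainties in position and momentum, and ℏ is Planck’s constant divided by 2π. Squeezing one uncertainty toward zero forces the other to grow without bound.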
Albert Einstein disliked this idea. When confronted with the notion that the laws of physics left room for such vagueness, he announced: “God does not play dice with the universe.” Nevertheless, most physicists today accept the laws of quantum mechanics as an accurate description of the subatomic world. And it was certainly a thorough understanding of these new laws that helped Bardeen, Brattain, and Shockley invent the transistor.