There's another sense in which this slogan
is true
which is that all physical systems contain
information.
I gave specific examples of this, switch
open or closed contains information,
the presence or the absence of a photon -
a particle of light - contains information.
Photon polarized like this or photon polarized
like that contains information.
But this discovery, that all physical systems
contain information,
is actually a very old discovery.
If I have a physical system that has n bits,
then that implies it has two to the n possible
states. I'll represent this number of states
as capital N,
because capital N is bigger than little n:
the system has N equals two to the n possible states.
Conversely, if I have a system that has N
states, then it can be used to represent
log to the base two of N bits.
The logarithm is basically the total number
of bits we need to represent this number.
So, for instance, if N is equal to eight, then
log to the base two of eight is equal to three.
It's the number of bits I need to write out
eight,
because eight in binary is 1 0 0 0: a one
followed by three zeros. It's just the number
of zeros after the leading one.
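As a sketch of this bits-and-states relation, here are a few lines of Python; the function names are mine, just for illustration:

```python
import math

# n bits give 2**n distinguishable states; conversely, a system
# with N states registers log2(N) bits of information.
def states_from_bits(n: int) -> int:
    return 2 ** n

def bits_from_states(N: int) -> float:
    return math.log2(N)

print(states_from_bits(3))  # 3 bits give 8 states
print(bits_from_states(8))  # 8 states register 3.0 bits
print(bin(8))               # '0b1000': a one followed by three zeros
```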
So when we have a system that has not just
two states or four states or eight states or
sixteen states or a thousand and twenty-four
states,
it still represents information.
If I have a system that has three states,
for example, yes, no and maybe, then it
represents log to the base two of three,
which is equal to one plus change bits.
It's in between log to the base two of two,
which is equal to one bit,
and log to the base two of four, which is
equal to two bits.
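To make that in-between bit count concrete, here's a quick check in Python, just a sketch of the arithmetic in the lecture:

```python
import math

# A three-state system (yes / no / maybe) registers log2(3) bits:
# more than one bit (two states), less than two bits (four states).
bits = math.log2(3)
print(bits)  # roughly 1.58 bits: one plus change
print(math.log2(2) < bits < math.log2(4))  # True
```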
So something that has three different states
is not binary, it's ternary,
and it has a number of bits in between one
and two.
This observation, that any physical system
with some number of distinguishable
states can register a corresponding amount of
information,
was actually realized back in the 19th century.
Back in the 19th century, physicists such
as Boltzmann in Vienna, Maxwell in Cambridge,
and Gibbs at Yale realized that there was
this funny quantity called entropy.
Now what is entropy?
Entropy is something that messes up your
ability to do work.
It's some measure of disorder.
Or of randomness.
The way that people came up with the concept
of entropy in the middle of the 19th century
was that they were looking at how energy could
be converted to other forms of energy.
These scientists and others were interested in
how heat could be transformed into work.
There's a famous result called the Second Law
of Thermodynamics.
The Second Law of Thermodynamics basically
says that entropy increases, or entropy tends
to increase, let's say.
But what is entropy?
So back in the mid-19th century, these scientists
realized that heat was actually kinetic
energy, the microscopic energy of
moving molecules.
Molecules bouncing off of each other have
a certain kind of kinetic energy,
the energy of motion, and we could
identify the energy in heat with this
energy of motion of all the individual molecules
in a gas, like the gas in this room.
Now, that's a certain amount of energy, they
were able to measure this amount of energy,
and they also were able to know that you could
turn this energy into work, using, for instance,
the steam engine.
You've got a piston in a cylinder, with all
these steam molecules bouncing back
and forth; as the piston moves out, the
molecules gradually slow down,
their energy gets smaller, and that energy
goes into work in the steam engine.
But not all of this energy could be converted
into work.
Why is that?
It's because these molecules bouncing off
of each other,
they have some degree of disorder, and this
degree of disorder could not be decreased.
So because they keep this degree of disorder,
even when the piston goes out,
they still have to be moving around,
and so you can't extract all of the kinetic
energy from these molecules.
Now because these guys, they were guys,
by the way, in the course of these little
lectures, I'm going to use the word "guys"
to refer not only to people, men, but also
to elementary particles and to women as well.
So these guys set out to come up with
a formula for entropy,
to describe this quantity that never decreased.
They said, well let's let N equal the number
of possible states or configurations
or as Boltzmann put it, "complexions", of
these molecules in the gas.
They defined a quantity S, which is equal
to the logarithm of N. You may note that this
is the amount of information that's contained
in these molecules in the gas,
obtained just by counting the number of possible
states, configurations or complexions:
this quantity, the log of N, is
the number of bits that's required to label
each of the possible different complexions,
or states, of the molecules in the gas.
They made S proportional to this by
introducing a constant k, which is now
called Boltzmann's constant.
If they defined this S to be k log N, then
this was actually this quantity entropy,
this thermodynamic quantity that gunged up
the works of heat engines and prevented you
from getting all the energy out of the molecules.
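As a sketch of Boltzmann's formula, here's the arithmetic in Python. I'm using the modern SI value of Boltzmann's constant and the natural logarithm, which is the convention on the grave; choosing base two instead would just rescale k:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin (exact SI value)

# Boltzmann's entropy S = k log N, where N counts the possible
# states ("complexions") of the molecules in the gas.
def entropy(N: float) -> float:
    return k_B * math.log(N)

# Doubling the number of accessible states adds k*ln(2) of entropy:
# one extra bit's worth, in thermodynamic units.
delta = entropy(2e6) - entropy(1e6)
print(delta / (k_B * math.log(2)))  # approximately 1: one extra bit
```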
That formula is so famous, in fact, that it
actually appears on Boltzmann's grave.
He died in 1906 by his own hand; he was a
depressive person, and it was after a visit
to the US, though I'm not sure if there's
any correlation there.
But back in those days if you had a famous
formula, people popped it on your grave.
Actually there is an interesting story, which
was that this constant k that's on Boltzmann's
grave was actually introduced by the famous
German physicist Max Planck,
and it was originally called Planck's constant.
But Planck also had another constant named
after him, h,
which we'll see again, because it governs
the quantum mechanical behavior of things.
So after a while people just said, hey let's
just call it Boltzmann's constant.
What Boltzmann would feel if he saw some other
man's constant on his grave, I don't know;
maybe he'd be turning over in his grave.