My name is Seth Lloyd, I'm a professor of
quantum mechanical engineering at MIT,
and I am an adjunct faculty member of the
Santa Fe Institute.
And I'm here to tell you about information
and Information Theory.
So it's no secret that the world is awash
with information; we're in the middle of an
information processing revolution based on ultrafast
computers, high-bandwidth communications,
smartphones, and all these things.
But what is information?
How do we describe it? And how can we deal
with it in a mathematical and scientific manner?
There's a famous quote from Gregory Bateson,
who said that information is "a difference
that makes a difference."
What does this mean? Actually, I have no idea
what this means.
The difference that makes a difference.
This is too hard for me.
But let's start out with the idea of difference:
that somehow information, which just comes
from the Latin word informare, to inform, to change
the form of, is related to what makes us ourselves.
What makes me me? Like the genetic information
in my DNA, or the color of my shirt or my lack of hair.
This is a form that I possess. So what is
information?
A good way to think about this is how do we
measure information?
So can we measure information? What is the
unit of information?
A nice way to think about this is in terms
of energy.
So energy is an ancient concept.
It comes from Aristotle.
Energeia.
Energeia is internal work: ergon is the
Greek word for work, and energeia is work
that is somehow inside a thing.
Years ago, when I was a graduate student,
I attended the first Santa Fe Institute Summer School
in 1988, and I used to hang out at El Farol,
a local watering hole.
I started talking with the person next to me and
he was a person whose job was to go to people's
houses and to cleanse their crystals of
harmful vibrations.
And he told me that the vibrations in crystals
have tremendous amounts of energy in them
and this energy is by nature good and positive,
but sometimes the crystals absorb negative
energy from people who visit, and so he performs
ceremonies involving dolphin energies and angelic
beings to remove the harmful vibrations from
the crystals.
So I said to him, "Hey look, you know, I'm
actually a physicist, and it is true that there
is energy in the vibrations of crystals,
but that energy is very, very tiny.
The amount of energy is much smaller than the
amount of energy in this piece of cheese."
Then he told me how much money he made by
invoking dolphin energies and angelic beings
and then I had to shut up because he was making
much more money than I was making as a postdoc
at Los Alamos.
But one of the most important things about
energy is we can measure it.
The big advances in the study of energy came
in the 19th century, when people realized that
heat was a form of energy: heat can be transformed
into work, and you could measure the amount of
energy in heat and the amount of energy in work,
independently of whether it was angelic or
dolphin or crystal in its origin.
And the same is true of information.
So we think of information as being very important.
It's stuff that we get off of our iPhones,
it's stuff that you tell to me, that I tell
to you; it makes a difference to us. But
if we can actually measure information,
then we will have gone a long way.
And in fact, and this is well known, there is
a unit of information.
The bit.
The bit is the unit of information. It stands for
"binary digit," a coinage by John Tukey at Bell Labs
that was taken up by Claude Shannon, the
founder of modern information theory.
And the idea is that a bit is a measure of
difference.
A bit measures the distinction between two
possibilities, which are famously called 0 and 1.
But it could be any two possibilities: true
and false, yes and no, hot and cold (if you
say that some things are very hot and some
things are very cold, that shows that information
is often approximate). And the way that this
information processing revolution that we're
part of works is that computers, which are
devices that process information, break
information down into its tiniest pieces,
bits, and then flip those bits.
So, for instance, 0 goes to 1 and 1 goes to
0: that's a bit flip.
Or in the case of logic, it's called a NOT.
So not true is false, not false is true.
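As a small illustration (my own sketch, not from the lecture), a bit flip is exactly the logical NOT, and in code it can be written as XOR with 1:

```python
def bit_flip(b: int) -> int:
    """Flip a single bit: 0 -> 1, 1 -> 0 (XOR with 1)."""
    return b ^ 1

# The same operation on truth values is logical NOT:
# not True is False, not False is True.
print(bit_flip(0))   # 1
print(bit_flip(1))   # 0
print(not True)      # False
print(not False)     # True
```

All of digital logic can be built out of operations like this one applied to bits.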
So if we have this binary notion of information,
we can talk both about the fundamental unit of
information, the bit, and we can also talk
about information processing.
And at bottom, this whole information processing
revolution that we're participating in comes from
breaking information down into its tiniest
pieces, bits, and then flipping those bits in
a systematic fashion.
Indeed, if you look inside our brains, which
are processing information, our neurons are
cells that take information from many other
neurons and then decide to go from a quiescent
state, which we call 0, to an active state
that we call 1.
And then this bit of information, this 1,
this active state, gets processed and
propagated electrochemically to other neurons,
which then fire on their own, and all that's going
on inside our heads is just this flipping and
processing and transmitting of bits.
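The neuron picture above can be caricatured in a few lines of code. This is a toy threshold-unit sketch in the spirit of McCulloch and Pitts (my own illustration, not from the lecture; the weights and threshold are made up):

```python
def neuron_fires(inputs, weights, threshold):
    """Toy neuron: weighted sum of incoming bits decides whether the
    cell stays quiescent (0) or becomes active (1)."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Three incoming neurons; the cell fires only if enough of them are active.
print(neuron_fires([1, 1, 0], [0.5, 0.5, 0.5], 1.0))  # 1 (fires)
print(neuron_fires([1, 0, 0], [0.5, 0.5, 0.5], 1.0))  # 0 (quiescent)
```

Real neurons are of course far more complicated, but the input-bits-in, one-bit-out caricature is the point being made here.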
And that's all that's going on inside your
PC, your Mac, your iPod, your iPhone, or
whatever information processing device
you're using.
And from this central insight, that there
is a unit of information, that we can measure
information, we get all of information theory
and all of computation.
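To make "measuring information" concrete, here is a minimal sketch (my own example, not from the lecture): the number of bits needed to distinguish among N equally likely possibilities is log2(N), which is why one bit is exactly the measure of a two-way distinction like 0/1 or true/false.

```python
import math

def bits_to_distinguish(n: int) -> float:
    """Bits needed to label n equally likely alternatives: log2(n)."""
    return math.log2(n)

print(bits_to_distinguish(2))   # 1.0 -> one bit: 0 or 1
print(bits_to_distinguish(8))   # 3.0 -> three bits: 000 through 111
```

This logarithmic counting of distinctions is the starting point of Shannon's information theory.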
So, let me describe how this works in a little
more detail.