# Complexity Explorer, Santa Fe Institute

## Introduction to Information Theory

• Measuring Information: Bits
• Using Bits to Count and Label
• Physical Forms of Information
• Fundamental Formula of Information
• Computation and Logic: Information Processing
• Mutual Information
• Communication Capacity in a Noisy Channel
• Shannon's Coding Theorem
• Homework Solutions

#### 7.1 Communication Capacity in a Noisy Channel » Quiz

Question A:

The probability that the original result is heads is 1/2. The probability that an error occurs during transmission is 1/8, so the probability that no error occurs is 1 - 1/8 = 7/8.

Thus, the probability that Alice’s coin lands on heads and that this gets transmitted and received correctly by Bob is 1/2 * 7/8 = 7/16.

Question B:

Let X indicate the original result (either heads or tails), and Y the result received by Bob.  We use xh and xt to indicate the “heads” and “tails” outcomes of X, and similarly for yh and yt for Y.

To compute the joint information, we must first enumerate the four possible joint outcomes of XY and their probabilities. In the answer to the last question, we already computed the probability that Alice’s coin landed on heads and was transmitted correctly, so that Bob received heads:
P(xh, yh) = 1/2 * 7/8 = 7/16
The same reasoning holds also for the probability of Alice’s coin landing on tails and Bob receiving tails,
P(xt, yt) = 1/2 * 7/8 = 7/16
To account for the two possibilities in which corruption happens, i.e., Alice’s coin lands on heads but Bob receives tails, or Alice’s coin lands on tails but Bob receives heads, we have to multiply the probability of a coin landing on heads/tails by 1/8 (the probability that an error occurs):
P(xh, yt) = 1/2 * 1/8 = 1/16
P(xt, yh) = 1/2 * 1/8 = 1/16

We now use the Fundamental Formula to compute the joint information:
I(XY) = -P(xh, yh) log2 P(xh, yh) - P(xh, yt) log2 P(xh, yt) - P(xt, yh) log2 P(xt, yh) - P(xt, yt) log2 P(xt, yt)

= -(7/16) log2 (7/16) - (1/16) log2 (1/16) - (1/16) log2 (1/16) - (7/16) log2 (7/16)

≈ 1.54 bits
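As a sanity check, the joint information can be reproduced in a few lines of Python (a sketch, not part of the course materials):

```python
import math

# Joint probabilities of (Alice's coin X, Bob's received result Y)
p = {("h", "h"): 7/16, ("t", "t"): 7/16,
     ("h", "t"): 1/16, ("t", "h"): 1/16}

# Joint information via the Fundamental Formula: I(XY) = -sum p log2 p
I_XY = -sum(q * math.log2(q) for q in p.values())
print(round(I_XY, 2))  # 1.54
```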

Question C:

We compute the conditional information of the original result X, given the result Y received by Bob, using the formula I(X|Y) = I(XY) - I(Y).

In the previous question, we computed the joint information I(XY) ≈ 1.54 bits.

Let us now compute I(Y), the marginal information in Bob's received message. Using the rule of marginalization and the joint probabilities computed in the answer to the last question, we can write
P(yh) = P(xh, yh) + P(xt, yh) = 1/2 * 7/8 + 1/2 * 1/8 = 1/2
P(yt) = P(xh, yt) + P(xt, yt) = 1/2 * 1/8 + 1/2 * 7/8 = 1/2

It is easy to see that since both outcomes are equally likely, I(Y) = 1 bit.

Plugging this into the formula for conditional information gives
I(X|Y) = I(XY) - I(Y) ≈ 1.54 - 1 = 0.54 bits.
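The marginalization and the conditional information can likewise be checked numerically (a sketch, not part of the course materials):

```python
import math

# Joint probabilities of (Alice's coin X, Bob's received result Y)
p = {("h", "h"): 7/16, ("t", "t"): 7/16,
     ("h", "t"): 1/16, ("t", "h"): 1/16}
I_XY = -sum(q * math.log2(q) for q in p.values())

# Marginal P(Y): sum the joint probabilities over Alice's outcomes
p_y = {"h": p[("h", "h")] + p[("t", "h")],
       "t": p[("h", "t")] + p[("t", "t")]}
I_Y = -sum(q * math.log2(q) for q in p_y.values())  # both 1/2, so 1 bit

# Conditional information: I(X|Y) = I(XY) - I(Y)
I_X_given_Y = I_XY - I_Y
print(round(I_X_given_Y, 2))  # 0.54
```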

Question D:

We can find the mutual information between Alice and Bob by using the following formula,
I(X:Y) =  I(X) + I(Y) - I(XY)

We know that Alice tosses a fair coin, so I(X) = 1 bit. From the answers to the previous questions, we also know that I(Y) = 1 bit and I(XY) ≈ 1.54 bits. Combining, we get
I(X:Y) = I(X) + I(Y) - I(XY) ≈ 1 + 1 - 1.54 = 0.46 bits.
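Putting all four questions together, the mutual information can be verified end to end (a sketch, not part of the course materials):

```python
import math

# Joint probabilities of (Alice's coin X, Bob's received result Y)
p = {("h", "h"): 7/16, ("t", "t"): 7/16,
     ("h", "t"): 1/16, ("t", "h"): 1/16}
I_XY = -sum(q * math.log2(q) for q in p.values())  # ~1.54 bits

I_X = 1.0  # Alice's coin is fair
I_Y = 1.0  # both received outcomes are equally likely (Question C)

# Mutual information: I(X:Y) = I(X) + I(Y) - I(XY)
I_mutual = I_X + I_Y - I_XY
print(round(I_mutual, 2))  # 0.46
```

Note that the mutual information (0.46 bits) and the conditional information (0.54 bits) sum to I(X) = 1 bit, as they should: the noise destroys just over half the information in Alice's coin toss.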