HW #2
Scenario Setup
Message (X): A binary message where ‘0’ and ‘1’ are equally likely, so p(X=0) = 0.5 and p(X=1) = 0.5.
Noisy Channel:
When ‘0’ is sent, there’s an 85% chance the receiver gets ‘0’ and a 15% chance the receiver gets ‘1’.
When ‘1’ is sent, there’s a 95% chance the receiver gets ‘1’ and a 5% chance the receiver gets ‘0’.
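As a reference point, here is a minimal Python sketch (not part of the assignment; all variable names are illustrative) that encodes this setup as a joint distribution via the chain rule p(x, y) = p(x) * p(y | x):

```python
# Illustrative names only; nothing here is prescribed by the assignment.
p_x = {0: 0.5, 1: 0.5}          # source prior: p(X=0), p(X=1)
p_y_given_x = {                 # channel transition probabilities p(Y=y | X=x)
    0: {0: 0.85, 1: 0.15},      # '0' sent: received as '0', or flipped to '1'
    1: {0: 0.05, 1: 0.95},      # '1' sent: flipped to '0', or received as '1'
}

# Chain rule: p(x, y) = p(x) * p(y | x)
p_xy = {(x, y): p_x[x] * p_y_given_x[x][y] for x in p_x for y in (0, 1)}
print(p_xy)  # {(0, 0): 0.425, (0, 1): 0.075, (1, 0): 0.025, (1, 1): 0.475}
```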
1. Calculate the Entropy of the Source (H(X)).
2. Calculate the Entropy of the Receiver (H(Y)).
3. Calculate the Joint Entropy (H(X,Y)).
4. Calculate the Mutual Information (I(X;Y)). (A numerical check for items 1-4 appears after this list.)
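Building on the joint distribution above, a self-contained sketch of how one might verify items 1-4 numerically; the commented values are approximate expected outputs, not a substitute for the hand calculation:

```python
from math import log2

# Joint distribution from the setup: p(x, y) = p(x) * p(y | x)
p_xy = {(0, 0): 0.425, (0, 1): 0.075, (1, 0): 0.025, (1, 1): 0.475}

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginals: p(x) = sum over y of p(x, y), and p(y) = sum over x of p(x, y)
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy)
I_XY = H_X + H_Y - H_XY   # identity: I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X)   = {H_X:.4f} bits")   # 1.0000 (uniform source)
print(f"H(Y)   = {H_Y:.4f} bits")   # ~0.9928
print(f"H(X,Y) = {H_XY:.4f} bits")  # ~1.4481
print(f"I(X;Y) = {I_XY:.4f} bits")  # ~0.5447
```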
Huffman Codes
5. Calculate the entropy of the phrase “bobby the peewee beekeeper”.
6. Compute a Huffman encoding of the phrase. Show the entire tree.
7. Compute the average number of bits per symbol for this encoding and compare it to the phrase entropy (see the sketch after this list).
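For checking items 5-7, a minimal Huffman sketch using Python's heapq. Tie-breaking between equal weights is arbitrary, so the tree (and individual codewords) may differ from a hand-built one, but every Huffman tree yields the same optimal average code length:

```python
import heapq
from collections import Counter
from math import log2

phrase = "bobby the peewee beekeeper"
freq = Counter(phrase)   # symbol frequencies, spaces included
n = len(phrase)          # 26 symbols total

# 5. Empirical entropy of the phrase, in bits per symbol
H = -sum((c / n) * log2(c / n) for c in freq.values())

# 6. Huffman tree: repeatedly merge the two lightest nodes. Heap entries are
# (weight, tiebreak, node); a node is either a symbol or a (left, right) pair.
heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
heapq.heapify(heap)
i = len(heap)
while len(heap) > 1:
    w1, _, left = heapq.heappop(heap)
    w2, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (w1 + w2, i, (left, right)))
    i += 1
tree = heap[0][2]

# Assign codewords by walking the tree: left edge = '0', right edge = '1'.
def codes(node, prefix=""):
    if isinstance(node, tuple):                  # internal node
        yield from codes(node[0], prefix + "0")
        yield from codes(node[1], prefix + "1")
    else:                                        # leaf holds one symbol
        yield node, prefix

codebook = dict(codes(tree))
print(codebook)

# 7. Average bits per symbol under this code, compared with the entropy.
avg = sum(freq[s] * len(code) for s, code in codebook.items()) / n
print(f"entropy      = {H:.4f} bits/symbol")    # ~2.855
print(f"avg codeword = {avg:.4f} bits/symbol")
```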