How Does Channel Capacity's Relation to Mutual Information Make Sense Intuitively?

Thread Starter

Vikram50517

Joined Jan 4, 2020
81
Hello y'all. As in the title, I am confused about how mutual information intuitively relates to channel capacity. Take an example: let us have three random bits X, Y, Z, where X and Y are independent and unbiased, and Z is the exclusive OR of X and Y. Then H(X,Y)=2, while H(X,Y|Z=z)=1 for any z∈{0,1}, because knowing Z and the value of X determines Y. So I(X,Y;Z)=H(X,Y)−H(X,Y|Z)=1. In words, if you know Z you only really need to encode X rather than both X and Y. So to send complete information about the system (X,Y), if I know Z I only need to encode H(X,Y|Z) bits and that's enough! I save the trouble of encoding 1 bit, i.e. I(X,Y;Z) bits, to represent the entire (X,Y) system. Now how does channel capacity come into the picture?
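In case it helps anyone follow the arithmetic, here is a minimal Python sketch (my own illustration, not from any reference) that brute-forces the joint distribution of (X, Y, Z) with Z = X XOR Y and recomputes H(X,Y), H(X,Y|Z), and I(X,Y;Z):

```python
# Sanity-check the XOR example by enumerating the joint distribution.
from collections import Counter
from itertools import product
from math import log2

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Joint distribution over (x, y, z): X, Y uniform and independent, Z = X XOR Y,
# so each of the four (x, y) pairs has probability 1/4.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

# Marginals over (X, Y) and over Z.
p_xy = Counter()
p_z = Counter()
for (x, y, z), p in joint.items():
    p_xy[(x, y)] += p
    p_z[z] += p

H_xy = H(p_xy)                  # H(X,Y)   = 2 bits
H_xyz = H(joint)                # H(X,Y,Z) = 2 bits, since Z is a function of (X,Y)
H_xy_given_z = H_xyz - H(p_z)   # H(X,Y|Z) = H(X,Y,Z) - H(Z) = 1 bit
I = H_xy - H_xy_given_z         # I(X,Y;Z) = 1 bit

print(H_xy, H_xy_given_z, I)    # -> 2.0 1.0 1.0
```

This matches the numbers above: knowing Z removes exactly 1 bit of uncertainty about the pair (X,Y).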
 

drjohsmith

Joined Dec 13, 2021
852
This sounds very familiar.
Have you posted here before, maybe under another name?
 