Digital Communications
Information theory part 2 – Tutorial
1. Show that H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y).
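As a numerical sanity check of this chain rule (not a proof), the identity can be verified on an example joint pmf; the numbers below are an illustrative assumption, not part of the problem:

```python
import numpy as np

# Illustrative joint pmf p(x, y); rows index x, columns index y (assumed values)
P = np.array([[0.25, 0.10],
              [0.15, 0.50]])

def H(v):
    """Entropy in bits of a pmf given as an array (zero entries ignored)."""
    v = v[v > 0]
    return -np.sum(v * np.log2(v))

px = P.sum(axis=1)       # marginal p(x)
py = P.sum(axis=0)       # marginal p(y)
H_XY = H(P.ravel())      # joint entropy H(X,Y)

# Conditional entropies computed directly from the conditional pmfs
H_Y_given_X = sum(px[i] * H(P[i, :] / px[i]) for i in range(len(px)))
H_X_given_Y = sum(py[j] * H(P[:, j] / py[j]) for j in range(len(py)))
```

Both decompositions should agree with the joint entropy for any valid pmf.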
2. Prove that I(X;Y) ≥ 0.
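The quantity to bound can be written in its KL-divergence form, I(X;Y) = Σ p(x,y) log₂[ p(x,y) / (p(x)p(y)) ]; a numerical spot-check on random joint pmfs (again a check, not a proof) is sketched below:

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(P):
    """I(X;Y) in bits from a joint pmf matrix P (rows: x, cols: y)."""
    px = P.sum(axis=1, keepdims=True)
    py = P.sum(axis=0, keepdims=True)
    mask = P > 0
    return float(np.sum(P[mask] * np.log2(P[mask] / (px * py)[mask])))

# Sweep random joint pmfs; every value should be non-negative
samples = []
for _ in range(1000):
    P = rng.random((3, 3))
    P /= P.sum()
    samples.append(mutual_information(P))
min_I = min(samples)

# Independence gives exactly zero: p(x,y) = p(x)p(y)
I_indep = mutual_information(np.outer([0.2, 0.8], [0.5, 0.5]))
```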
3. An additive white Gaussian noise (AWGN) channel has output Y = X + N, where X is
the input and N is the noise with pdf

    f_N(n) = (1 / √(2π σ_n²)) exp(−n² / (2σ_n²)).

If X is a white Gaussian input with E[X] = 0 and E[X²] = σ_X², determine the
conditional entropy H(X|N).
4. Determine the average mutual information I(X;Y) for the Gaussian channel defined in
Problem 3.
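For this channel I(X;Y) = h(Y) − h(Y|X) = h(Y) − h(N); with a Gaussian input, Y is Gaussian with variance σ_X² + σ_n², which collapses to the familiar ½ log₂(1 + σ_X²/σ_n²). A sketch verifying that the two expressions agree (the variances are assumed example values):

```python
import numpy as np

sigma_x2, sigma_n2 = 4.0, 1.0  # assumed example variances for X and N

def h_gauss(var):
    """Differential entropy in bits of a zero-mean Gaussian with variance var."""
    return 0.5 * np.log2(2 * np.pi * np.e * var)

# Y = X + N with X independent of N, so var(Y) = var(X) + var(N)
I_xy = h_gauss(sigma_x2 + sigma_n2) - h_gauss(sigma_n2)
I_closed = 0.5 * np.log2(1 + sigma_x2 / sigma_n2)
```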
5. A channel has two inputs (0, 1) and three outputs (0, e, 1), where "e" indicates an
erasure; that is, there is no output for the corresponding input. The channel matrix is

    P = [ p    1−p    0 ]
        [ 0    1−p    p ]

Compute the channel capacity.
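The capacity can be spot-checked by brute force: sweep the input pmf (a, 1−a) and maximize I(X;Y) over the grid. For the matrix above (erasure probability 1−p), the maximum should come out to C = p at the uniform input; p = 0.8 is an assumed example value:

```python
import numpy as np

p = 0.8  # assumed example value: probability of correct reception
P = np.array([[p, 1 - p, 0.0],
              [0.0, 1 - p, p]])  # channel matrix from the problem

def H(v):
    """Entropy in bits of a pmf given as an array (zero entries ignored)."""
    v = v[v > 0]
    return -np.sum(v * np.log2(v))

def I(a):
    """I(X;Y) in bits for input pmf (a, 1-a)."""
    px = np.array([a, 1 - a])
    py = px @ P                                  # output pmf
    H_Y_given_X = px[0] * H(P[0]) + px[1] * H(P[1])
    return H(py) - H_Y_given_X

# Grid search over the input distribution
C = max(I(a) for a in np.linspace(0.0, 1.0, 10_001))
```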
6. Compute the channel capacity for the symmetric binary channel shown below.

[Figure: binary symmetric channel; x0 → y0 and x1 → y1 each with probability 1−q,
crossover transitions with probability q]
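For a binary symmetric channel with crossover probability q, the capacity is C = 1 − H_b(q), achieved by a uniform input. A sketch verifying this by sweeping the input pmf (a, 1−a); q = 0.1 is an assumed example value:

```python
import numpy as np

q = 0.1  # assumed example crossover probability

def Hb(x):
    """Binary entropy function in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def I(a):
    """I(X;Y) for input pmf (a, 1-a): H(Y) - H(Y|X) = Hb(a(1-q)+(1-a)q) - Hb(q)."""
    return Hb(a * (1 - q) + (1 - a) * q) - Hb(q)

# Grid search over the input distribution; the maximum sits at a = 0.5
C = max(I(a) for a in np.linspace(0.0, 1.0, 10_001))
```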