Digital Communications

Information theory part 2 – Tutorial

1. Show that H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y).
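The chain rule in Problem 1 can be spot-checked numerically. This is a minimal sketch in Python/NumPy; the 2x2 joint pmf is an arbitrary illustration, not part of the problem:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector (0*log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Arbitrary 2x2 joint pmf p(x, y) chosen for illustration (rows: x, cols: y).
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])
p_x = p_xy.sum(axis=1)   # marginal of X
p_y = p_xy.sum(axis=0)   # marginal of Y

# Conditional entropies computed directly from their definitions,
# H(Y|X) = sum_x p(x) H(Y | X = x), and likewise for H(X|Y).
H_y_given_x = sum(p_x[i] * H(p_xy[i, :] / p_x[i]) for i in range(2))
H_x_given_y = sum(p_y[j] * H(p_xy[:, j] / p_y[j]) for j in range(2))

H_joint = H(p_xy.ravel())
print(H_joint, H(p_x) + H_y_given_x, H(p_y) + H_x_given_y)  # all three agree
```

A numeric check is not a proof, but it makes the two decompositions of the joint entropy concrete before working the algebra.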

2. Prove that I(X;Y) ≥ 0.
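Before proving non-negativity, it can help to see it hold empirically. A sketch in Python/NumPy; the joint distributions are randomly generated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint pmf matrix (rows: x, cols: y)."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask]))

# I(X;Y) >= 0 for every joint distribution tried.
vals = []
for _ in range(1000):
    p = rng.random((3, 4))
    p /= p.sum()
    vals.append(mutual_information(p))
print(min(vals))  # never negative

# Equality holds iff X and Y are independent: a product pmf gives I = 0.
p_indep = np.outer([0.3, 0.7], [0.5, 0.2, 0.3])
print(mutual_information(p_indep))
```

The proof itself follows from Jensen's inequality applied to the relative entropy D(p(x,y) || p(x)p(y)), which is exactly the quantity computed above.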

3. An additive white Gaussian noise (AWGN) channel has output Y = X + N, where X is the input and N is the noise with pdf

   f_N(n) = (1 / √(2π σ_n²)) exp(−n² / (2σ_n²))

If X is a white Gaussian input with E[X] = 0 and E[X²] = σ_X², determine the conditional entropy H(X|N).
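Since the input and the noise are independent, conditioning on N does not reduce the entropy of X, so H(X|N) = H(X), the differential entropy of a Gaussian. A sketch in Python/NumPy; the value σ_X = 2.0 is an illustrative assumption, as the problem gives no numbers:

```python
import numpy as np

sigma_x = 2.0  # illustrative value for sigma_X (assumed, not from the problem)

# X and N independent => H(X|N) = H(X); for Gaussian X the differential
# entropy is h(X) = 0.5 * log2(2*pi*e*sigma_X^2) bits.
h_x = 0.5 * np.log2(2 * np.pi * np.e * sigma_x**2)

# Monte Carlo sanity check: h(X) = E[-log2 f_X(X)].
samples = np.random.default_rng(1).normal(0.0, sigma_x, 200_000)
pdf = np.exp(-samples**2 / (2 * sigma_x**2)) / np.sqrt(2 * np.pi * sigma_x**2)
h_mc = -np.mean(np.log2(pdf))
print(h_x, h_mc)  # the closed form and the Monte Carlo estimate agree closely
```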

4. Determine the average mutual information I(X;Y) for the Gaussian channel defined in Problem 3.
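For the AWGN channel above, I(X;Y) = h(Y) − h(Y|X) = h(Y) − h(N), and the sum of independent Gaussians is Gaussian, which gives the familiar ½ log2(1 + σ_X²/σ_n²). A sketch in Python/NumPy; the variances are illustrative assumptions:

```python
import numpy as np

sigma_x, sigma_n = 2.0, 1.0  # illustrative values (assumed, not from the problem)

def h_gauss(sigma):
    """Differential entropy in bits of a zero-mean Gaussian with std sigma."""
    return 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

# Y = X + N with independent Gaussians => Y ~ N(0, sigma_x^2 + sigma_n^2),
# so I(X;Y) = h(Y) - h(N) = 0.5 * log2(1 + sigma_x^2 / sigma_n^2).
I_xy = h_gauss(np.hypot(sigma_x, sigma_n)) - h_gauss(sigma_n)
print(I_xy, 0.5 * np.log2(1 + sigma_x**2 / sigma_n**2))  # identical
```

Note how the 2πe factors cancel in the difference, leaving only the variance ratio.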

5. A channel has two inputs (0, 1) and three outputs (0, e, 1), where "e" indicates an erasure; that is, there is no output for the corresponding input. The channel matrix is

   [ p    1−p   0  ]
   [ 0    1−p   p  ]

Compute the channel capacity.
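One way to check an analytic answer is a brute-force search over input distributions. A sketch in Python/NumPy; the value p = 0.8 is an arbitrary illustration (the erasure probability is 1 − p):

```python
import numpy as np

def mutual_information(p_x, P):
    """I(X;Y) in bits for input pmf p_x and channel matrix P (rows: p(y|x))."""
    p_xy = p_x[:, None] * P
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]))

p = 0.8  # illustrative value (assumed); erasure probability is 1 - p
P = np.array([[p, 1 - p, 0.0],
              [0.0, 1 - p, p]])

# Maximize I(X;Y) over input distributions (alpha, 1 - alpha) on a grid.
alphas = np.linspace(0.0, 1.0, 10_001)
C = max(mutual_information(np.array([a, 1 - a]), P) for a in alphas)
print(C)  # capacity of this erasure channel is 1 - (1 - p) = p
```

The maximum occurs at the uniform input, consistent with the channel's symmetry.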

6. Compute the channel capacity for the symmetric binary transmission shown below, where each input is received correctly with probability 1 − q and flipped with probability q:

   x0 → y0 with probability 1 − q,   x0 → y1 with probability q
   x1 → y1 with probability 1 − q,   x1 → y0 with probability q
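For a binary symmetric channel the uniform input achieves capacity, giving C = 1 − H2(q) where H2 is the binary entropy function. A sketch in Python/NumPy; q = 0.1 is an illustrative value:

```python
import numpy as np

def H2(q):
    """Binary entropy function in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

q = 0.1  # illustrative crossover probability (assumed, not from the problem)

# Symmetric channel => the uniform input is optimal, and
# C = 1 - H2(q) bits per channel use.
C = 1 - H2(q)
print(C)  # ~0.531 bits per use for q = 0.1
```

As a sanity check, C = 1 when q = 0 (noiseless) and C = 0 when q = 0.5 (output independent of input).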
