ETM2046 Analog & Digital Communications

Tutorial 6 (Solutions)
1.

2. (a) H = 4 bits per element

Average capacity = H × number of elements per second
= 4 × 32 × 2 × 10^6 = 2.56 × 10^8 bits/s

(b) H = 10 bits per element

Average capacity = H × number of elements per second
= 10 × 32 × 2 × 10^6 = 6.4 × 10^8 bits/s

Capacity needs to be increased by a factor of 6.4 × 10^8 / 2.56 × 10^8 = 2.5.
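A quick Python check of these figures (a sketch; the element rate of 32 × 2 × 10^6 elements per second and the values of H are taken from the working above):

```python
# Sketch: average capacity = H x element rate, for parts (a) and (b).
element_rate = 32 * 2 * 10**6          # elements per second (from the working above)

for H in (4, 10):                      # bits per element in (a) and (b)
    print(f"H = {H:2d} bits/element -> capacity = {H * element_rate:.3g} bits/s")

# Required increase in capacity from (a) to (b):
print("factor =", (10 * element_rate) / (4 * element_rate))   # 2.5
```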

3. (a) Information content of message i = log2(1/p_i) bits

Information content in each of s0 and s1 = log2(1/0.25) = 2 bits
Information content in each of s2, s3 and s4 = log2(1/0.125) = 3 bits
Information content in each of s5 and s6 = log2(1/0.0625) = 4 bits
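These values can be checked with a short sketch, using the symbol probabilities from the Huffman table in part (b):

```python
from math import log2

# Information content I(s_i) = log2(1/p_i) for each symbol probability.
probs = {"s0": 0.25, "s1": 0.25, "s2": 0.125, "s3": 0.125,
         "s4": 0.125, "s5": 0.0625, "s6": 0.0625}

for sym, p in probs.items():
    print(f"I({sym}) = log2(1/{p}) = {log2(1 / p):.0f} bits")
```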

(b) Huffman coding. Probabilities after each combining step (at each step the two
smallest probabilities are merged and the two branches are labelled 0 and 1):

Symbol  Stage 1  Stage 2  Stage 3  Stage 4  Stage 5  Stage 6
s0      0.25     0.25     0.25     0.25     0.5      0.5
s1      0.25     0.25     0.25     0.25     0.25     0.5
s2      0.125    0.125    0.25     0.25     0.25
s3      0.125    0.125    0.125    0.25
s4      0.125    0.125    0.125
s5      0.0625   0.125
s6      0.0625

Huffman codewords


s0 10
s1 11
s2 001
s3 010
s4 011
s5 0000
s6 0001
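For reference, a small Python sketch of the Huffman construction. Tie-breaking between equal probabilities may differ from the hand working, so the 0/1 patterns can differ, but the codeword lengths (and hence the average length) are the same:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Return {symbol: codeword length} for a binary Huffman code (sketch)."""
    tiebreak = count()                   # unique tag so ties never compare the dicts
    heap = [(p, next(tiebreak), {s: 0}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)  # merge the two least probable groups
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = {"s0": 0.25, "s1": 0.25, "s2": 0.125, "s3": 0.125,
         "s4": 0.125, "s5": 0.0625, "s6": 0.0625}
lengths = huffman_lengths(probs)
print(lengths)                           # s0, s1 -> 2; s2, s3, s4 -> 3; s5, s6 -> 4
print("average length =", sum(p * lengths[s] for s, p in probs.items()))   # 2.625
```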

(c) Shannon-Fano coding. At each step the symbols are split into two groups of
equal probability; the first group is assigned 0 and the second group 1:

{s0 s1 s2 s3 s4 s5 s6} -> {s0 s1} (0) and {s2 s3 s4 s5 s6} (1)
{s0 s1}                -> s0 (0) and s1 (1)
{s2 s3 s4 s5 s6}       -> {s2 s3} (0) and {s4 s5 s6} (1)
{s2 s3}                -> s2 (0) and s3 (1)
{s4 s5 s6}             -> s4 (0) and {s5 s6} (1)
{s5 s6}                -> s5 (0) and s6 (1)
Shannon-Fano codewords
s0 00
s1 01
s2 100
s3 101
s4 110
s5 1110
s6 1111

Average word length = 2.625 bits/symbol, source entropy = 2.625 bits/symbol


Efficiency of the Shannon-Fano code = 2.625/2.625 = 1 (100%)
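A sketch that verifies the average word length, entropy and efficiency from the codewords listed above:

```python
from math import log2

# Average word length, source entropy and efficiency of the Shannon-Fano code.
probs = {"s0": 0.25, "s1": 0.25, "s2": 0.125, "s3": 0.125,
         "s4": 0.125, "s5": 0.0625, "s6": 0.0625}
code = {"s0": "00", "s1": "01", "s2": "100", "s3": "101",
        "s4": "110", "s5": "1110", "s6": "1111"}

avg_len = sum(probs[s] * len(w) for s, w in code.items())
entropy = sum(p * log2(1 / p) for p in probs.values())
print(avg_len, entropy, entropy / avg_len)    # 2.625 2.625 1.0
```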

4. For the original codewords,

For Shannon-Fano coding,

{A B C D} -> A (0) and {B C D} (1)
{B C D}   -> B (0) and {C D} (1)
{C D}     -> C (0) and D (1)

Shannon-Fano codewords
A 0
B 10
C 110
D 111
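A recursive sketch of the Shannon-Fano splitting used above, with the Question 4 probabilities (A = 0.5, B = 0.25, C = D = 0.125, as in the Huffman table below):

```python
def shannon_fano(symbols):
    """Shannon-Fano sketch: `symbols` is a list of (symbol, probability) pairs
    in decreasing probability order; returns {symbol: codeword}."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Choose the split point that makes the two group probabilities most equal.
    best_k, best_diff, running = 1, float("inf"), 0.0
    for k in range(1, len(symbols)):
        running += symbols[k - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_k, best_diff = k, diff
    upper, lower = symbols[:best_k], symbols[best_k:]
    code = {s: "0" + w for s, w in shannon_fano(upper).items()}
    code.update({s: "1" + w for s, w in shannon_fano(lower).items()})
    return code

print(shannon_fano([("A", 0.5), ("B", 0.25), ("C", 0.125), ("D", 0.125)]))
# {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```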


For Huffman coding,

Probabilities after each combining step (branches labelled 0 and 1 at each merge):

Symbol  Stage 1  Stage 2  Stage 3
A       0.5      0.5      0.5
B       0.25     0.25     0.5
C       0.125    0.25
D       0.125

Huffman codewords
A 1
B 01
C 000
D 001

Average word length = 1.75 bits/symbol, source entropy = 1.75 bits/symbol


Efficiency of the Huffman code = 1.75/1.75 = 1 (100%)
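A sketch that checks the average length, entropy and the prefix-free property of this Huffman code:

```python
from math import log2

# Question 4 Huffman code: average length, entropy, efficiency, prefix check.
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
code = {"A": "1", "B": "01", "C": "000", "D": "001"}

avg_len = sum(probs[s] * len(w) for s, w in code.items())
entropy = sum(p * log2(1 / p) for p in probs.values())
prefix_free = not any(w1 != w2 and w2.startswith(w1)
                      for w1 in code.values() for w2 in code.values())
print(avg_len, entropy, entropy / avg_len, prefix_free)   # 1.75 1.75 1.0 True
```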

5. 1 0 1 1 0 1 ⊕ 0 0 1 1 0 0 = 1 0 0 0 0 1
Hamming distance = weight of 1 0 0 0 0 1 = 2
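The same result as a short sketch (XOR the two words and count the 1s):

```python
# Hamming distance = weight of the bitwise XOR of the two words.
a, b = 0b101101, 0b001100
print(bin(a ^ b), bin(a ^ b).count("1"))   # 0b100001 2
```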

6. (a) The codeword v can be determined as follows:


uG = v, where u is the information vector and G is the generator matrix.

= [(0 ⊕ 0 ⊕ 0 ⊕ 0)  (0 ⊕ 0 ⊕ 0 ⊕ 0)  (0 ⊕ 0 ⊕ 1 ⊕ 0)  (0 ⊕ 0 ⊕ 0 ⊕ 1)
   (0 ⊕ 0 ⊕ 1 ⊕ 0)  (0 ⊕ 0 ⊕ 0 ⊕ 1)  (0 ⊕ 0 ⊕ 1 ⊕ 1)]
= 0 0 1 1 1 1 0 → encoded codeword
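A GF(2) encoding sketch. The generator matrix used in the question is not reproduced in this extract, so the systematic Hamming (7, 4) matrix below is an assumption, chosen together with u = [0 0 1 1] so that the product reproduces the codeword 0 0 1 1 1 1 0 above:

```python
import numpy as np

# v = uG with arithmetic reduced modulo 2 (assumed G and u, see note above).
G = np.array([[1, 0, 0, 0, 1, 1, 1],
              [0, 1, 0, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 0, 1],
              [0, 0, 0, 1, 0, 1, 1]])
u = np.array([0, 0, 1, 1])

v = u @ G % 2
print(v)          # [0 0 1 1 1 1 0]
```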

(b) The syndrome can be determined as follows:


s = Hr, where r is the received word (taken as a column vector) and H is the parity-check matrix.


The syndrome is the same as the 3rd column (from the left) of the parity-check matrix, so bit 3 (from the left) is in error. The corrected codeword is 1010010.
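A syndrome-decoding sketch. The parity-check matrix below pairs with the assumed generator matrix in the previous sketch, and the received word r = 1 0 0 0 0 1 0 is likewise assumed (the corrected codeword 1 0 1 0 0 1 0 with bit 3 flipped); the matrices in the question itself may be arranged differently:

```python
import numpy as np

# Syndrome s = Hr; if s matches column j of H, bit j+1 is in error.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
r = np.array([1, 0, 0, 0, 0, 1, 0])   # assumed received word

s = H @ r % 2
print(s)                               # [1 0 1] -> 3rd column of H
error_pos = next(j for j in range(H.shape[1]) if np.array_equal(H[:, j], s))
r[error_pos] ^= 1                      # flip the erroneous bit
print(r)                               # [1 0 1 0 0 1 0] -> corrected codeword
```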

7. For Hamming (7, 4) code


Efficiency = k/n = 4/7 ≈ 0.571; redundancy = 1 − k/n = 3/7 ≈ 0.429
For Hamming (15, 11) code
Efficiency = k/n = 11/15 ≈ 0.733; redundancy = 1 − k/n = 4/15 ≈ 0.267
The Hamming (15, 11) code is more efficient, but since it has lower redundancy it has poorer error-correcting capability and hence a higher error probability.
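A sketch of the comparison, taking efficiency as k/n and redundancy as 1 − k/n (the definitions assumed in the working above):

```python
# Efficiency and redundancy of the two Hamming codes.
for n, k in [(7, 4), (15, 11)]:
    eff = k / n
    print(f"Hamming ({n},{k}): efficiency = {eff:.3f}, redundancy = {1 - eff:.3f}")
# Hamming (7,4):   efficiency = 0.571, redundancy = 0.429
# Hamming (15,11): efficiency = 0.733, redundancy = 0.267
```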
