Question
9 Points [Entropy]
(a) (3 points) Show that, for a source generating N symbols, the entropy is maximum when the symbols are all equiprobable.
(b) (5 points) Suppose we receive the sequence of symbols 4 5 8 6 4 7 8 9 4 8.
i. If we know that the probabilities of receiving such symbols are P4 = P5 = P6 = P7 = P8 = P9 = 1/6, then what is the entropy of the symbols under this probability model?
ii. Now assume that the probability model is given by the frequency of occurrence of the symbols that were actually received. Now what is the entropy? Explain the difference.
iii. Calculate how many bits in total and how many bits per symbol are needed to transmit symbols in the probability model stated above in part (ii) if the actual symbol stream is: 4 4 4. Explain any discrepancy from the entropy.
(c) (1 point) What is the lower limit, in bits per pixel, for the Huffman coder? Explain briefly.
Explanation / Answer
Solution:
Note: Only part (a) is answered below.
a)
For a source with N symbols having probabilities p_1, ..., p_N (with sum p_i = 1), the entropy is

H = -sum_{i=1}^{N} p_i log2(p_i) = sum_{i=1}^{N} p_i log2(1/p_i).

Because log2 is concave, Jensen's inequality gives

H = sum_i p_i log2(1/p_i) <= log2( sum_i p_i * (1/p_i) ) = log2(N),

with equality if and only if all the terms 1/p_i are equal, i.e. p_i = 1/N for every i. Therefore the entropy attains its maximum value, log2(N) bits per symbol, exactly when the symbols are equiprobable.

This matches the Huffman-coding intuition: when all N symbols are equiprobable, the Huffman tree is (nearly) balanced, all symbols sit as leaves at the deepest level, and every codeword has essentially the same length, so no symbol can be favored with a shorter code. That is precisely the situation of maximal average information per symbol.
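The maximization claim in (a) can also be checked numerically. Below is a small sketch (an illustration, not part of the original solution) that computes the Shannon entropy with Python's standard `math` module and verifies that random distributions over N = 6 symbols never exceed the uniform entropy log2(6):

```python
import math
import random

def entropy(probs):
    """Shannon entropy in bits; terms with p == 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 6
uniform = [1.0 / N] * N
h_uniform = entropy(uniform)  # equals log2(6), about 2.585 bits

# Any other distribution over N symbols should have entropy <= log2(N).
random.seed(0)
for _ in range(1000):
    weights = [random.random() for _ in range(N)]
    total = sum(weights)
    p = [w / total for w in weights]
    assert entropy(p) <= h_uniform + 1e-12

print(f"H(uniform over {N} symbols) = {h_uniform:.4f} bits")
```

The loop is a sanity check, not a proof; the proof is the Jensen's-inequality argument above.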