What is the entropy of a communication system that consists of six messages with probabilities $$\frac{1}{8},\,\frac{1}{8},\,\frac{1}{8},\,\frac{1}{8},\,\frac{1}{4}$$ and $$\frac{1}{4}$$ respectively?
A. 1 bit/message
B. 2.5 bits/message
C. 3 bits/message
D. 4.5 bits/message
Answer: Option B
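The answer follows from Shannon's entropy formula $$H = -\sum_i p_i \log_2 p_i$$ bits/message. A minimal Python check, assuming the six probabilities are 1/8, 1/8, 1/8, 1/8, 1/4 and 1/4 (which sum to 1):

```python
from math import log2

# Message probabilities: four messages with p = 1/8, two with p = 1/4
probs = [1/8, 1/8, 1/8, 1/8, 1/4, 1/4]
assert abs(sum(probs) - 1.0) < 1e-12  # a valid distribution must sum to 1

# Shannon entropy: H = -sum(p_i * log2(p_i)), in bits/message
H = -sum(p * log2(p) for p in probs)
print(H)  # 2.5
```

Each 1/8 message contributes (1/8)·log2(8) = 3/8 bit and each 1/4 message contributes (1/4)·log2(4) = 1/2 bit, so H = 4·(3/8) + 2·(1/2) = 2.5 bits/message, matching Option B.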
Related Questions on Information Theory and Coding
Which decoding method involves the evaluation by means of the Fano Algorithm?
A. Maximum Likelihood Decoding
B. Sequential Decoding
C. Both A and B
D. None of the above
Answer: Option B
