Consider a source with four symbols. The entropy of the source will be maximum when the probabilities of occurrence of the symbols are:
A. $$\left\{ {1,\,0,\,0,\,0} \right\}$$
B. $$\left\{ {\frac{1}{4},\,\frac{1}{4},\,\frac{1}{4},\,\frac{1}{4}} \right\}$$
C. $$\left\{ {\frac{1}{2},\,\frac{1}{4},\,\frac{1}{8},\,\frac{1}{8}} \right\}$$
D. $$\left\{ {\frac{1}{3},\,\frac{1}{3},\,\frac{1}{3},\,0} \right\}$$
Answer: Option B
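Explanation: the entropy $$H = -\sum\limits_i {p_i \,{{\log }_2}\,{p_i}}$$ of a discrete source is maximum when all symbols are equally likely, giving $$H_{\max} = {\log _2}\,4 = 2$$ bits/symbol for option B. The short Python sketch below (an illustrative check, not part of the original question) evaluates the entropy of each option numerically.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

options = {
    "A": [1, 0, 0, 0],
    "B": [1/4, 1/4, 1/4, 1/4],
    "C": [1/2, 1/4, 1/8, 1/8],
    "D": [1/3, 1/3, 1/3, 0],
}

for label, probs in options.items():
    print(label, round(entropy(probs), 3), "bits")
# A: 0.0, B: 2.0, C: 1.75, D: ~1.585  ->  B (equiprobable) is maximal
```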
Related Questions on Information Theory and Coding
Which decoding method involves evaluation by means of the Fano algorithm?
A. Maximum Likelihood Decoding
B. Sequential Decoding
C. Both A and B
D. None of the above
