31.
A discrete source emits one of four symbols, with probabilities $$\frac{1}{3},\,\frac{1}{3},\,\frac{1}{4}$$ and $$\frac{1}{12}$$, once every 100 µs. The information rate is:
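A worked sketch, assuming one symbol every 100 µs (a symbol rate of $$10^4$$ symbols/s):

$$H = \frac{2}{3}\log_2 3 + \frac{1}{4}\log_2 4 + \frac{1}{12}\log_2 12 \approx 1.855\ \text{bits/symbol}$$

$$R = 10^4 \times 1.855 \approx 18.55\ \text{kbits/s}$$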

32.
The channel capacity of an ideal AWGN channel with infinite bandwidth is approximated as (where $$S$$ is the average signal power and $$\frac{\eta}{2}$$ is the double-sided power spectral density of the white Gaussian noise):
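With double-sided noise PSD $$\frac{\eta}{2}$$, the noise power in bandwidth $$B$$ is $$N = \eta B$$, and the Shannon–Hartley capacity tends to a finite limit:

$$\lim_{B \to \infty} B\log_2\left(1 + \frac{S}{\eta B}\right) = \frac{S}{\eta}\log_2 e \approx 1.44\,\frac{S}{\eta}$$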

36.
What is the entropy of a communication system that consists of six messages with probabilities $$\frac{1}{8},\,\frac{1}{8},\,\frac{1}{8},\,\frac{1}{8},\,\frac{1}{4}$$ and $$\frac{1}{4}$$ respectively?
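A worked check (note that the six probabilities must sum to one, which fixes the fourth at $$\frac{1}{8}$$):

$$H = 4 \times \frac{1}{8}\log_2 8 + 2 \times \frac{1}{4}\log_2 4 = \frac{3}{2} + 1 = 2.5\ \text{bits/message}$$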

37.
Which one of the following statements is correct?
Shannon's channel capacity formula indicates that, in theory,
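For reference, the formula in question is the Shannon–Hartley capacity of a band-limited AWGN channel,

$$C = B\log_2\left(1 + \frac{S}{N}\right)\ \text{bits/s},$$

and the theorem guarantees that transmission with arbitrarily small error probability is possible at any rate $$R < C$$, but not above it.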

39.
A zero-memory source emits six messages with probabilities 0.3, 0.25, 0.15, 0.12, 0.1 and 0.08. If binary Huffman coding is used, what will be the average code length?
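A minimal Python sketch of the binary Huffman construction for these probabilities (the function name and heap-based bookkeeping are illustrative, not from the original):

```python
import heapq

def huffman_code_lengths(probs):
    """Return the Huffman codeword length for each symbol probability."""
    lengths = [0] * len(probs)
    # Heap entries: (probability, tiebreak, leaf indices under this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, leaves1 = heapq.heappop(heap)   # two least probable nodes
        p2, _, leaves2 = heapq.heappop(heap)
        for i in leaves1 + leaves2:
            lengths[i] += 1                    # these leaves sink one level deeper
        heapq.heappush(heap, (p1 + p2, tiebreak, leaves1 + leaves2))
        tiebreak += 1
    return lengths

probs = [0.3, 0.25, 0.15, 0.12, 0.1, 0.08]
lengths = huffman_code_lengths(probs)              # [2, 2, 3, 3, 3, 3]
print(sum(p * l for p, l in zip(probs, lengths)))  # 2.45
```

This yields codeword lengths (2, 2, 3, 3, 3, 3) and an average length $$\bar{L} = 2.45$$ bits/message, slightly above the source entropy of about 2.42 bits.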

40.
Block codes are ........ codes that enable a limited number of errors to be detected and corrected without retransmission.
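To illustrate the idea, here is a minimal Python sketch of the Hamming(7,4) block code, which detects and corrects any single bit error in a 7-bit block without retransmission (the function names and bit layout are illustrative assumptions):

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4            # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4            # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4            # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(r):
    """Locate a single bit error via the syndrome and flip it in place."""
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]
    pos = s1 + 2 * s2 + 4 * s3   # 0 means no error; else 1-based position
    if pos:
        r[pos - 1] ^= 1
    return r

cw = hamming74_encode([1, 0, 1, 1])
cw[4] ^= 1                                        # inject a single bit error
fixed = hamming74_correct(cw)
print([fixed[2], fixed[4], fixed[5], fixed[6]])   # [1, 0, 1, 1] recovered
```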
