31. A discrete source emits one of four symbols every 100 µs, with probabilities $$\frac{1}{3},\,\frac{1}{3},\,\frac{1}{4}$$ and $$\frac{1}{12}$$. The information rate is:
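A worked sketch, assuming independent symbols so that the information rate is $$R = rH$$ with symbol rate $$r = 1/(100\ \mu\text{s}) = 10^4$$ symbols/s:

$$H = \frac{1}{3}\log_2 3 + \frac{1}{3}\log_2 3 + \frac{1}{4}\log_2 4 + \frac{1}{12}\log_2 12 \approx 1.855\ \text{bits/symbol}$$

$$R = 10^4 \times 1.855 \approx 18.55\ \text{kbits/s}$$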
32. The channel capacity of an ideal AWGN channel with infinite bandwidth is approximated as (where $$S$$ is the average signal power and $$\frac{\eta}{2}$$ is the two-sided power spectral density of the white Gaussian noise):
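A derivation sketch from the Shannon-Hartley formula, where the noise power in bandwidth $$B$$ is $$N = \eta B$$ (two-sided PSD $$\frac{\eta}{2}$$ integrated over $$2B$$):

$$C_\infty = \lim_{B \to \infty} B\log_2\!\left(1 + \frac{S}{\eta B}\right) = \frac{S}{\eta}\log_2 e \approx 1.44\,\frac{S}{\eta}$$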
33. If the channel is band-limited to 6 kHz and the signal-to-noise ratio is 16, what would be the capacity of the channel?
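A worked sketch, assuming the SNR of 16 is a linear power ratio (not dB):

$$C = B\log_2\!\left(1 + \frac{S}{N}\right) = 6000 \times \log_2 17 \approx 6000 \times 4.09 \approx 24.5\ \text{kbits/s}$$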
34. In Huffman coding, data in the tree always occur at:
35. In information theory, the entropy is:
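For reference, the standard definition for a discrete memoryless source with symbol probabilities $$p_i$$ is

$$H = -\sum_i p_i \log_2 p_i\ \text{bits/symbol},$$

i.e. the average information (uncertainty) per emitted symbol.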
36. What is the entropy of a communication system that consists of six messages with probabilities $$\frac{1}{8},\,\frac{1}{8},\,\frac{1}{8},\,\frac{1}{8},\,\frac{1}{4}$$ and $$\frac{1}{4}$$ respectively?
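A worked sketch, assuming the six probabilities $$\tfrac{1}{8},\tfrac{1}{8},\tfrac{1}{8},\tfrac{1}{8},\tfrac{1}{4},\tfrac{1}{4}$$ (the only completion of the list that sums to 1 for six messages):

$$H = 4 \times \frac{1}{8}\log_2 8 + 2 \times \frac{1}{4}\log_2 4 = 1.5 + 1 = 2.5\ \text{bits/message}$$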
37. Which one of the following statements is correct?
Shannon's theorem on channel capacity indicates that, in theory,
38. An 8-level encoding scheme is used in a PCM system with a channel bandwidth of 10 kHz. The channel capacity is:
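A worked sketch, assuming a noiseless channel so that the Nyquist rate applies, with $$M = 8$$ levels and $$B = 10\ \text{kHz}$$:

$$C = 2B\log_2 M = 2 \times 10^4 \times 3 = 60\ \text{kbits/s}$$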
39. A zero-memory source emits six messages with probabilities 0.3, 0.25, 0.15, 0.12, 0.1 and 0.08. If binary Huffman coding is used, what will be the average code length?
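A minimal sketch of the Huffman computation in Python (the function name and structure are illustrative, not from the original): it builds the binary tree by repeatedly merging the two least-probable subtrees and returns the average length $$\bar{L} = \sum_i p_i \ell_i$$.

```python
import heapq

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for `probs`."""
    # Heap entries: (subtree probability, tiebreaker, leaves as (p, depth) pairs).
    heap = [(p, i, [(p, 0)]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, leaves1 = heapq.heappop(heap)
        p2, _, leaves2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = [(p, d + 1) for p, d in leaves1 + leaves2]
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    _, _, leaves = heap[0]
    return sum(p * d for p, d in leaves)

print(huffman_avg_length([0.3, 0.25, 0.15, 0.12, 0.1, 0.08]))  # 2.45
```

For these six probabilities it prints 2.45 bits/symbol, slightly above the source entropy of about 2.42 bits, as the noiseless coding theorem requires.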
40. Block codes are . . . . . . . . codes that enable a limited number of errors to be detected and corrected without retransmission.