52.
We wish to transmit a digital voice signal at 32 kbps through a communication channel having a bandwidth of 3000 Hz. The received signal-to-noise power ratio is 30 dB. State which of the following statements is true:
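A quick check, assuming the Shannon-Hartley capacity theorem governs the channel: 30 dB corresponds to a power ratio of 1000, so
$$C = B\log_2(1 + SNR) = 3000\,\log_2(1 + 1000) \approx 29.9\ \text{kbps},$$
which is below the required 32 kbps.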

54.
Source S1 produces 4 discrete symbols with equal probability.
Source S2 produces 6 discrete symbols with equal probability.
If H1 and H2 are the entropies of sources S1 and S2 respectively, then which one of the following is correct?
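As a worked check, the entropy of a source with M equally likely symbols is $\log_2 M$ bits/symbol:
$$H_1 = \log_2 4 = 2,\qquad H_2 = \log_2 6 \approx 2.585,$$
so $H_2 > H_1$.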

55.
In a communication system, a given rate of information transmission requires channel bandwidth B1 and signal-to-noise ratio SNR1. If the channel bandwidth is doubled for the same rate of information, then the new signal-to-noise ratio will be
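A sketch of the reasoning, assuming the Shannon capacity is held constant when the bandwidth is doubled:
$$B_1\log_2(1 + SNR_1) = 2B_1\log_2(1 + SNR_2)\ \Rightarrow\ SNR_2 = \sqrt{1 + SNR_1} - 1 \approx \sqrt{SNR_1}.$$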

56.
The probabilities of the five possible outcomes of an experiment are given as: $$\frac{1}{2},\,\frac{1}{4},\,\frac{1}{8},\,\frac{1}{{16}},\,\frac{1}{{16}}$$
If the source produces 16 symbols per second, then the rate of information would be equal to
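A worked check, using $H = -\sum_i p_i\log_2 p_i$ and $R = rH$:
$$H = \tfrac{1}{2}(1) + \tfrac{1}{4}(2) + \tfrac{1}{8}(3) + \tfrac{1}{16}(4) + \tfrac{1}{16}(4) = 1.875\ \text{bits/symbol},\qquad R = 16 \times 1.875 = 30\ \text{bits/s}.$$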

57.
A discrete zero-memory information source has 40 symbols, each equally likely. The minimum number of bits required to code the source with a uniform-length code and the entropy of the source are, respectively,
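A worked check: a uniform-length code for 40 symbols needs $\lceil\log_2 40\rceil = 6$ bits per codeword, while the entropy of the equiprobable source is $\log_2 40 \approx 5.32$ bits/symbol.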
