11.
A discrete memoryless source has two symbols x1 and x2 with probabilities p(x1) and p(x2) respectively. The entropy of the source H(X) is
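For reference, the two-symbol entropy is $$H(X) = - p({x_1}){\log _2}p({x_1}) - p({x_2}){\log _2}p({x_2}).$$ A minimal Python sketch (the function name binary_entropy is illustrative, not part of the question):

import math

def binary_entropy(p1):
    """Entropy H(X) in bits for a two-symbol source with p(x1) = p1, p(x2) = 1 - p1."""
    p2 = 1.0 - p1
    # 0 * log(0) is taken as 0 by convention, so skip zero-probability terms
    terms = [p * math.log2(p) for p in (p1, p2) if p > 0]
    return -sum(terms)

print(binary_entropy(0.5))   # 1.0 bit, the maximum for two symbols
print(binary_entropy(0.25))  # about 0.811 bits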

12.
The capacity of each channel in FDMA is given by
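For context, FDMA divides a total bandwidth B equally among N users, so by the Shannon-Hartley theorem each channel supports roughly (B/N) log2(1 + S/N) bit/s. A minimal Python sketch, assuming equal bandwidth division and a common per-channel SNR (all names and numbers are illustrative):

import math

def fdma_channel_capacity(total_bandwidth_hz, num_users, snr_linear):
    """Per-user capacity in bit/s when the bandwidth is split equally in FDMA."""
    per_user_bw = total_bandwidth_hz / num_users
    return per_user_bw * math.log2(1.0 + snr_linear)

# Example: 1 MHz shared by 10 users at a linear SNR of 15
print(fdma_channel_capacity(1e6, 10, 15))  # 100 kHz * log2(16) = 400 kbit/s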

15.
Match List-I with List-II and select the correct answer using the options given below the lists:
List-I                   List-II
a. Entropy coding        1. McMillan's Rule
b. Channel capacity      2. Redundancy
c. Minimum length code   3. Shannon Fano
d. Equivocation          4. Shannon law
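As a concrete reminder of one List-II entry, Shannon Fano coding sorts symbols by probability and recursively splits them into two groups of nearly equal total probability, prefixing 0 to one group's codes and 1 to the other's. A minimal Python sketch (names and the example distribution are illustrative, not part of the question):

def shannon_fano(symbols):
    """symbols: list of (symbol, probability) pairs. Returns {symbol: code string}."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # choose the cut that makes the two halves' probabilities most nearly equal
        best_cut, best_diff, acc = 1, float("inf"), 0.0
        for i in range(1, len(group)):
            acc += group[i - 1][1]
            diff = abs(2 * acc - total)  # |P(left) - P(right)|
            if diff < best_diff:
                best_cut, best_diff = i, diff
        for s, _ in group[:best_cut]:
            codes[s] += "0"
        for s, _ in group[best_cut:]:
            codes[s] += "1"
        split(group[:best_cut])
        split(group[best_cut:])

    split(symbols)
    return codes

print(shannon_fano([("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]))
# -> {'a': '0', 'b': '10', 'c': '110', 'd': '111'}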

17.
Consider a noisy binary channel with bit error probability $${p_e} = 5 \times {10^{ - 5}}.$$ Assume 10000 bits are transmitted over such a channel. The probability that two or fewer bits are in error is
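The number of bit errors is binomial with n = 10000 and p = 5 × 10^-5, so the mean error count is np = 0.5 and a Poisson approximation is accurate. A minimal Python check, comparing the exact binomial sum with the Poisson approximation (names are illustrative):

import math

n, p = 10_000, 5e-5

# Exact binomial: P(X <= 2) = sum over k = 0, 1, 2 of C(n, k) p^k (1 - p)^(n - k)
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3))

# Poisson approximation with lambda = n * p = 0.5
lam = n * p
poisson = math.exp(-lam) * sum(lam**k / math.factorial(k) for k in range(3))

print(exact, poisson)  # both are approximately 0.9856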

18.
A communication channel with AWGN operating at a signal-to-noise ratio SNR >> 1 and bandwidth B has capacity $${C_1}.$$ If the SNR is doubled keeping B constant, the resulting capacity $${C_2}$$ is given by
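One way to see the relationship, using the high-SNR approximation of the Shannon-Hartley formula: $${C_1} = B{\log _2}(1 + SNR) \approx B{\log _2}SNR,$$ so doubling the SNR gives $${C_2} \approx B{\log _2}(2 \cdot SNR) = B{\log _2}SNR + B = {C_1} + B,$$ i.e. the capacity grows by B bit/s.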

19.
An event has two possible outcomes with probabilities $${P_1} = \frac{1}{2},\,{P_2} = \frac{1}{{64}}.$$ The rate of information with 16 outcomes per second is:
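The information rate is R = rH, where r is the number of outcomes per second and H the entropy per outcome. (As printed, the two probabilities do not sum to one; the usual textbook treatment of this question evaluates the entropy sum directly over the listed values.) A minimal Python computation:

import math

probs = [1/2, 1/64]
rate = 16  # outcomes per second

# H = sum of -p * log2(p) over the listed probabilities
H = -sum(p * math.log2(p) for p in probs)
print(H)         # 0.5 + 6/64 = 0.59375 bits per outcome
print(rate * H)  # 9.5 bits per second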

20.
An analog signal is band-limited to 4 kHz. It is sampled at the Nyquist rate and the samples are quantized into 4 levels. The quantization levels have probabilities $$\frac{1}{8},\,\frac{1}{8},\,\frac{3}{8}$$ and $$\frac{3}{8}.$$ The information rate of the source is:
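Sampling at the Nyquist rate gives r = 2 × 4 kHz = 8000 samples per second, and the entropy per sample follows from the four level probabilities. A minimal Python computation:

import math

probs = [1/8, 1/8, 3/8, 3/8]
sample_rate = 2 * 4000  # Nyquist rate for a 4 kHz signal, in samples per second

H = -sum(p * math.log2(p) for p in probs)
print(H)                # about 1.811 bits per sample
print(sample_rate * H)  # about 14,490 bits per second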
