71. The minimum distance of an (n, k) = (7, 4) linear block code is upper bounded by:
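As a quick check, the Singleton bound states that any (n, k) linear block code satisfies d_min ≤ n − k + 1. A minimal sketch for the (7, 4) case:

```python
# Singleton bound: d_min <= n - k + 1 for any (n, k) linear block code.
n, k = 7, 4
d_min_upper = n - k + 1
print(d_min_upper)  # 4
```

For comparison, the well-known (7, 4) Hamming code achieves d_min = 3, which is consistent with this bound of 4.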
72. For a discrete memoryless source containing K symbols, the upper bound on the entropy is
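Entropy is maximized when all K symbols are equiprobable, giving H_max = log₂(K) bits/symbol. A short sketch with an assumed K = 8:

```python
import math

# Entropy of a discrete memoryless source with K equiprobable symbols
# reaches the upper bound H_max = log2(K) bits/symbol.
K = 8  # assumed example value
H_max = math.log2(K)
print(H_max)  # 3.0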
73. Channel capacity is basically a measure of:
74. What is the capacity of an additive white Gaussian noise channel with a bandwidth of 1 MHz, power of 10 W, and noise power spectral density of N0/2 = 10⁻⁹ W/Hz?
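As a quick check, the Shannon capacity formula C = B log₂(1 + S/N) can be evaluated with the given numbers, where total noise power is N = N0 × B:

```python
import math

B = 1e6          # bandwidth, Hz
P = 10.0         # signal power, W
N0 = 2e-9        # W/Hz, since N0/2 = 1e-9 W/Hz
noise = N0 * B   # total in-band noise power = 2e-3 W
snr = P / noise  # = 5000
C = B * math.log2(1 + snr)
print(round(C / 1e6, 2))  # ~12.29 Mb/s
```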
75. A zero-memory source generates two messages with probabilities 0.8 and 0.2. These are coded as 1 and 0 respectively. The code efficiency is
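Code efficiency is the ratio of source entropy to average codeword length, η = H / L. With one-bit codewords, L = 1, so the efficiency equals the entropy:

```python
import math

p = [0.8, 0.2]
# Source entropy in bits/symbol
H = -sum(pi * math.log2(pi) for pi in p)
L = 1.0            # average codeword length: both codewords are 1 bit
eta = H / L
print(round(eta * 100, 1))  # ~72.2 (%)
```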
76. For Gaussian white channel noise, the capacity of a low-pass channel with a usable bandwidth of 3000 Hz and S/N = 10³ at the channel output will be
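Again applying C = B log₂(1 + S/N) with the stated values:

```python
import math

B = 3000       # bandwidth, Hz
snr = 1e3      # linear S/N at the channel output
C = B * math.log2(1 + snr)
print(round(C))  # ~29902 bit/s, i.e. about 30 kb/s
```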
77. The entropy for a fair coin toss is exactly
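For a fair coin, both outcomes have probability 0.5, and the binary entropy works out to exactly 1 bit:

```python
import math

p = 0.5
# Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p)
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(H)  # 1.0 bit
```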
78. Given a channel with an intended capacity of 20 Mbit/s and a bandwidth of 3 MHz, what S/N ratio is required to achieve this capacity?
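Inverting the Shannon capacity formula gives S/N = 2^(C/B) − 1:

```python
import math

C = 20e6   # target capacity, bit/s
B = 3e6    # bandwidth, Hz
snr = 2 ** (C / B) - 1
print(round(snr, 1))                    # ~100.6 (linear)
print(round(10 * math.log10(snr), 1))   # ~20.0 dB
```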
79. An 8 kHz communication channel has an SNR of 30 dB. If the channel bandwidth is doubled, keeping the signal power constant, the SNR for the modified channel will be
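Doubling the bandwidth doubles the in-band noise power (N = N0 × B), so with fixed signal power the linear SNR is halved, a drop of about 3 dB:

```python
import math

snr_db = 30.0
snr = 10 ** (snr_db / 10)   # linear SNR = 1000
# Noise power doubles with bandwidth, so SNR is halved.
snr_new = snr / 2            # 500
print(round(10 * math.log10(snr_new)))  # ~27 dB
```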
80. Which of the following channel coding schemes helps in correcting burst errors?