81. In a communication system, each message bit (0 or 1) is transmitted three times in order to reduce the probability of error. Detection at the receiver is based on the majority rule. If $$p_e$$ is the probability of bit error, the probability of error for this communication system is
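A worked sketch (my own check, not part of the question): with three repetitions, the majority decision is wrong when two or all three copies are flipped, giving $$3p_e^2(1 - p_e) + p_e^3 = 3p_e^2 - 2p_e^3.$$ A brute-force enumeration over the flip patterns confirms the closed form:

```python
from itertools import product

def majority_error(pe):
    """Closed form: wrong decision when >= 2 of the 3 copies are flipped."""
    return 3 * pe**2 * (1 - pe) + pe**3  # = 3*pe**2 - 2*pe**3

def majority_error_enum(pe):
    """Brute force over all 2^3 flip patterns of the three transmissions."""
    total = 0.0
    for flips in product([0, 1], repeat=3):
        p = 1.0
        for f in flips:
            p *= pe if f else (1 - pe)
        if sum(flips) >= 2:  # majority of copies corrupted -> error
            total += p
    return total

print(majority_error(0.1))       # ~0.028
print(majority_error_enum(0.1))  # matches the closed form
```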
82. Consider a source with four symbols. The entropy of the source will be maximum when the probabilities of occurrence of the symbols are:
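As a quick numeric check (an illustrative sketch, not from the source): the entropy of a 4-symbol source peaks at $$\log_2 4 = 2$$ bits when all symbols are equiprobable, and any skewed distribution gives strictly less:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4                 # equiprobable symbols
skewed = [0.5, 0.25, 0.125, 0.125]   # an arbitrary non-uniform example

print(entropy(uniform))  # 2.0 bits -- the maximum, log2(4)
print(entropy(skewed))   # 1.75 bits, strictly less
```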
83. A source generates one of the five symbols s1, s2, s3, s4 and s5 once in every $$\frac{1}{{60}}$$ second. The symbols are assumed to be independent and occur with probabilities $$\frac{1}{4},\,\frac{1}{4},\,\frac{1}{4},\,\frac{1}{8}\,\text{and}\,\frac{1}{8}.$$ The average information rate of the source in bits/second is:
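A worked sketch of the computation (my own, using the symbol probabilities above): the source emits 60 symbols per second, so the average information rate is 60 times the per-symbol entropy:

```python
import math

symbol_rate = 60  # one symbol every 1/60 s -> 60 symbols/s
probs = [1/4, 1/4, 1/4, 1/8, 1/8]

H = -sum(p * math.log2(p) for p in probs)  # entropy in bits/symbol
rate = symbol_rate * H                     # average information rate

print(H, rate)  # 2.25 135.0
```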
84. The relation between entropy and mutual information is
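For reference, the standard textbook identities linking mutual information and entropy (stated here as background, not taken from this page) are:

$$I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y)$$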
85. The blocking probability of a 3-stage switch in terms of inlet utilization p is:
86. A communication channel with additive white Gaussian noise has a bandwidth of 4 kHz and an SNR of 31 dB. Its channel capacity is
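A hedged worked example (my own sketch): the Shannon–Hartley formula is $$C = B\log_2(1 + S/N)$$ with S/N as a linear power ratio. In the classic version of this problem the ratio is 31, so that $$1 + S/N = 32$$ and C comes out to exactly 20 kbps; if the 31 is genuinely in dB, the linear ratio is $$10^{3.1} \approx 1259$$ and C is roughly 41.2 kbps:

```python
import math

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity: C = B * log2(1 + S/N), S/N linear."""
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 4_000  # 4 kHz

print(capacity(B, 31))                   # 20000.0 bits/s if S/N = 31 (linear)
snr_db = 31
print(capacity(B, 10 ** (snr_db / 10)))  # ~41.2 kbits/s if S/N = 31 dB
```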
87. Discrete source S1 has 4 equiprobable symbols, while discrete source S2 has 16 equiprobable symbols. When the entropies of the two sources are compared, the entropy of
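A quick check (illustrative, not from the source): for M equiprobable symbols the entropy is $$\log_2 M$$, so S2's entropy is exactly twice S1's:

```python
import math

h1 = math.log2(4)   # source S1: 4 equiprobable symbols
h2 = math.log2(16)  # source S2: 16 equiprobable symbols

print(h1, h2)  # 2.0 4.0 -> S2 has twice the entropy of S1
```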
88. Information is
89. Which statement is not correct?
90. A communication channel has a bandwidth of 100 MHz. The channel is so noisy that the signal power is far below the noise power. What is the capacity of this channel?
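A sketch of the limiting behaviour (my own illustration, with assumed S/N values): for $$S/N \ll 1$$, $$C = B\log_2(1 + S/N) \approx 1.44\,B\,(S/N)$$, so however large the bandwidth, the capacity falls to zero along with the signal power:

```python
import math

B = 100e6  # 100 MHz bandwidth

for snr in (1e-2, 1e-4, 1e-6):
    exact = B * math.log2(1 + snr)  # Shannon-Hartley, S/N linear
    approx = 1.44 * B * snr         # low-SNR linearisation, log2(e) ~ 1.44
    print(snr, exact, approx)
# As S/N -> 0 the capacity tends to 0 despite the 100 MHz bandwidth.
```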