1. The entropy of a random variable X is 16 bits, and for each value of X a deterministic function Y(X) produces a distinct value. The entropy of Y is ________ bits. (see the sketch after this list)
2. The Shannon-Hartley theorem sets a limit on the
3. The information content of a message with probability P is defined by the equation (see the note after this list)
4. A given source will have maximum entropy if all the messages produced are
5. The significance of Shannon's channel coding theorem lies in the fact that
6. For a band-limited white Gaussian channel with a bandwidth of 8 kHz and an SNR of 25 dB, the channel capacity is: (see the sketch after this list)
7. How many bits are required to permit the selection of 1 out of 16 equiprobable events?
8. By properly coding and grouping longer sequences, it is possible to
9. The channel capacity is
10. A binary symmetric channel (BSC) has a transition probability of $$\frac{1}{8}$$. If the binary transmitted symbol X is such that $$P(X = 0) = \frac{9}{10}$$, then the probability of error for an optimum receiver will be (see the sketch after this list)
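For question 1, a short derivation may help, assuming "distinct value" means Y(X) is one-to-one: an injective deterministic map merely relabels the outcomes of X, so the probability masses carry over unchanged and

$$H(Y) = -\sum_{y} p_Y(y)\log_2 p_Y(y) = -\sum_{x} p_X(x)\log_2 p_X(x) = H(X) = 16 \text{ bits}.$$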
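Questions 3, 4 and 7 all turn on the definition of self-information and its consequence for equiprobable messages. As a compact reminder (standard definitions, not part of the original quiz):

$$I = \log_2\frac{1}{P} = -\log_2 P \text{ bits}, \qquad H_{\max} = \log_2 M \text{ for } M \text{ equiprobable messages}.$$

For question 7, $$M = 16$$ gives $$\log_2 16 = 4$$ bits.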
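Questions 2 and 6 both rest on the Shannon-Hartley theorem, $$C = B\log_2\left(1 + \tfrac{S}{N}\right),$$ which bounds the rate of error-free transmission over a band-limited Gaussian channel. Below is a minimal Python sketch of the question-6 arithmetic (the helper name shannon_capacity is my own):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits per second."""
    snr_linear = 10 ** (snr_db / 10)          # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Question 6: B = 8 kHz, SNR = 25 dB
c = shannon_capacity(8_000, 25)
print(f"C = {c:.0f} bit/s (about {c / 1000:.1f} kbit/s)")  # ~66.5 kbit/s
```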
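Question 10 is a MAP-detection exercise. A sketch (variable names are my own) that enumerates the optimum decision for each received bit and tallies the resulting error probability; the prior here is skewed enough that the MAP receiver decides 0 regardless of what it receives, so the error probability collapses to P(X = 1):

```python
from itertools import product

p = 1 / 8                       # BSC crossover (transition) probability
prior = {0: 9 / 10, 1: 1 / 10}  # P(X=0), P(X=1)

def likelihood(y: int, x: int) -> float:
    """P(Y=y | X=x) for a binary symmetric channel."""
    return 1 - p if y == x else p

# MAP rule: for each received y, pick the x maximizing prior * likelihood.
decision = {y: max((0, 1), key=lambda x: prior[x] * likelihood(y, x))
            for y in (0, 1)}

# Error probability: mass of all (x, y) pairs where the decision misses x.
p_err = sum(prior[x] * likelihood(y, x)
            for x, y in product((0, 1), repeat=2)
            if decision[y] != x)
print(decision, round(p_err, 4))  # {0: 0, 1: 0} and 0.1
```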