1. The entropy of a random variable X is 16 bits, and for each value of X a deterministic function Y(X) produces a distinct value. Then the entropy of Y is . . . . . . . . bits.
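
For illustration only (not part of the original question), the relationship between H(X) and H(Y) for a deterministic, one-to-one mapping Y = f(X) can be checked numerically. The distribution and mapping below are made up for this sketch:

```python
from collections import Counter
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical 4-symbol distribution (illustrative only, not the 16-bit source
# from the question).
p_x = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Deterministic, one-to-one mapping Y = f(X): every X value maps to a distinct
# Y value, so the distribution of Y is just a relabelling of that of X.
f = {"a": 10, "b": 20, "c": 30, "d": 40}
p_y = Counter()
for x, p in p_x.items():
    p_y[f[x]] += p

print(entropy(p_x.values()))  # 1.75 bits
print(entropy(p_y.values()))  # 1.75 bits -- H(Y) = H(X) when f is one-to-one
```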

2. The Shannon-Hartley theorem sets a limit on the . . . . . . . .
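
For context (not part of the original question), the Shannon-Hartley theorem gives the capacity of a band-limited channel with additive white Gaussian noise as

$$C = B \log_2\left(1 + \frac{S}{N}\right) \ \text{bits/s},$$

where B is the bandwidth in Hz and S/N is the signal-to-noise ratio; the theorem therefore bounds the rate at which data can be transmitted with arbitrarily small error over such a channel.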

3. The information content of a message with probability P is defined by the equation . . . . . . . .
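
For reference, the self-information (information content) of a message that occurs with probability P is conventionally defined as

$$I(P) = \log_2\frac{1}{P} = -\log_2 P \ \text{bits}.$$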

9. The channel capacity is . . . . . . . .
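
For context, channel capacity is the maximum mutual information between channel input and output, taken over all input distributions, $$C = \max_{p(x)} I(X;Y)$$. A minimal sketch for the special case of a binary symmetric channel, whose capacity is 1 minus the binary entropy of the crossover probability, is shown below (the crossover value 0.125 is purely illustrative):

```python
import math

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H_b(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.125))  # roughly 0.456 bits per channel use
```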

10. A binary symmetric channel (BSC) has a transition probability of $$\frac{1}{8}$$. If the binary transmit symbol X is such that $$P(X = 0) = \frac{9}{10}$$, then the probability of error for an optimum receiver will be . . . . . . . .
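
One way to reason about this kind of question is to evaluate the MAP (optimum) decision rule numerically. The sketch below uses the priors and crossover probability stated in the question and assumes "transition probability" means the crossover probability of the BSC:

```python
# Values taken from the question; "transition probability" is read as the
# crossover probability of the BSC.
priors = {0: 9/10, 1: 1/10}   # P(X = 0), P(X = 1)
eps = 1/8                     # probability that the channel flips the bit

def likelihood(y, x):
    """P(Y = y | X = x) for a binary symmetric channel."""
    return eps if y != x else 1 - eps

p_error = 0.0
for y in (0, 1):
    # MAP (optimum) rule: pick the x that maximizes P(X = x) * P(Y = y | X = x).
    x_hat = max(priors, key=lambda x: priors[x] * likelihood(y, x))
    # Add the probability that some other symbol was sent while y was received.
    p_error += sum(priors[x] * likelihood(y, x) for x in priors if x != x_hat)

print(p_error)  # 0.1 with these priors (the MAP rule always decides 0 here)
```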
