64.
Consider a binary transmission of bits 0 and 1 with equal probability. The bits are sent over a noisy channel whose noise is additive, so the received signal is random, with conditional probability density functions for transmitted bits 0 and 1, respectively,
$$f_{r|0}(x) = 1 - |x|, \quad |x| < 1$$
$$f_{r|1}(x) = 1 - |x - 1|, \quad 0 < x < 2$$
What is the average probability of error if the threshold of the decision device at the receiver is set to 1?
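A short numerical sketch (my own check, not part of the question) can verify the result; it assumes the receiver decides "1" when the received value is at or above the threshold and "0" otherwise, with the bits equiprobable as stated.

from scipy.integrate import quad

def f_given_0(x):
    return max(0.0, 1 - abs(x))        # triangular pdf centred at 0, support |x| < 1

def f_given_1(x):
    return max(0.0, 1 - abs(x - 1))    # triangular pdf centred at 1, support 0 < x < 2

threshold = 1.0
p_err_given_0, _ = quad(f_given_0, threshold, 1.0)   # "0" sent but decided as "1"
p_err_given_1, _ = quad(f_given_1, 0.0, threshold)   # "1" sent but decided as "0"

p_err = 0.5 * p_err_given_0 + 0.5 * p_err_given_1
print(p_err)   # 0.25 under these assumptions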

67.
The capacity of a band-limited additive white Gaussian noise (AWGN) channel is given by $$C = W\log_2\left(1 + \frac{P}{\sigma^2 W}\right)$$ bits per second (bps), where W is the channel bandwidth, P is the average received power and $\sigma^2$ is the one-sided power spectral density of the AWGN.
For a fixed $$\frac{P}{\sigma^2} = 1000,$$ the channel capacity (in kbps) in the limit of infinite bandwidth $$(W \to \infty)$$ is approximately
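As W grows, the capacity approaches the limit $(P/\sigma^2)\log_2 e$. A brief sketch (my own check, not from the question) evaluates the capacity formula for increasing W and compares it with that limit.

import math

snr_density = 1000.0   # the fixed P / sigma^2 given in the question

def capacity(W):
    # C = W * log2(1 + (P / sigma^2) / W), in bits per second
    return W * math.log2(1 + snr_density / W)

for W in (1e3, 1e4, 1e5, 1e6):
    print(f"W = {W:9.0f} Hz -> C = {capacity(W):7.1f} bps")

# Infinite-bandwidth limit: (P / sigma^2) * log2(e) = 1000 / ln(2)
print("Limit:", snr_density / math.log(2), "bps")   # ~1442.7 bps, i.e. ~1.44 kbps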

68.
Which one of the following is correct?

70.
The entropy of a digital source is 2.7 bits/symbol, and it produces 100 symbols per second. Which one of the following is the source likely to be?
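The relevant quantities are the symbol rate and the entropy per symbol. A brief sketch (my own illustration, not part of the question) computes the resulting information rate and, for comparison, the size of an equiprobable alphabet with the same entropy.

H = 2.7    # entropy in bits per symbol, from the question
r = 100    # symbols per second, from the question

information_rate = r * H
print("Information rate:", information_rate, "bits/s")   # 270 bits/s

# A source with n equally likely symbols has entropy log2(n), so the
# equivalent equiprobable alphabet size here is 2**H.
print("Equivalent equiprobable alphabet size:", 2 ** H)  # ~6.5 symbols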
