1 | Discuss Shannon’s channel capacity theorem and the channel capacity in the limit of infinite bandwidth. Show that the channel capacity is always finite for finite signal and noise power.
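The infinite-bandwidth behaviour asked about here can be checked numerically. A minimal Python sketch (the signal power S and one-sided noise PSD N0 below are illustrative values, not part of the question) evaluating C = B·log2(1 + S/(N0·B)) against the limit C∞ = (S/N0)·log2 e:

```python
import math

def capacity(B, S, N0):
    """Shannon capacity C = B * log2(1 + S / (N0 * B)) in bits/s."""
    return B * math.log2(1.0 + S / (N0 * B))

S, N0 = 1.0, 1e-3                          # illustrative signal power and noise PSD
c_limit = (S / N0) * math.log2(math.e)     # infinite-bandwidth limit (S/N0) * log2(e)

# Capacity grows with bandwidth but saturates at the finite limit c_limit
for B in (1e3, 1e6, 1e9):
    print(f"B = {B:.0e} Hz -> C = {capacity(B, S, N0):.2f} bits/s")
print(f"limit C_inf = {c_limit:.2f} bits/s")
```

As B grows the capacity increases monotonically yet never exceeds C∞, which is the point of the "always finite" part of the question.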

2 | Define the entropy of a discrete memoryless source emitting M symbols and discuss the properties of entropy. A zero-memory source emits messages m1 and m2 with probabilities 0.8 and 0.2, respectively. Find the optimum binary compact code for this source and its second-order extension. Determine the code efficiencies in each case.
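The Huffman construction and the extension-source bookkeeping asked for here can be automated. A minimal Python sketch (standard library only; the function names are illustrative) that builds an optimum binary code for the n-th order extension of a source and reports the efficiency H/L:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Codeword lengths of an optimum binary (Huffman) code for `probs`."""
    # Heap entries: (probability, tiebreak, symbol indices under this node)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = itertools.count(len(probs))
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1            # every merge adds one bit to each leaf below
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

def efficiency(probs, n=1):
    """Code efficiency H/L for the n-th order extension of the source."""
    ext = [math.prod(c) for c in itertools.product(probs, repeat=n)]
    H = -sum(p * math.log2(p) for p in ext)                 # entropy of extension
    L = sum(p * l for p, l in zip(ext, huffman_lengths(ext)))
    return H / L

print(efficiency([0.8, 0.2], 1))   # ≈ 0.722
print(efficiency([0.8, 0.2], 2))   # ≈ 0.926
print(efficiency([0.8, 0.2], 3))   # ≈ 0.992
```

For this source the efficiency climbs from about 72% for the single-symbol code toward 100% as the extension order grows, which is the behaviour the question is probing.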

3 | Explain mutual information in detail and state its properties.

4 | Explain delta modulation in detail. Also discuss the advantages and disadvantages of delta modulation.
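A delta modulator and its staircase approximation can be illustrated in a few lines. A sketch, assuming a single-integrator encoder with a fixed step size (the step value and the sinusoidal test signal are arbitrary choices made for the example):

```python
import math

def delta_modulate(samples, step):
    """1-bit delta modulation: transmit the sign of (sample - estimate)."""
    bits, est = [], 0.0
    for x in samples:
        b = 1 if x >= est else 0
        est += step if b else -step    # staircase steps up or down to track input
        bits.append(b)
    return bits

def delta_demodulate(bits, step):
    """Rebuild the staircase approximation from the received bit stream."""
    est, out = 0.0, []
    for b in bits:
        est += step if b else -step
        out.append(est)
    return out

# Illustrative input: two periods of a slow sinusoid (slope < step, so no
# slope overload; the residual error is granular noise of order one step).
samples = [math.sin(2 * math.pi * k / 32) for k in range(64)]
bits = delta_modulate(samples, step=0.2)
approx = delta_demodulate(bits, step=0.2)
```

If the step were made much smaller than the signal's per-sample slope, the staircase could no longer keep up (slope overload); a much larger step would inflate the granular noise, which is the classic design trade-off of delta modulation.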

5 | A binary channel matrix is given by

         y1    y2
    x1   2/3   1/3
    x2   1/3   2/3

where x1, x2 are the inputs and y1, y2 are the outputs, with P(x1) = 1/2 and P(x2) = 1/2. Determine H(X), H(Y), H(X|Y), H(Y|X) and I(X;Y).
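The five quantities in this problem can be verified numerically. A short Python sketch (standard library only) that forms the joint distribution from the given channel matrix and input probabilities:

```python
import math

def H(probs):
    """Entropy in bits of a probability vector (zero terms skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [0.5, 0.5]
P = [[2/3, 1/3],            # rows are P(y|x) for x1, x2
     [1/3, 2/3]]

# Joint distribution P(x, y) and output marginal P(y)
pxy = [[px[i] * P[i][j] for j in range(2)] for i in range(2)]
py = [sum(pxy[i][j] for i in range(2)) for j in range(2)]

HX = H(px)                                        # = 1.0 bit
HY = H(py)                                        # = 1.0 bit
HYgX = sum(px[i] * H(P[i]) for i in range(2))     # H(Y|X) ≈ 0.918 bit
HXY = H([pxy[i][j] for i in range(2) for j in range(2)])
HXgY = HXY - HY                                   # H(X|Y) ≈ 0.918 bit
IXY = HY - HYgX                                   # I(X;Y) ≈ 0.082 bit
```

With equiprobable inputs this doubly symmetric channel gives H(X|Y) = H(Y|X), and I(X;Y) = H(Y) − H(Y|X) comes out to roughly 0.082 bit per symbol.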

6 | “The power spectral density and the correlation function of a periodic waveform are a Fourier transform pair” Justify. |
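The Fourier-pair relationship asserted here (the Wiener–Khinchin theorem) can be demonstrated numerically for one period of a sampled periodic waveform. A pure-Python sketch (the waveform values are arbitrary, and the DFT is written out directly rather than taken from an FFT library):

```python
import cmath

def dft(x):
    """Direct discrete Fourier transform of a length-N sequence."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# One full period of an arbitrary real periodic waveform
x = [1.0, 2.0, 0.0, -1.0]
N = len(x)

# Circular (periodic) autocorrelation R[m] = (1/N) * sum_n x[n] * x[(n+m) % N]
R = [sum(x[n] * x[(n + m) % N] for n in range(N)) / N for m in range(N)]

# PSD computed two ways: directly as |X[k]|^2 / N, and as the DFT of R
psd_direct = [abs(Xk) ** 2 / N for Xk in dft(x)]
psd_from_R = [Sk.real for Sk in dft(R)]
```

The two PSD lists agree to within floating-point round-off, which is exactly the transform-pair statement the question asks to justify.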

7 | Derive the equation for the channel capacity of a binary symmetric channel (BSC).

8 | Write a short note on the optimum binary receiver.

9 | Find the channel capacity of the Binary-Symmetric Channel (BSC). |
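The standard closed-form answer, C = 1 − H(p) with H the binary entropy function and p the crossover probability, is easy to sanity-check. A minimal sketch:

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p: C = 1 - H(p)."""
    return 1.0 - Hb(p)
```

The extremes behave as expected: a noiseless channel (p = 0 or p = 1) carries 1 bit per use, while p = 1/2 gives zero capacity because the output is independent of the input.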

10 | A zero-memory source emits messages m1 and m2 with probabilities 0.8 and 0.2, respectively. Find the optimum (Huffman) binary code for this source as well as for its second- and third-order extensions (that is, for N = 2 and 3). Determine the code efficiencies for each code.

11 | Derive the channel capacity C if the channel noise is additive white Gaussian with mean-square value N, given signal power S.
