Monday, May 20, 2019

Communication theory

2. Explain what is meant by narrowband FM and wideband FM.
3. Explain any two techniques of demodulation of FM.
4. Explain the working of the reactance tube modulator and derive an expression to show how the variation of the amplitude of the input signal changes the frequency of the output signal of the modulator.
5. Draw the frequency spectrum of FM and explain it. Explain how a varactor diode can be used for frequency modulation.
6. Discuss the indirect method of generating a wide-band FM signal.
7. Draw the circuit diagram of the Foster-Seeley discriminator and explain its working.

1. Define internal noise.
2. Define shot noise.
3. Define thermal noise.
4. Define narrowband noise.
5. Define noise figure.
6. Define noise equivalent bandwidth.
7. Define a random variable. Specify the sample space and the random variable for a coin tossing experiment.
8. What is white noise? Give its characteristics.
9. When is a random process called deterministic?
10. Define flicker noise.
11. State the reasons for higher noise in mixers.

1. Derive the effective noise temperature of a cascade amplifier and explain how the various noises are generated and the method of representing them.
2. Explain the following: (i) random variable, (ii) random process, (iii) Gaussian process.
3. Explain how the various noises are generated and the method of representing them.
4. Write notes on noise temperature and noise figure.
5. Derive the noise figure for cascaded stages.
6. What is narrowband noise? Discuss the properties of the quadrature components of narrowband noise.
7. Write short notes on thermal noise and shot noise.
8. Explain in detail white noise and filtered noise.

2. Define image frequency.
3. Define tracking.
4. What is meant by the figure of merit (FOM) of a receiver?
5. What is the threshold effect?
6. Draw the phasor representation of FM noise.
8. Define SNR.
9. What is the SNR at the output of a DSB system with coherent demodulation?
10. Define CSNR.
11. What are the sensitivity and selectivity of a receiver?

1. Explain the working of the superheterodyne receiver with its parameters.
2. Discuss the noise performance of an AM system using envelope detection.
3. Compare the noise performance of AM and FM systems.
4. Determine the noise power of a DSB-SC system using coherent detection.
5. Discuss in detail the noise performance of an SSB-SC receiver.
6. Explain the significance of pre-emphasis and de-emphasis in an FM system.
7. Derive the noise power spectral density of the FM demodulator and explain its performance with a diagram.
8. a. Draw the block diagram of an FM demodulator and explain the effect of noise in detail. b. Explain the FM threshold effect and the capture effect in FM.

UNIT V INFORMATION THEORY

1. What is a prefix code?
2. Define entropy rate.
3. What is the channel efficiency of a binary symmetric channel with an error probability of ...?
4. State the channel coding theorem.
5. Define entropy for a discrete memoryless source.
6. What is code redundancy?
7. Write down the formula for the mutual information.
8. Name the source coding techniques.
9. What is data compaction?
10. Write the expression for code efficiency in terms of entropy.
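For quick revision, these are the standard textbook forms of the quantities asked for in questions 3, 7 and 10 above (entropy of a discrete memoryless source, mutual information, capacity of a binary symmetric channel with crossover probability p, and code efficiency):

H(X) = -\sum_{i=1}^{M} p_i \log_2 p_i   bits/symbol

I(X;Y) = H(X) - H(X|Y) = \sum_{x}\sum_{y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}

C_{BSC} = 1 + p \log_2 p + (1-p) \log_2 (1-p)   bits/channel use

\eta = H(X) / \bar{L},  where  \bar{L} = \sum_{i=1}^{M} p_i l_i

Here l_i is the length of the codeword assigned to the i-th source symbol.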
PART-B (16 Marks)

1. Explain the significance of the entropy H(X|Y) of a communication system where X is the transmitter and Y is the receiver.
2. An event has six possible outcomes with probabilities ... Find the entropy of the system.
3. Discuss the source coding theorem, give the advantages and disadvantages of channel coding in detail, and discuss data compaction.
4. Explain the properties of entropy and, with a suitable example, explain the entropy of a binary memoryless source.
5. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below. Encode the symbols using Huffman coding.
6. Write short notes on differential entropy, derive the channel capacity theorem and discuss the implications of the information capacity theorem.
7. What do you mean by a binary symmetric channel? Derive the channel capacity formula for a symmetric channel.
8. Construct a binary optimal code for the following symbols and probabilities using Huffman coding, and calculate the entropy of the source, average code length, efficiency, redundancy and variance.
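Questions 5 and 8 ask for a Huffman code plus the entropy, average code length, efficiency, redundancy and variance. The probability values themselves are not listed in this post, so the short Python sketch below uses made-up placeholder probabilities for a five-symbol source; substitute the actual values from the question to check a hand calculation.

import heapq
from math import log2

def huffman_code(probs):
    """Return a dict mapping each symbol to its binary Huffman codeword."""
    # Heap entries: (probability, tie-breaker, list of (symbol, partial code)).
    heap = [(p, i, [(sym, "")]) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, group1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, group2 = heapq.heappop(heap)
        # Prefix '0' to one subtree and '1' to the other, then merge them.
        merged = [(s, "0" + c) for s, c in group1] + [(s, "1" + c) for s, c in group2]
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return dict(heap[0][2])

if __name__ == "__main__":
    # Placeholder five-symbol source (probabilities sum to 1); not the exam values.
    probs = {"s1": 0.4, "s2": 0.2, "s3": 0.2, "s4": 0.1, "s5": 0.1}
    code = huffman_code(probs)

    entropy = -sum(p * log2(p) for p in probs.values())            # H(X), bits/symbol
    avg_len = sum(probs[s] * len(code[s]) for s in probs)          # average codeword length L
    efficiency = entropy / avg_len                                 # eta = H(X) / L
    redundancy = 1 - efficiency
    variance = sum(probs[s] * (len(code[s]) - avg_len) ** 2 for s in probs)

    for s in probs:
        print(f"{s}: p={probs[s]:.2f}  code={code[s]}")
    print(f"H(X)={entropy:.3f} bits, L={avg_len:.2f}, efficiency={efficiency:.3f}, "
          f"redundancy={redundancy:.3f}, variance={variance:.3f}")

The heapq-based merge is the standard greedy Huffman construction; ties are broken by insertion order, so the resulting codewords may differ from a hand-drawn tree while still giving the same optimal average length.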
