Digital Communications: Online Examinations [Mid 2 - DC] Question Bank

1. Two binary random variables X and Y are distributed according to the joint distribution given as P(X=0, Y=0) = P(X=0, Y=1) = P(X=1, Y=1) = 1/3. Then [01D01]: H(X) = H(Y) / H(X) = 2.H(Y) / H(Y) = 2.H(X) / H(X) + H(Y) = ...
2. An independent discrete source transmits letters from an alphabet consisting of A and B with respective probabilities 0.6 and 0.4. If the consecutive letters are statistically independent and two-symbol words are transmitted, the probability of the words with different symbols is [01D02]
3. A memoryless source emits 2000 binary symbols/sec and each symbol has a probability of 0.25 to be equal to 1 and 0.75 to be equal to 0. The minimum number of bits/sec required for error-free transmission of this source is [01M01]
4. Which of the following channel matrices represent a symmetric channel? [01M02]
5. The capacity of the channel with the channel matrix ..., where the xi's are transmitted messages and the yj's are received messages, is [01M03]: log 3 bits / log 5 bits / log 4 bits / 1 bit
6. Information rate of a source is [01S01]: the entropy of the source measured in bits/message / the entropy of the source measured in bits/sec / a measure of the uncertainty of the communication system / maximum when the source is continuous
7. If 'a' is an element of a field F, then its additive inverse is [01S02]: 0 / -a / a / 1
8. The minimum number of elements that a field can have is [01S03]
9. Which of the following is correct? [01S04]: The syndrome of a received block coded word depends on the transmitted code word / The syndrome of a received block coded word depends on the received code word / The syndrome of a received block coded word depends on the error pattern / The syndrome for a received block coded word under error-free reception consists of all 1's.
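Worked check for questions 2 and 3 (a minimal Python sketch using only the probabilities stated in the questions): the minimum error-free bit rate of a memoryless source is its symbol rate times its entropy, and the probability of a two-letter word with different symbols follows from independence.

```python
from math import log2

def entropy(probs):
    """Entropy of a discrete memoryless source in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Q3: binary source with P(1) = 0.25, P(0) = 0.75 at 2000 symbols/sec
H = entropy([0.25, 0.75])                       # ~0.811 bits/symbol
print(f"H = {H:.3f} bits/symbol, minimum rate = {2000 * H:.0f} bits/sec")

# Q2: two-letter words from A/B with P(A) = 0.6, P(B) = 0.4
p_diff = 0.6 * 0.4 + 0.4 * 0.6                  # AB or BA
print(f"P(two different symbols) = {p_diff:.2f}")
```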

10. If the output of a continuous source is limited to an average power of σ², then the maximum entropy of the source is [01S05]
11. A convolutional encoder of code rate 1/2 consists of a two-stage shift register. The generator sequence of the top adder is (1,1,1) and that of the bottom adder is (1,0,1). The constraint length of the encoder is [02D01]
12. The parity check matrix of a (6,3) systematic linear block code is .... If the syndrome vector computed for the received code word is [1 1 0], then for error correction, which of the bits of the received code word is to be complemented? [02D02]
13. The minimum number of bits per message required to encode the output of a source transmitting four different messages with probabilities 0.5, 0.25, 0.125 and 0.125 is [02M01]
14. A communication channel is represented by the channel matrix given as .... In the above matrix, rows correspond to the transmitter X and the columns correspond to the receiver Y. Then, the conditional entropy H(Y/X) in bits/message is [02M02]: zero / log 5 / log 3 / ...
15. The channel matrix of a noiseless channel [02M03]: consists of a single nonzero number in each column / consists of a single nonzero number in each row / is an identity matrix / is a square matrix
16. Entropy of a source is [02S01]: average amount of information conveyed by the communication system / average amount of information transferred by the channel / average amount of information available with the source / average amount of information conveyed by the source to the receiver
17. Relative to hard decision decoding, soft decision decoding results in [02S02]: better bit error probability / better coding gain / less circuit complexity / lesser coding gain
18. Which of the following is the essential requirement of a source coding scheme? [02S03]: Comma-free nature of the code words / Minimum Hamming distance of 3 / Error detection and correction capability / The received code word should be compatible with a matched filter
19. The transition probabilities for a BSC will be represented using [02S04]: Joint probability matrix / State diagram / Conditional probability matrix / Trellis diagram
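The channel matrix for question 14 did not survive the transcription, so the sketch below only illustrates how H(Y/X) is computed from a transition matrix whose rows are P(y|x); the 2x3 matrix and source probabilities used here are assumed examples, not the values from the question.

```python
from math import log2

def cond_entropy_y_given_x(p_x, P):
    """H(Y/X) in bits/message: p_x are source probabilities,
    P[i][j] = P(y_j | x_i), each row of P summing to 1."""
    return sum(px * sum(-p * log2(p) for p in row if p > 0)
               for px, row in zip(p_x, P))

p_x = [0.5, 0.5]                  # assumed source probabilities
P = [[0.8, 0.1, 0.1],             # assumed channel (transition) matrix
     [0.1, 0.1, 0.8]]
print(f"H(Y/X) = {cond_entropy_y_given_x(p_x, P):.3f} bits/message")
```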

20. A field is [02S05]: an Abelian group under addition / a group with 0 as the multiplicative identity for its members / a group with 1 as the additive identity for its members / a group with 0 as the additive inverse for its members
21. The constraint length of a convolutional encoder of code rate 1/3 is 5. If the input of the encoder is a 5-bit message sequence, the length of the output code word in bits is [03D01]
22. A communication channel is represented by its channel matrix, with rows representing the messages associated with the source and the columns representing the messages associated with the receiver, given as .... Its capacity in bits is [03D02]: log 4 / log 3 / log 12 / log 7
23. A Binary Erasure Channel has P(0/0) = P(1/1) = p and P(k/0) = P(k/1) = q. Its capacity in bits/symbol is [03M01]: p / q / pq / p/q
24. When a pair of dice is thrown, the average amount of information contained in the message "the sum of the faces is 7" in bits is [03M02]
25. A source emits messages A and B with probability 0.8 and 0.2 respectively. The redundancy provided by the optimum source coding scheme for the above source is [03M03]: 72 % / 27 % / 45 % / 55 %
26. Information content of a message [03S01]: increases with its certainty of occurrence / is independent of the certainty of occurrence / increases with its uncertainty of occurrence / is the logarithm of its certainty of occurrence
27. Under error-free reception, the syndrome vector computed for the received cyclic code word consists of [03S02]: all ones / alternate 1's and 0's starting with a 1 / alternate 0's and 1's starting with a 0 / all zeros
28. A continuous source will have maximum entropy associated if the pdf associated with its output is [03S03]: Poisson / Exponential / Rayleigh / Gaussian
29. Variable length source coding provides better coding efficiency if all the messages of the source are [03S04]: Equiprobable / with different transmission probability / discretely transmitted / continuously transmitted
30. Shannon's limit deals with [03S05]: maximum information content of a message / maximum entropy associated with a source / maximum capacity of a channel / maximum bit rate of a source
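Questions 24 and 25 are one-line calculations: the self-information of an event is log2(1/P), and the redundancy of an optimum binary code for a two-message source is 1 - H/L with L = 1 bit/message. A short check using the figures quoted in the questions:

```python
from math import log2

# Q24: P(sum of two fair dice = 7) = 6/36
p7 = 6 / 36
print(f"I('sum is 7') = {log2(1 / p7):.3f} bits")        # ~2.585 bits

# Q25: messages A/B with P = 0.8/0.2; an optimum binary code needs 1 bit/message
H = -(0.8 * log2(0.8) + 0.2 * log2(0.2))                 # ~0.722 bits/message
avg_len = 1.0
print(f"efficiency = {H / avg_len:.1%}, redundancy = {1 - H / avg_len:.1%}")
```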

31. The parity check matrix of a (6,3) systematic linear block code is .... If the syndrome vector computed for the received code word is [0 1 1], then for error correction, which of the bits of the received code word is to be complemented? [04D01]
32. A (7,4) cyclic code has a generator polynomial given as 1 + x + x³. If the error pattern is ..., the corresponding syndrome vector is [04D02]
33. The memory length of a convolutional encoder is 3. If a 5-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [04M01]
34. The code words of a systematic (6,3) linear block code are ..., 010011, 011101, ..., 101011, 110110, .... Which of the following is also a code word of the code? [04M02]
35. The syndrome S(x) of a cyclic code is given by the remainder of the division ..., where V(x) is the transmitted code polynomial, E(x) is the error polynomial and g(x) is the generator polynomial. S(x) is also equal to [04M03]: Remainder of V(x)/g(x) / Remainder of E(x)/g(x) / Remainder of [V(x).E(x)]/g(x) / Remainder of g(x)/V(x)
36. Source 1 is transmitting two messages with probabilities 0.2 and 0.8, and Source 2 is transmitting two messages with probabilities 0.5 and 0.5. Then [04S01]: Maximum uncertainty is associated with Source 1 / Maximum uncertainty is associated with Source 2 / Both sources 1 and 2 have the maximum amount of uncertainty associated / There is no uncertainty associated with either of the two sources.
37. A source X and the receiver Y are connected by a noise-free channel. Its capacity is [04S02]: Max H(X) / Max H(X/Y) / Max H(Y/X) / Max H(X,Y)
38. The entropy measure of a continuous source is a [04S03]: Relative measure / Absolute measure / Linear measure / Non-linear measure
39. Which of the following is correct? [04S04]: FEC is used for error control after the receiver makes a decision about the received bit / ARQ is used for error control after the receiver makes a decision about the received bit / FEC is used for error control when the receiver is unable to make a decision about the received bit / FEC and ARQ are not used for error correction
40. Error-free communication may be possible by [04S05]: reducing redundancy during transmission / increasing transmission power to the required level / providing redundancy during transmission / increasing the channel bandwidth
41. For the data word 1010 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x + x³, the code polynomial is [05D01] 5 1+x+x +x 4 1+x+x +x 1+x 2 +x 4 +x x+x +x
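The cyclic-code items above (syndrome as a remainder, non-systematic encoding as d(x).g(x)) can be checked with a few lines of GF(2) polynomial arithmetic. In this sketch polynomials are stored as Python integers with bit i holding the coefficient of x^i; g(x) = 1 + x + x³ is the generator quoted in the questions, while the data polynomial and the bit-to-polynomial convention are only illustrative assumptions.

```python
def gf2_mul(a, b):
    """Multiply two GF(2) polynomials (ints, bit i = coefficient of x^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def gf2_mod(a, g):
    """Remainder of a(x) divided by g(x) over GF(2)."""
    dg = g.bit_length() - 1
    while a and a.bit_length() - 1 >= dg:
        a ^= g << (a.bit_length() - 1 - dg)
    return a

g = 0b1011                 # g(x) = 1 + x + x^3
d = 0b0101                 # assumed data polynomial d(x) = 1 + x^2
c = gf2_mul(d, g)          # non-systematic code polynomial d(x).g(x)
print(f"code polynomial bits: {c:07b}")
print("syndrome of a valid code word:", gf2_mod(c, g))            # 0
print("syndrome with an error added at x^2:", gf2_mod(c ^ 0b100, g))
```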

42. The output of a continuous source is a uniform random variable in the range .... The entropy of the source in bits/sample is [05D02]
43. For the source X transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8, maximum coding efficiency can be obtained by using [05M01]: Convolutional codes / Only the Shannon-Fano method / Either of the Shannon-Fano and Huffman methods / Block codes
44. In modulo-7 addition, 6 + 1 is equal to [05M02]
45. A source is transmitting two messages A and B with probabilities 3/4 and 1/4 respectively. The coding efficiency of the first order extension of the source is [05M03]: 89 % / 77 % / 92 % / 81 %
46. The noise characteristic of a communication channel is given as .... Rows represent the source and columns represent the receiver. The channel is a [05M04]: Noise-free channel / Asymmetric channel / Symmetric channel / Deterministic channel
47. The source coding efficiency can be increased by [05S01]: using source extension / increasing the entropy of the source / decreasing the entropy of the source / using binary coding
48. The capacity of a channel with infinite bandwidth is [05S02]: infinite because of infinite bandwidth / finite because of increase in noise power / infinite because of infinite noise power / finite because of finite message word length
49. The Hamming weight of the (6,3) linear block coded word ... is [05S03]
50. The cascade of two Binary Symmetric Channels is a [05S04]: symmetric binary channel / asymmetric quaternary channel / symmetric quaternary channel / asymmetric binary channel
51. In a (6,3) systematic linear block code, the number of 6-bit code words that are not useful is [06D01]
52. The parity check matrix H of a (6,3) linear systematic block code is .... Then [06D02]: C.H^T = [1], where C is the code word of the code / if the syndrome vector S computed for the received code word is [1 1 0], the third bit of the received code word is in error / The syndrome vector S of the received code word is the same as C.H^T / The syndrome vector is S = [1 1 1] under error-free reception
53. In a Binary Symmetric Channel, a transmitted 0 is received as 0 with a probability of 1/8. Then, the transition probability of the transmitted 0 is [06M01]: 1/8 / 7/8 / 6/8 / 5/8
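A quick check on question 45: with one binary digit per symbol the efficiency of the 3/4-1/4 source is H/1, and coding pairs of symbols with a prefix code of lengths 1, 2, 3, 3 pushes it higher. This sketch assumes "first order extension" means coding symbols one at a time.

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

H = entropy([3/4, 1/4])                                   # ~0.811 bits/symbol
print(f"H = {H:.4f} bits/symbol, efficiency with 1 bit/symbol = {H:.1%}")

# Pairs of symbols coded with prefix-code lengths 1, 2, 3, 3 (e.g. 0, 10, 110, 111)
pair_probs = [9/16, 3/16, 3/16, 1/16]
bits_per_symbol = sum(p * l for p, l in zip(pair_probs, [1, 2, 3, 3])) / 2
print(f"pair coding: {bits_per_symbol:.4f} bits/symbol, "
      f"efficiency = {H / bits_per_symbol:.1%}")
```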

54. A source transmitting 'n' number of messages is connected to a noise-free channel. The capacity of the channel is [06M02]: n bits/symbol / 2ⁿ bits/symbol / log n bits/symbol / 2n bits/symbol
55. There are four binary words given as 0000, 0001, 0011, 0111. Which of these cannot be a member of the parity check matrix of a (15,11) linear block code? [06M03]: 0000 / ...
56. If X is the transmitter and Y is the receiver and if the channel is noise-free, then the mutual information I(X,Y) is equal to [06S01]: Joint entropy of the source and receiver / Entropy of the source / Conditional entropy of the receiver, given the source / Conditional entropy of the source, given the receiver
57. Which of the following is correct? [06S02]: Source coding introduces redundancy / Channel coding is an efficient way of representing the output of a source / ARQ scheme of error control is applied after the receiver makes a decision about the received bit / ARQ scheme of error control is applied when the receiver is unable to make a decision about the received bit.
58. Which of the following is an FEC scheme? [06S03]: Shannon-Fano encoding / Huffman encoding / Non-systematic cyclic codes / Duo-binary encoding
59. A discrete source X is transmitting m messages and is connected to the receiver Y through a symmetric channel. The capacity of the channel is given as [06S04]: log m - H(X/Y) bits/symbol / log m bits/symbol / log m - H(Y/X) bits/symbol / H(X) + H(Y) - H(X,Y) bits/symbol
60. If the received code word of a (6,3) linear block code is ... with an error in the ... bit, the corresponding error pattern will be [06S05]
61. For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x + x³, the code polynomial is [07D01] 5 1+x+x +x x +x +x 1+x 2 +x 4 +x x +x
62. The output of a source is band-limited to 6 kHz. It is sampled at a rate 2 kHz above the Nyquist rate. If the entropy of the source is 2 bits/sample, then the entropy of the source in bits/sec is [07D02]: 24 Kbps / 28 Kbps / 12 Kbps / 32 Kbps
63. When two fair dice are thrown simultaneously, the information content of the message "the sum of the faces is 12" in bits is [07M01]
64. The encoder of a (7,4) systematic cyclic encoder with generating polynomial g(x) = 1 + x + x³ is basically a [07M02]: 4-stage shift register / 3-stage shift register / 11-stage shift register / 22-stage shift register

65. A received code word of a (7,4) systematic cyclic code ... is corrected as .... The corresponding error pattern is [07M03]
66. The product of 5 and 6 in modulo-7 multiplication is [07S01]
67. Which of the following can be the generating polynomial for a (7,4) systematic cyclic code? [07S02] x +x+1 4 x +x x +x +1 +x x +x
68. The time domain behavior of a convolutional encoder of code rate 1/3 is defined in terms of a set of [07S03]: step responses / impulse responses / ramp responses / sinusoidal responses
69. Which of the following is correct? [07S04]: In an (n,k) block code, each code word is the cyclic shift of another codeword of the code / In an (n,k) systematic cyclic code, the sum of two code words is another code word of the code / In a convolutional encoder, the constraint length of the encoder is equal to the tail of the message sequence + 1 / Source encoding reduces the probability of transmission errors
70. A linear block code with Hamming distance 5 is a [07S05]: Single error correcting and double error detecting code / Double error detecting code / Triple error correcting code / Double error correcting code
71. The channel capacity of a BSC with transition probability 1/2 is [08D01]: 0 bits / 1 bit / 2 bits / infinity
72. White noise of PSD ... W/Hz is applied to an ideal LPF with a one-sided bandwidth of 1 Hz. The two-sided output noise power of the channel is [08D02]: four times the input PSD / thrice the input PSD / twice the input PSD / same as the input PSD
73. A convolutional encoder of code rate 1/2 is a 3-stage shift register with a message word length of 6. The code word length obtained from the encoder (in bits) is [08M01]
74. A source X with entropy 2 bits/message is connected to the receiver Y through a noise-free channel. The conditional entropy of the source, given the receiver, is H(X/Y) and the joint entropy of the source and the receiver is H(X,Y). Then [08M02]: H(X,Y) = 2 bits/message / H(X/Y) = 2 bits/message / H(X,Y) = 0 bits/message / H(X/Y) = 1 bit/message
75. A channel with independent input and output acts as a [08M03]: lossless network / resistive network / channel with maximum capacity / Gaussian channel
76. Automatic Repeat Request is a [08S01]: Source coding scheme / error correction scheme / error control scheme / data conversion scheme
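Question 71 (and the later BSC items) rest on the binary symmetric channel capacity C = 1 - Hb(p), where Hb is the binary entropy of the transition probability. A short numeric check for a few values of p, including p = 1/2:

```python
from math import log2

def binary_entropy(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a BSC with transition probability p, in bits/symbol."""
    return 1 - binary_entropy(p)

for p in (0.0, 0.1, 0.25, 0.5):
    print(f"p = {p:<4}: C = {bsc_capacity(p):.3f} bits/symbol")
```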

77. Channel coding [08S02]: avoids redundancy / reduces transmission efficiency / increases signal power level relative to channel noise / results in reduced transmission bandwidth requirement
78. The information content available with a source is referred to as [08S03]: Mutual information / Trans-information / Capacity / Entropy
79. In a linear block code [08S04]: the received power varies linearly with that of the transmitted power / the encoder satisfies the superposition principle / parity bits of the code word are the linear combination of the message bits / the communication channel is a linear system
80. In modulo-4 arithmetic, the product of 3 and 2 is [08S05]
81. For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x² + x³, the code polynomial is [09D01] 5 1+x+x +x x +x +x 1+x 2 +x 4 +x 5 1+x+x
82. In a (7,4) systematic linear block code, the number of 7-bit code words that are not useful for the user is [09D02]
83. Which of the following is a valid source coding scheme for a source transmitting four messages? [09M01]: 0, 00, 001, 110 / 1, 11, 111, 1110 / 0, 10, 110, 111 / 1, 01, 001, ...
84. A system has a bandwidth of 4 kHz and an S/N ratio of 28 at the input to the receiver. If the bandwidth of the channel is doubled, then [09M02]: Capacity of the channel gets doubled / Capacity of the channel gets squared / S/N ratio at the input of the receiver gets halved / S/N ratio at the input of the receiver gets doubled
85. The memory length of a convolutional encoder is 4. If a 5-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [09M03]
86. In modulo-5 multiplication, the product of 4 and 3 is [09S01]
87. Which of the following provides minimum redundancy in coding? [09S02]: (15,11) linear block code / Shannon-Fano encoding / (6,3) systematic cyclic code / Convolutional code
88. If C is the channel capacity, S is the signal input of the channel and ... is the input noise PSD, then which of the following is the Shannon's limit? [09S03]
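Question 84 is the Shannon-Hartley relation C = B log2(1 + S/N), with the extra observation that for fixed signal power and a flat noise PSD, doubling B halves the S/N seen at the receiver. A sketch with the numbers from the question (B = 4 kHz, S/N = 28; the white-noise assumption is mine):

```python
from math import log2

def capacity_bps(bandwidth_hz, snr):
    """Shannon-Hartley channel capacity in bits/sec."""
    return bandwidth_hz * log2(1 + snr)

B, snr = 4000.0, 28.0
print(f"original bandwidth: C = {capacity_bps(B, snr):.0f} bits/sec")
# Doubling B doubles the in-band noise power (flat PSD), so S/N is halved:
print(f"doubled bandwidth:  C = {capacity_bps(2 * B, snr / 2):.0f} bits/sec")
```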

89. A communication channel is fed with an input signal x(t) and the noise in the channel is additive. The power received at the receiver input is [09S04]: Signal power - Noise power / Signal power/Noise power / Signal power + Noise power / Signal power x Noise power
90. The fundamental limit on the average number of bits/source symbol is [09S05]: Information content of the message / Entropy of the source / Mutual information / Channel capacity
91. The parity check matrix of a (6,3) systematic linear block code is .... If the syndrome vector computed for the received code word is [0 1 0], then for error correction, which of the bits of the received code word is to be complemented? [10D01]
92. White noise of PSD η is applied to an ideal LPF with a one-sided bandwidth of B Hz. The filter provides a gain of 2. If the output power of the filter is 8η, then the value of B in Hz is [10D02]
93. The memory length of a convolutional encoder is 5. If a 6-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [10M01]
94. A source is transmitting four messages with equal probability. Then, for optimum source coding efficiency, [10M02]: necessarily, variable length coding schemes should be used / variable length coding schemes need not necessarily be used / fixed length coding schemes should not be used / convolutional codes should be used
95. Which of the following is a valid source coding scheme for a source transmitting five messages? [10M03]: 0, 00, 110, 1110, 1111 / 1, 11, 001, 0001, 0000 / 0, 10, 1110, 110, 1111 / 1, 01, 001, 0010, ...
96. In modulo-7 addition, ... is equal to [10S01]
97. Which of the following provides minimum redundancy in coding? [10S02]: (6,3) linear block code / (15,11) linear block code / (6,3) systematic cyclic code / Convolutional code
98. Which of the following involves the effect of the communication channel? [10S03]: Information content of a message / Entropy of the source / Mutual information / Information rate of the source
99. Which of the following can be the generating polynomial for a (7,4) systematic cyclic code? [10S04] 2 x +x +1 4 x +x x +x +1 +x x +x
100. Which of the following provides the facility to recognize the error at the receiver? [10S05]: Shannon-Fano encoding / ARQ / FEC / differential encoding

101. Which of the following coding schemes is linear? [11D01]: C = {00, 01, 10, 11} / C = {000, 111, 110} / C = {000, 110, 111, 001} / C = {111, 110, 011, 101}
102. If the transition probability of messages 0 and 1 in a communication system is 0.1, the noise matrix of the corresponding communication channel is [11D02]
103. In a BSC, the rate of information transmission over the channel decreases as [11M01]: transmission probability approaches 0.5 / transmission probability approaches 1 / transition probability approaches 0.5 / transition probability approaches ...
104. A source X is connected to a receiver Y through a lossless channel. Then [11M02]: H(Y/X) = 0 / H(X,Y) = 0 / H(X) = I(X,Y) / H(X/Y) = I(X,Y)
105. Which of the following is a valid source coding scheme for a source transmitting four messages? [11M03]: 0, 00, 110, 1110 / 1, 11, 001, 0001 / 0, 10, 1110, 110 / 1, 01, 001, ...
106. The Hamming distance of a triple error correcting code is [11S01]
107. A channel whose input is xi and output is yj is deterministic if [11S02]
108. If a memoryless source of information rate R is connected to a channel with a channel capacity C, then on which of the following statements is the channel coding for the output of the source based? [11S03]: R must be greater than or equal to C / R must be exactly equal to C / R must be less than or equal to C / Minimum number of bits required to encode the output of the source is its entropy
109. Which of the following is correct? [11S04]: Source coding reduces transmission efficiency / Channel coding improves transmission efficiency / Entropy of a source is a measure of uncertainty of its output / Cyclic code is an ARQ scheme of error control
110. The minimum source code word length of the message of a source is equal to [11S05]: its entropy measured in bits/sec / the channel capacity / its entropy measured in bits/message / the sampling rate required for the source
111. If the transition probability of messages 0 and 1 in a communication system is 0.2, the noise matrix of the corresponding communication channel is [12D01]
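Questions 101 and 132 ask which code is linear: a binary block code is linear exactly when it contains the all-zero word and is closed under component-wise modulo-2 addition. A small checker, run here on the (complete) candidate sets of question 132:

```python
def is_linear(code):
    """True if the binary block code (equal-length bit strings) contains the
    all-zero word and is closed under modulo-2 addition."""
    words = set(code)
    n = len(next(iter(words)))

    def xor(a, b):
        return "".join("1" if x != y else "0" for x, y in zip(a, b))

    if "0" * n not in words:
        return False
    return all(xor(a, b) in words for a in words for b in words)

# Candidate sets from question 132:
for c in [{"00", "01", "10", "11"}, {"01", "10", "11"},
          {"110", "111", "001"}, {"000", "110", "011"}]:
    print(sorted(c), "->", is_linear(c))
```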

112. In a BSC, if the transition probability of the messages 0 and 1 is P, and if they are of equal transmission probability, then the probability of these symbols appearing at the channel output is [12D02]: P, P / 1/2, 1/2 / 1, 1 / P, 1 - P
113. The number of bits to be used by the efficient source encoder to encode the output of the source is equal to [12M01]: the information rate of the source / entropy of the source / channel capacity / information content of each message
114. A source X is connected to a receiver Y through a deterministic channel. Then [12M02]: H(X/Y) = 0 / H(Y/X) = 0 / H(X) = I(X,Y) / H(X/Y) = I(X,Y)
115. Which of the following can be a valid source coding scheme for a source transmitting three messages? [12M03]: 0, 00, 110 / 1, 01, 001 / 0, 10, 101 / 1, 01, ...
116. For an (n,k) cyclic code, E(x) is the error polynomial, g(x) is the generator polynomial, R(x) is the received code polynomial and C(x) is the transmitted code polynomial. Then, the syndrome polynomial S(x) is [12S01]: Remainder of C(x)/g(x) / Remainder of E(x)/g(x) / E(x).g(x) / R(x) + g(x)
117. If ... is the input noise PSD and S is the input signal power for a communication channel of capacity C, then which of the following is Shannon's limit? [12S02]
118. The Hamming distance of an error correcting code capable of correcting 4 errors is [12S03]
119. BCH codes capable of correcting single errors are [12S04]: Systematic linear block codes / Cyclic Hamming codes / Convolutional codes / Non-systematic linear block codes
120. Which of the following provides the facility to recognize the error at the receiver? [12S05]: Shannon-Fano encoding / Parity check codes / Huffman encoding / differential encoding
121. The output of a source is a continuous random variable uniformly distributed over (0,2). The entropy of the source in bits/sample is [13D01]
122. An AWGN low-pass channel with 4 kHz bandwidth is fed with white noise of PSD = 10⁻¹² W/Hz. The two-sided noise power at the output of the channel is [13D02]: 4 nW / 2 nW / 6 nW / 8 nW

123. A system has a bandwidth of 3 kHz and an S/N ratio of 29 dB at the input of the receiver. If the bandwidth of the channel gets doubled, then [13M01]: its capacity gets doubled / the corresponding S/N ratio gets doubled / its capacity gets halved / the corresponding S/N ratio gets halved
124. A source is transmitting two symbols A and B with probabilities 7/8 and 1/8 respectively. The average source code word length can be decreased by [13M02]: reducing transmission probability / increasing transmission probability / using a noise-free channel / using pair coding
125. Non-uniqueness of Huffman encoding results in [13M03]: different coding efficiencies / different average code word lengths / different entropies / different sets of source code words
126. Shannon's limit is for [13S01]: maximum entropy of a source / maximum information rate of a source / maximum information content of a message / maximum capacity of a communication channel under infinite bandwidth
127. In modulo-6 addition, the sum of 1 and 5 is [13S02]
128. FEC and ARQ schemes of error control can be applied for the outputs of [13S03]: Binary symmetric channel only / Binary erasure channel only / Binary erasure channel and binary symmetric channel, respectively / Binary symmetric channel and binary erasure channel, respectively
129. The Hamming distance of an (n,k) systematic cyclic code is [13S04]: the weight of any non-zero code word / the weight of a code word consisting of all 1's / the weight of the code word consisting of alternate 1's and 0's / the minimum of the weights of all non-zero code words of the code
130. Which of the following is affected by the communication channel? [13S05]: Information content of a message / Entropy of the source / Information rate of the source / Mutual information
131. The maximum average amount of information content measured in bits/sec associated with the output of a discrete information source transmitting 8 messages at 2000 messages/sec is [14D01]: 6 Kbps / 3 Kbps / 16 Kbps / 4 Kbps
132. Which of the following coding schemes is linear? [14D02]: C = {00, 01, 10, 11} / C = {01, 10, 11} / C = {110, 111, 001} / C = {000, 110, 011}
133. A communication source is connected to a receiver using a communication channel such that the uncertainty about the transmitted symbol at the receiver, after knowing the received symbol, is zero. Then, the information gained by the observer at the receiver is [14M01]: same as the entropy of the source / same as the entropy of the receiver / same as the joint entropy of the source and the receiver / same as the conditional entropy of the source, given the receiver
134. X(t) and n(t) are the signal and the noise, each band-limited to 2B Hz, applied to a communication channel band-limited to B Hz. Then, the minimum number of samples/sec that should be transmitted to recover the input of the channel at its output is [14M02]: 2B / 4B / B / 6B
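Question 131 combines the two definitions used throughout this set: entropy in bits/message and information rate in bits/sec (message rate times entropy). With 8 equally likely messages at 2000 messages/sec:

```python
from math import log2

def information_rate(msgs_per_sec, probs):
    """Information rate in bits/sec = message rate x source entropy."""
    H = -sum(p * log2(p) for p in probs if p > 0)
    return msgs_per_sec * H

probs = [1 / 8] * 8                       # 8 equally likely messages
R = information_rate(2000, probs)         # 3 bits/message x 2000 messages/sec
print(f"R = {R / 1000:.0f} kbps")
```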

135. The upper limit on the minimum distance of a linear block code is [14S01]: minimum weight of the non-zero code word of the code / minimum number of errors that can be corrected / maximum number of errors that can be corrected / maximum weight of the non-zero code word of the code
136. Information rate of a source can be used to [14S02]: differentiate between two sources / find the entropy in bits/message of a source / correct the errors at the receiving side / design the matched filter for the receiver
137. If a source is transmitting only one message, then [14S03]: the reception of the message conveys zero information to the user / the reception of the message conveys maximum information to the user / the message received will be corrupted by noise / the channel capacity required is infinite
138. If C is the code word and H is the parity check matrix of an (n,k) linear block code, then [14S04]: each code word of the (n,k) code is orthogonal to the code words of its dual code / each code word of the (n,k) code is orthogonal to the code words of the same code / C.H^T = [0] / H.C^T = 0
139. Theoretically, the entropy of a continuous random variable is [14S05]: infinity / zero / unity / finite, but > 0 and < 1
140. A convolutional encoder has a constraint length of 4, and for each input bit a two-bit word is the output of the encoder. If the input message is of length 5, the exact code rate of the encoder is [15D01]: 50 % / 31.25 % / 45. % / 2. %
141. In a message conveyed through a sequence of independent dots and dashes, the probability of occurrence of a dash is one third of that of a dot. The information content of the word with two dashes in bits is [15D02]
142. The voice frequency modulating signal of a PCM system is quantized into 16 levels. If the signal is band-limited to 3 kHz, the minimum symbol rate of the system is [15M01]: 48 kilosymbols/sec / 6 kilosymbols/sec / 96 kilosymbols/sec / 3 kilosymbols/sec
143. A source is transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8. To have 100 % transmission efficiency, the average source code word length of the messages of the source should be [15M02]: 2 bits / 1.75 bits / 3 bits / 3.75 bits
144. A source is transmitting six messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32 and 1/32. Then [15M03]: Channel coding will reduce the average source code word length / Two different source code word sets can be obtained using Huffman coding / Source coding improves the error performance of the communication system / Two different source code word sets can be obtained using Shannon-Fano coding
145. The average source code word length per bit can be decreased by [15S01]: increasing the entropy of the source / extending the order of the source / using a channel with very large capacity / increasing the transmitted power
146. Trade-off between bandwidth and signal-to-noise ratio results in [15S02]: Shannon's limit / the concept of transmitting the given information using various combinations of signal power and bandwidth / a noise-free channel / minimum redundancy
147. Binary erasure channel is an example of [15S03]: Wideband channel / severe effect of channel noise on the message transmitted / narrowband channel / symmetric channel
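Question 143 (and the Huffman items around it) reduce to the average code word length of an optimum prefix code. A compact Huffman-length sketch using the standard library heap; for the probabilities 1/2, 1/4, 1/8, 1/8 it gives 1.75 bits/message, equal to the entropy, i.e. 100 % efficiency.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Code word lengths of a binary Huffman code for the given probabilities."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, ids1 = heapq.heappop(heap)
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:          # every merge adds one bit to these symbols
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

probs = [1/2, 1/4, 1/8, 1/8]
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * log2(p) for p in probs)
print(f"lengths = {lengths}, average = {avg} bits, entropy = {H} bits")
```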

148. In a symmetric channel, [15S04]: transmission errors will be less / rows and columns of the corresponding channel matrix are identical, except for permutation / required transmission power will be less / rows and columns of the corresponding channel matrix are identical on permutation basis
149. Which of the following is correct? [15S05]: Mutual information of a communication system is the same as the entropy of the source / Mutual information of a communication system is the information gained by the observer / Mutual information is independent of the channel characteristics / Mutual information of a communication system is the same as the entropy of the receiver
150. A source generates three symbols with probabilities of 0.25, 0.25 and 0.5 at a rate of 3000 symbols/sec. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of [16D01]: 6000 bps / 4500 bps / 3000 bps / 1500 bps
151. A memoryless source emits 2000 binary symbols/sec and each symbol has a probability of 0.25 to be equal to 1 and 0.75 to be equal to 0. The minimum number of bits/sec required for error-free transmission of this source is [16D02]
152. In a communication system, due to noise in the channel, an average of one symbol in each 100 received is incorrect. The symbol transmission rate is 1000. The number of bits in error in the received symbols is [16M01]
153. Which of the following is a valid source coding scheme for a source transmitting six messages? [16M02]: 0, 10, 110, 1100, 1111, ... / ..., 11, 101, 1100, 11010, ... / ..., 10, 110, 1110, 11110, ... / ..., 10, 100, 1110, 11110, ...
154. The encoder of a (15,11) systematic cyclic code requires [16M03]: a 4-bit shift register and 3 modulo-2 adders / a 4-bit shift register and 4 modulo-2 adders / a 3-bit shift register and 3 modulo-2 adders / a 3-bit shift register and 4 modulo-2 adders
155. The distance between any code word and an all-zero code word of an (n,k) linear block code is referred to as [16S01]: Hamming distance of the code / Code rate of the code / Redundancy of the code / Hamming weight of the code word
156. As per the source coding theorem, it is not possible to find any uniquely decodable code whose average length is [16S02]: less than the entropy of the source / greater than the entropy of the source / equal to the number of messages from the source / equal to the efficiency of transmission of the source
157. The coding efficiency due to second order extension of a source [16S03]: is more / is less / remains unaltered / cannot be computed
158. Exchange between bandwidth and signal-to-noise ratio can be justified based on [16S04]: Shannon's limit / Shannon's source coding theorem / Hartley-Shannon law / Shannon's channel coding theorem
159. A source X is connected to a receiver Y through a noise-free channel. Its capacity is [16S05]: Maximum of H(X/Y) / Maximum of H(X) / Maximum of H(Y/X) / Maximum of H(X,Y)

160. A zero-memory source emits two messages A and B with probabilities of 0.8 and 0.2 respectively. The entropy of the second order extension of the source is [17D01]: 1.56 bits/message / 0.72 bits/message / 0.78 bits/message / 1.44 bits/message
161. A signal amplitude X is a uniform random variable in the range (-1,1). Its differential entropy is [17D02]: 2 bits/sample / 4 bits/sample / 1 bit/sample / 3 bits/sample
162. A communication channel is so noisy that the output Y of the channel is statistically independent of the input X. Then [17M01]: H(X,Y) = H(X).H(Y) / H(X,Y) = H(X) + H(Y) / H(X/Y) = H(Y/X) / H(X) = H(Y)
163. A transmitting terminal has 128 characters and the data sent from the terminal consist of independent sequences of equiprobable characters. The entropy of the above terminal in bits/character is [17M02]
164. For a (7,4) systematic cyclic code, the generator polynomial is 1 + x + x³. Then, the syndrome vector corresponding to the error pattern ... is [17M03]
165. Which of the following is correct? [17S01]: A noise-free channel is not a deterministic channel / For a noise-free channel, H(X/Y) = 1 / The channel matrix of a noise-free channel is an identity matrix / A noise-free channel is of infinite capacity
166. In a communication system, information lost in the channel is measured using [17S02] H(X/Y) I(X,Y) H(X) H(X/Y) H(Y/X)
167. Capacity of a BSC with infinite bandwidth is not infinity, because [17S03]: Noise power in the channel and the bandwidth vary linearly / Noise power in the channel varies inversely with bandwidth / Noise power in the channel is independent of bandwidth / Noise power in the channel will not affect the signal power
168. For a noise-free channel, I(X,Y) is equal to [17S04]: entropy of the source, given the receiver / entropy of the receiver, given the source / entropy of the source / joint entropy of the source and the receiver
169. The output of a continuous source is a Gaussian random variable with variance σ² and is band-limited to fm Hz. The maximum entropy of the source is [17S05]
170. If the generator polynomial of a (7,4) non-systematic cyclic code is given as g(x) = 1 + x + x² + x³, then the binary word corresponding to x².g(x) + g(x) is [18D01]
171. A (7,4) systematic cyclic code has a generator polynomial g(x) = 1 + x + x³, and the code polynomial is V(x) = x + .... Then, the remainder of the division V(x)/g(x) is [18D02]: Zero / syndrome vector / received code word / received and corrected code word
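Two quick checks for this block: the entropy of the second-order extension of a memoryless source is exactly twice the per-symbol entropy (question 160), and the differential entropy of a uniform random variable on (a,b) is log2(b - a) (questions 121, 161 and 172). A plain-Python sketch:

```python
from math import log2
from itertools import product

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Q160: source A/B with P = 0.8/0.2; second-order extension = all ordered pairs
p = [0.8, 0.2]
pairs = [a * b for a, b in product(p, p)]
print(f"H(source) = {entropy(p):.3f}, H(2nd extension) = {entropy(pairs):.3f} bits/message")

# Q121/Q161/Q172: differential entropy of a uniform RV on (a, b) is log2(b - a)
for a, b in [(0, 1), (0, 2), (-1, 1)]:
    print(f"uniform on ({a},{b}): h = {log2(b - a):.1f} bits/sample")
```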

172. The output of a continuous source is a uniform random variable over (0,1). Then [18M01]: the absolute entropy of the source is zero / the output of the source is a Gaussian random variable / the relative entropy of the source is zero / the source is a discrete memoryless source
173. The generator sequence of an adder in a convolutional encoder is (1,1,1,1). It is its response for an input sequence of [18M02]: 0,1,0,0,... / 0,0,1,0,0,0,1,0,... / 1,0,0,0,... / 1,1,1,...
174. In a communication system, the average amounts of uncertainty associated with the source, the sink, and the source and sink jointly, in bits/message, are 1.061, 1.5 and 2.42 respectively. Then the information transferred by the channel connecting the source and sink in bits is [18M03]
175. The efficiency of transmission of information can be measured by [18S01]: comparing the entropy of the source and the maximum limit for the rate of transmission of information over the channel / comparing the ... and the maximum limit for the rate of transmission of information over the channel and the conditional entropy of the receiver, given the source / comparing the actual rate of transmission and the maximum limit for the rate of transmission of information over the channel / comparing the entropy of the source and the information content of each individual message of the source
176. Binary erasure channel is the mathematical modelling of [18S02]: the effect of channel noise resulting in an incorrect decision on the transmitted message bit / the inability of the receiver to make a decision about the received message bit in the background of noise / the error correction mechanism at the receiving side / the error detection mechanism at the receiving side
177. In which of the following matrices is the sum of each row one? [18S03]: Joint probability matrix / Channel matrix / Conditional probability of the source, given the receiver / Generator matrix
178. If T is the code vector and H is the parity check matrix of a linear block code, then the code is defined by the set of all code vectors for which [18S04]: T.H^T = 0 / H.T = 0 / H.T^T = 0 / ... = 0
179. Which of the following is correct? [18S05]: The entropy measure of a continuous source is not an absolute measure / Binary symmetric channel is a noise-free channel / The channel capacity of a symmetric channel is always 1 bit/symbol / Self information and mutual information are one and the same
180. A BSC has a transition probability of P. The cascade of two such channels is [19D01]: an asymmetric channel with transition probability 2P² / a symmetric channel with transition probability P / an asymmetric channel with transition probability P(1 - P) / a symmetric channel with transition probability 2P(1 - P)
181. A source is transmitting four messages with probabilities of 0.5, 0.25, 0.125 and 0.125. By using Huffman coding, the percentage reduction in the average source code word length is [19D02]: 10 % / 20 % / 12.5 % / 25 %
182. The parity polynomial in the generation of a systematic (7,4) cyclic code for the data word ... is 1 + x². The corresponding code word is [19M01]
183. Which of the following are prefix-free codes? [19M02]: 0, 01, 011, 0001 / 1, 01, 001, 0001 / 0, 1, 00, 11 / 00, 01, 00, 11
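Question 180: cascading two BSCs with transition probability P yields another BSC with transition probability 2P(1 - P); multiplying the two channel matrices makes this explicit. A sketch without external libraries:

```python
def bsc(p):
    """Channel matrix of a BSC with transition probability p."""
    return [[1 - p, p], [p, 1 - p]]

def matmul2x2(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p = 0.1
cascade = matmul2x2(bsc(p), bsc(p))
print("cascade matrix:", cascade)
print("2P(1 - P) =", 2 * p * (1 - p))     # equals the off-diagonal entries
```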

184. Which of the following is a single error correcting perfect code? [19M03]: (6,3) systematic cyclic code / (7,4) systematic linear block code / (15,11) Hamming code / Convolutional code of code rate 1/2
185. If X is the transmitted message and Y is the received message, then the average information content of the pair (X,Y) is equal to the average information of Y plus [19S01]: average information of X / the average information of Y after X is known / the average information of X after Y is known / mutual information of (X,Y)
186. The entropy H(X2/X1) is [19S02]: information in X2 to someone who knows X1 / information in X2 without knowing X1 / Mutual information of X1 and X2 / effect of noise in receiving X1 as X2
187. If X and Y are the transmitter and the receiver in a BSC, P(X = i / Y = j) measures [19S03]: uncertainty about the received bit based on the transmitted bit / certainty about the received bit based on the transmitted bit / certainty about the transmitted bit based on the received bit / uncertainty about the transmitted bit based on the received bit
188. If X and Y are related in a one-to-one manner, then H(X/Y) in bits is [19S04]: 1 / log m, m being the number of messages of the source
189. If the output of the channel is independent of the input, then [19S05]: maximum information is conveyed over the channel / no information is transmitted over the channel / no errors will occur during transmission / information loss is zero
190. The parity check matrix of a linear block code is .... Its Hamming distance is [20D01]
191. A source with equally likely outputs is connected to a communication channel with channel matrix .... The columns of the matrix represent the probability that a transmitted bit is identified as 0, a transmitted bit is unidentified, and a transmitted bit is identified as 1, respectively. Then, the probability that a bit is not identified is [20D02]
192. The Hamming distance of the code vectors Ci and Cj is [20M01]: weight of ... / minimum of the weights of Ci and Cj / weight of the sum of Ci and Cj / sum of the weights of Ci and Cj
193. The minimum number of parity bits required for a single error correcting linear block code for 11 data bits is [20M02]
194. A source X with a symbol rate of 1000 symbols/sec is connected to a receiver Y using a BSC with transition probability P. The messages of the source are equally likely. Then, the rate of information transmission over the channel in bits per sec is [20M03]
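For questions 192 and 193: the Hamming distance between two code vectors equals the Hamming weight of their modulo-2 sum, and a single error correcting (Hamming-type) code for k data bits needs the smallest r with 2^r >= k + r + 1. A short check; the two example words are taken from the (6,3) code words listed in question 34.

```python
def hamming_distance(a, b):
    """Distance between equal-length binary strings = weight of their mod-2 sum."""
    return sum(x != y for x, y in zip(a, b))

print("d(110110, 101011) =", hamming_distance("110110", "101011"))

# Q193: parity bits for a single error correcting code with k = 11 data bits
k = 11
r = 1
while 2 ** r < k + r + 1:
    r += 1
print(f"parity bits needed for k = {k}: r = {r}")
```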

195. Entropy coding is a [20S01]: variable length coding scheme / fixed length coding scheme / channel coding scheme / differential coding scheme
196. Which of the following is correct? [20S02]: Mutual information is symmetric about the transmitted and received pairs / Binary erasure channel is a symmetric channel / Channel matrix gives the joint probabilities of the transmitted and received pairs / Channel capacity of a noise-free channel is zero.
197. For a BSC with transition probability P, the bit error probability is [20S03]: 1 - P / P / 2P / 2(1 - P)
198. A (4,3) parity check code can [20S04]: correct all single error patterns / detect all double error patterns / detect all triple error patterns / correct all single error patterns and detect all double error patterns
199. A source of information rate 80 Kbps is connected to a communication channel of capacity 66.6 Kbps. Then [20S05]: error-free transmission is not possible / channel coding results in error-free transmission / source coding will make the errors corrected at the receiver / mutual information becomes maximum


More information

and coding (a.k.a. communication theory) Signals and functions Elementary operation of communication: send signal on

and coding (a.k.a. communication theory) Signals and functions Elementary operation of communication: send signal on Fundamentals of information transmission and coding (a.k.a. communication theory) Signals and functions Elementary operation of communication: send signal on medium from point A to point B. media copper

More information

ECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013

ECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013 ECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013 Lecture 18 Today: (1) da Silva Discussion, (2) Error Correction Coding, (3) Error Detection (CRC) HW 8 due Tue. HW 9 (on Lectures

More information

Communications Theory and Engineering

Communications Theory and Engineering Communications Theory and Engineering Master's Degree in Electronic Engineering Sapienza University of Rome A.A. 2018-2019 Channel Coding The channel encoder Source bits Channel encoder Coded bits Pulse

More information

Performance of Reed-Solomon Codes in AWGN Channel

Performance of Reed-Solomon Codes in AWGN Channel International Journal of Electronics and Communication Engineering. ISSN 0974-2166 Volume 4, Number 3 (2011), pp. 259-266 International Research Publication House http://www.irphouse.com Performance of

More information

INSTITUTE OF AERONAUTICAL ENGINEERING

INSTITUTE OF AERONAUTICAL ENGINEERING INSTITUTE OF AERONAUTICAL ENGINEERING (AUTONOMUS) Dundigal, Hyderabad - 00 0 ELECTRONICS AND COMMUNICATION ENGINEERING TUTORIAL QUESTION BANK Name : DIGITAL COMMUNICATIONS Code : AEC009 Class : B. Tech

More information

Information Theory and Huffman Coding

Information Theory and Huffman Coding Information Theory and Huffman Coding Consider a typical Digital Communication System: A/D Conversion Sampling and Quantization D/A Conversion Source Encoder Source Decoder bit stream bit stream Channel

More information

FREDRIK TUFVESSON ELECTRICAL AND INFORMATION TECHNOLOGY

FREDRIK TUFVESSON ELECTRICAL AND INFORMATION TECHNOLOGY 1 Information Transmission Chapter 5, Block codes FREDRIK TUFVESSON ELECTRICAL AND INFORMATION TECHNOLOGY 2 Methods of channel coding For channel coding (error correction) we have two main classes of codes,

More information

IMPERIAL COLLEGE of SCIENCE, TECHNOLOGY and MEDICINE, DEPARTMENT of ELECTRICAL and ELECTRONIC ENGINEERING.

IMPERIAL COLLEGE of SCIENCE, TECHNOLOGY and MEDICINE, DEPARTMENT of ELECTRICAL and ELECTRONIC ENGINEERING. IMPERIAL COLLEGE of SCIENCE, TECHNOLOGY and MEDICINE, DEPARTMENT of ELECTRICAL and ELECTRONIC ENGINEERING. COMPACT LECTURE NOTES on COMMUNICATION THEORY. Prof. Athanassios Manikas, version Spring 22 Digital

More information

Downloaded from 1

Downloaded from  1 VII SEMESTER FINAL EXAMINATION-2004 Attempt ALL questions. Q. [1] How does Digital communication System differ from Analog systems? Draw functional block diagram of DCS and explain the significance of

More information

KINGS COLLEGE OF ENGINEERING DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING QUESTION BANK

KINGS COLLEGE OF ENGINEERING DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING QUESTION BANK KINGS COLLEGE OF ENGINEERING DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING QUESTION BANK SUB.NAME : COMMUNICATION THEORY SUB.CODE: EC1252 YEAR : II SEMESTER : IV UNIT I AMPLITUDE MODULATION SYSTEMS

More information

Error Protection: Detection and Correction

Error Protection: Detection and Correction Error Protection: Detection and Correction Communication channels are subject to noise. Noise distorts analog signals. Noise can cause digital signals to be received as different values. Bits can be flipped

More information

EE303: Communication Systems

EE303: Communication Systems EE303: Communication Systems Professor A. Manikas Chair of Communications and Array Processing Imperial College London An Overview of Fundamentals: Channels, Criteria and Limits Prof. A. Manikas (Imperial

More information

MULTILEVEL CODING (MLC) with multistage decoding

MULTILEVEL CODING (MLC) with multistage decoding 350 IEEE TRANSACTIONS ON COMMUNICATIONS, VOL. 52, NO. 3, MARCH 2004 Power- and Bandwidth-Efficient Communications Using LDPC Codes Piraporn Limpaphayom, Student Member, IEEE, and Kim A. Winick, Senior

More information

Lab/Project Error Control Coding using LDPC Codes and HARQ

Lab/Project Error Control Coding using LDPC Codes and HARQ Linköping University Campus Norrköping Department of Science and Technology Erik Bergfeldt TNE066 Telecommunications Lab/Project Error Control Coding using LDPC Codes and HARQ Error control coding is an

More information

EEE 309 Communication Theory

EEE 309 Communication Theory EEE 309 Communication Theory Semester: January 2017 Dr. Md. Farhad Hossain Associate Professor Department of EEE, BUET Email: mfarhadhossain@eee.buet.ac.bd Office: ECE 331, ECE Building Types of Modulation

More information

International Journal of Engineering Research in Electronics and Communication Engineering (IJERECE) Vol 1, Issue 5, April 2015

International Journal of Engineering Research in Electronics and Communication Engineering (IJERECE) Vol 1, Issue 5, April 2015 Implementation of Error Trapping Techniqe In Cyclic Codes Using Lab VIEW [1] Aneetta Jose, [2] Hena Prince, [3] Jismy Tom, [4] Malavika S, [5] Indu Reena Varughese Electronics and Communication Dept. Amal

More information

Contents Chapter 1: Introduction... 2

Contents Chapter 1: Introduction... 2 Contents Chapter 1: Introduction... 2 1.1 Objectives... 2 1.2 Introduction... 2 Chapter 2: Principles of turbo coding... 4 2.1 The turbo encoder... 4 2.1.1 Recursive Systematic Convolutional Codes... 4

More information

Communication Theory II

Communication Theory II Communication Theory II Lecture 13: Information Theory (cont d) Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 22 th, 2015 1 o Source Code Generation Lecture Outlines Source Coding

More information

Coding Techniques and the Two-Access Channel

Coding Techniques and the Two-Access Channel Coding Techniques and the Two-Access Channel A.J. Han VINCK Institute for Experimental Mathematics, University of Duisburg-Essen, Germany email: Vinck@exp-math.uni-essen.de Abstract. We consider some examples

More information

Error Control Codes. Tarmo Anttalainen

Error Control Codes. Tarmo Anttalainen Tarmo Anttalainen email: tarmo.anttalainen@evitech.fi.. Abstract: This paper gives a brief introduction to error control coding. It introduces bloc codes, convolutional codes and trellis coded modulation

More information

photons photodetector t laser input current output current

photons photodetector t laser input current output current 6.962 Week 5 Summary: he Channel Presenter: Won S. Yoon March 8, 2 Introduction he channel was originally developed around 2 years ago as a model for an optical communication link. Since then, a rather

More information

DHANALAKSHMI SRINIVASAN COLLEGE OF ENGINEERING AND TECHNOLOGY CS6304- ANALOG AND DIGITAL COMMUNICATION BE-CSE/IT SEMESTER III REGULATION 2013 Faculty

DHANALAKSHMI SRINIVASAN COLLEGE OF ENGINEERING AND TECHNOLOGY CS6304- ANALOG AND DIGITAL COMMUNICATION BE-CSE/IT SEMESTER III REGULATION 2013 Faculty DHANALAKSHMI SRINIVASAN COLLEGE OF ENGINEERING AND TECHNOLOGY CS6304- ANALOG AND DIGITAL COMMUNICATION BE-CSE/IT SEMESTER III REGULATION 2013 Faculty Name: S.Kalpana, AP/ECE QUESTION BANK UNIT I ANALOG

More information

Volume 2, Issue 9, September 2014 International Journal of Advance Research in Computer Science and Management Studies

Volume 2, Issue 9, September 2014 International Journal of Advance Research in Computer Science and Management Studies Volume 2, Issue 9, September 2014 International Journal of Advance Research in Computer Science and Management Studies Research Article / Survey Paper / Case Study Available online at: www.ijarcsms.com

More information

ELEC3028 (EL334) Digital Transmission

ELEC3028 (EL334) Digital Transmission ELEC3028 (EL334) Digital Transmission Half of the unit: Information Theory MODEM (modulator and demodulator) Professor Sheng Chen: Building 53, Room 4005 E-mail: sqc@ecs.soton.ac.uk Lecture notes from:

More information

Digital Communication Systems ECS 452

Digital Communication Systems ECS 452 Digital Communication Systems ECS 452 Asst. Prof. Dr. Prapun Suksompong prapun@siit.tu.ac.th 2. Source Coding 1 Office Hours: BKD, 6th floor of Sirindhralai building Monday 10:00-10:40 Tuesday 12:00-12:40

More information

ECE 476/ECE 501C/CS Wireless Communication Systems Winter Lecture 9: Error Control Coding

ECE 476/ECE 501C/CS Wireless Communication Systems Winter Lecture 9: Error Control Coding ECE 476/ECE 501C/CS 513 - Wireless Communication Systems Winter 2005 Lecture 9: Error Control Coding Chapter 8 Coding and Error Control From: Wireless Communications and Networks by William Stallings,

More information

QUESTION BANK EC 1351 DIGITAL COMMUNICATION YEAR / SEM : III / VI UNIT I- PULSE MODULATION PART-A (2 Marks) 1. What is the purpose of sample and hold

QUESTION BANK EC 1351 DIGITAL COMMUNICATION YEAR / SEM : III / VI UNIT I- PULSE MODULATION PART-A (2 Marks) 1. What is the purpose of sample and hold QUESTION BANK EC 1351 DIGITAL COMMUNICATION YEAR / SEM : III / VI UNIT I- PULSE MODULATION PART-A (2 Marks) 1. What is the purpose of sample and hold circuit 2. What is the difference between natural sampling

More information

Hamming Codes as Error-Reducing Codes

Hamming Codes as Error-Reducing Codes Hamming Codes as Error-Reducing Codes William Rurik Arya Mazumdar Abstract Hamming codes are the first nontrivial family of error-correcting codes that can correct one error in a block of binary symbols.

More information

Time division multiplexing The block diagram for TDM is illustrated as shown in the figure

Time division multiplexing The block diagram for TDM is illustrated as shown in the figure CHAPTER 2 Syllabus: 1) Pulse amplitude modulation 2) TDM 3) Wave form coding techniques 4) PCM 5) Quantization noise and SNR 6) Robust quantization Pulse amplitude modulation In pulse amplitude modulation,

More information

Performance comparison of convolutional and block turbo codes

Performance comparison of convolutional and block turbo codes Performance comparison of convolutional and block turbo codes K. Ramasamy 1a), Mohammad Umar Siddiqi 2, Mohamad Yusoff Alias 1, and A. Arunagiri 1 1 Faculty of Engineering, Multimedia University, 63100,

More information

DEGRADED broadcast channels were first studied by

DEGRADED broadcast channels were first studied by 4296 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 54, NO 9, SEPTEMBER 2008 Optimal Transmission Strategy Explicit Capacity Region for Broadcast Z Channels Bike Xie, Student Member, IEEE, Miguel Griot,

More information

EEE 309 Communication Theory

EEE 309 Communication Theory EEE 309 Communication Theory Semester: January 2016 Dr. Md. Farhad Hossain Associate Professor Department of EEE, BUET Email: mfarhadhossain@eee.buet.ac.bd Office: ECE 331, ECE Building Part 05 Pulse Code

More information

Level 6 Graduate Diploma in Engineering Communication systems

Level 6 Graduate Diploma in Engineering Communication systems 9210-118 Level 6 Graduate Diploma in Engineering Communication systems Sample Paper You should have the following for this examination one answer book non-programmable calculator pen, pencil, ruler, drawing

More information

Rab Nawaz. Prof. Zhang Wenyi

Rab Nawaz. Prof. Zhang Wenyi Rab Nawaz PhD Scholar (BL16006002) School of Information Science and Technology University of Science and Technology of China, Hefei Email: rabnawaz@mail.ustc.edu.cn Submitted to Prof. Zhang Wenyi wenyizha@ustc.edu.cn

More information

Chapter 10 Error Detection and Correction 10.1

Chapter 10 Error Detection and Correction 10.1 Data communication and networking fourth Edition by Behrouz A. Forouzan Chapter 10 Error Detection and Correction 10.1 Note Data can be corrupted during transmission. Some applications require that errors

More information

Problem Sheets: Communication Systems

Problem Sheets: Communication Systems Problem Sheets: Communication Systems Professor A. Manikas Chair of Communications and Array Processing Department of Electrical & Electronic Engineering Imperial College London v.11 1 Topic: Introductory

More information

QUESTION BANK. Staff In-Charge: M.MAHARAJA, AP / ECE

QUESTION BANK. Staff In-Charge: M.MAHARAJA, AP / ECE FATIMA MICHAEL COLLEGE OF ENGINEERING & TECHNOLOGY Senkottai Village, Madurai Sivagangai Main Road, Madurai -625 020 An ISO 9001:2008 Certified Institution QUESTION BANK Sub. Code : EC 2301 Class : III

More information

Lecture #2. EE 471C / EE 381K-17 Wireless Communication Lab. Professor Robert W. Heath Jr.

Lecture #2. EE 471C / EE 381K-17 Wireless Communication Lab. Professor Robert W. Heath Jr. Lecture #2 EE 471C / EE 381K-17 Wireless Communication Lab Professor Robert W. Heath Jr. Preview of today s lecture u Introduction to digital communication u Components of a digital communication system

More information

VALLIAMMAI ENGINEERING COLLEGE

VALLIAMMAI ENGINEERING COLLEGE VALLIAMMAI ENGINEERING COLLEGE SRM Nagar, Kattankulathur 603 203. DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING QUESTION BANK SUBJECT : EC6402 COMMUNICATION THEORY SEM / YEAR: IV / II year B.E.

More information

Convolutional Coding Using Booth Algorithm For Application in Wireless Communication

Convolutional Coding Using Booth Algorithm For Application in Wireless Communication Available online at www.interscience.in Convolutional Coding Using Booth Algorithm For Application in Wireless Communication Sishir Kalita, Parismita Gogoi & Kandarpa Kumar Sarma Department of Electronics

More information

Fundamentals of Wireless Communication

Fundamentals of Wireless Communication Communication Technology Laboratory Prof. Dr. H. Bölcskei Sternwartstrasse 7 CH-8092 Zürich Fundamentals of Wireless Communication Homework 5 Solutions Problem 1 Simulation of Error Probability When implementing

More information

Digital Transmission using SECC Spring 2010 Lecture #7. (n,k,d) Systematic Block Codes. How many parity bits to use?

Digital Transmission using SECC Spring 2010 Lecture #7. (n,k,d) Systematic Block Codes. How many parity bits to use? Digital Transmission using SECC 6.02 Spring 2010 Lecture #7 How many parity bits? Dealing with burst errors Reed-Solomon codes message Compute Checksum # message chk Partition Apply SECC Transmit errors

More information

Lecture 3: Data Transmission

Lecture 3: Data Transmission Lecture 3: Data Transmission 1 st semester 1439-2017 1 By: Elham Sunbu OUTLINE Data Transmission DATA RATE LIMITS Transmission Impairments Examples DATA TRANSMISSION The successful transmission of data

More information

Basic Concepts in Data Transmission

Basic Concepts in Data Transmission Basic Concepts in Data Transmission EE450: Introduction to Computer Networks Professor A. Zahid A.Zahid-EE450 1 Data and Signals Data is an entity that convey information Analog Continuous values within

More information

Digital Communication Systems ECS 452

Digital Communication Systems ECS 452 Digital Communication Systems ECS 452 Asst. Prof. Dr. Prapun Suksompong prapun@siit.tu.ac.th 5. Channel Coding 1 Office Hours: BKD, 6th floor of Sirindhralai building Tuesday 14:20-15:20 Wednesday 14:20-15:20

More information

Summary of Basic Concepts

Summary of Basic Concepts Transmission Summary of Basic Concepts Sender Channel Receiver Dr. Christian Rohner Encoding Modulation Demodulation Decoding Bits Symbols Noise Terminology Communications Research Group Bandwidth [Hz]

More information

Theory of Telecommunications Networks

Theory of Telecommunications Networks Theory of Telecommunications Networks Anton Čižmár Ján Papaj Department of electronics and multimedia telecommunications CONTENTS Preface... 5 1 Introduction... 6 1.1 Mathematical models for communication

More information

Theory of Telecommunications Networks

Theory of Telecommunications Networks Theory of Telecommunications Networks Anton Čižmár Ján Papaj Department of electronics and multimedia telecommunications CONTENTS Preface... 5 1 Introduction... 6 1.1 Mathematical models for communication

More information

Error-Correcting Codes

Error-Correcting Codes Error-Correcting Codes Information is stored and exchanged in the form of streams of characters from some alphabet. An alphabet is a finite set of symbols, such as the lower-case Roman alphabet {a,b,c,,z}.

More information

Lecture 3 Data Link Layer - Digital Data Communication Techniques

Lecture 3 Data Link Layer - Digital Data Communication Techniques DATA AND COMPUTER COMMUNICATIONS Lecture 3 Data Link Layer - Digital Data Communication Techniques Mei Yang Based on Lecture slides by William Stallings 1 ASYNCHRONOUS AND SYNCHRONOUS TRANSMISSION timing

More information