Communications Overhead as the Cost of Constraints
J. Nicholas Laneman and Brian P. Dunn
Department of Electrical Engineering, University of Notre Dame

Abstract—This paper speculates on a perspective for studying overhead in communication systems that contrasts with the traditional viewpoint that overhead is the non-data portion of transmissions. By viewing overhead as the cost of constraints imposed on a system, information-theoretic techniques can be used to obtain fundamental limits on overhead information, and multiple constraints lead to an intriguing chain rule for overhead. In principle, protocol overhead in practical implementations can then be benchmarked against these fundamental limits in order to identify opportunities for improvement. Several examples are discussed within this developing framework.

I. INTRODUCTION

Because overhead can reduce the efficiency of a protocol, it is often considered a cost on the system. But it is rarely the case that the efficiency of a protocol can be improved simply by replacing overhead bits with data bits. Consider a protocol that encodes a user's messages onto n-bit packets for transmission over a noisy channel. Some of the drawbacks associated with considering overhead to be the non-data portion of a packet are:

- Non-data bits may be explicitly required to decode data bits. For example, a portion of the n-bit packet may be used to specify the rate of a forward error correction (FEC) code used to encode the user's message onto the remaining bits. Without knowledge of the FEC code's rate, the decoder does not know the size of the message that was sent, and the message cannot be decoded.
- Replacing non-data bits with additional data bits may increase the probability of error. For example, if a systematic FEC code is used to transmit k data bits, replacing the error-control bits with additional data bits will increase the probability of error for the original k data bits.
- It can be meaningless to explicitly distinguish data and non-data bits. For example, many systematic FEC codes have non-systematic equivalents that provide identical error-control performance; from the non-data viewpoint, the parity bits of the systematic code would be considered overhead, but it is less natural to define overhead from this perspective for the non-systematic code.

The conclusion we draw from these observations is that defining what portion of a protocol is data versus what portion is overhead may not be the most relevant distinction. If the purpose of distinguishing overhead from data is to understand what gains may be realized by an improved protocol with a lower overhead cost, it seems preferable to define overhead explicitly as such. Accordingly, we consider overhead cost to be the reduction in system performance that results from a constraint on the design of a protocol or system. This paper establishes an operational definition for the overhead cost of a system constraint as the difference between baseline system performance and constrained system performance. This perspective was initially developed in [1], and was inspired by Gallager's treatment of protocol information in [2]. Loosely speaking, Gallager's approach in [2] involves applying a constraint to a source coding problem and identifying the resulting increase in rate as the protocol overhead for the system. In a channel coding problem, we might expect that additional constraints would decrease the rate of communication, and it could be natural to identify this decrease in rate as the protocol overhead. These observations suggest a broader theme for communications overhead that we attempt to develop in the sequel.

The remainder of this paper is organized as follows. Section II defines block codes for a communication channel and exemplifies a number of constraints on encoders and decoders.
Section III recalls the definition of channel capacity and defines the overhead cost of a constraint in terms of channel capacity. Section IV provides a few computed examples and illustrations.

II. CODES AND CONSTRAINTS

In this section, we establish notation for channels, codes, and a variety of constraints that we explore in later sections. Consider a channel modeled by the sequence of conditional distributions p_{Y^n|X^n}(y^n|x^n) on the inputs X^n ∈ 𝒳^n and outputs Y^n ∈ 𝒴^n, n = 1, 2, .... For integers M, N > 0, an (M, N) code consists of

- a message set W := {1, 2, ..., M}
- an encoder f : W → 𝒳^N
- a decoder g : 𝒴^N → W

The rate of an (M, N) code is R := log2(M)/N bits per channel use. The average probability of error for an (M, N) code over the channel p_{Y^N|X^N}(y^N|x^N), with message W ∈ W distributed according to p_W(w), is Pr[g(Y^N) ≠ W], computed over the joint distribution p_W(w) p_{X^N|W}(x^N|w) p_{Y^N|X^N}(y^N|x^N), with p_{X^N|W}(x^N|w) = 1 if x^N = f(w) and 0 otherwise.
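To make these definitions concrete, the following sketch implements a toy (M, N) code in Python. The codebook and the minimum-distance decoder are illustrative choices of our own, not taken from the paper.

```python
import math

# A toy (M, N) = (4, 4) code: 2-bit messages mapped to 4-bit codewords.
# The message set is W = {1, 2, ..., M}; here each message indexes a codeword.
codebook = {
    1: (0, 0, 0, 0),
    2: (0, 1, 0, 1),
    3: (1, 0, 1, 0),
    4: (1, 1, 1, 1),
}

M = len(codebook)      # number of messages
N = len(codebook[1])   # block length
R = math.log2(M) / N   # rate in bits per channel use

def f(w):
    """Encoder f: W -> X^N."""
    return codebook[w]

def g(y):
    """Decoder g: Y^N -> W, here by minimum Hamming distance."""
    return min(codebook, key=lambda w: sum(a != b for a, b in zip(codebook[w], y)))

print(R)        # 0.5 bits per channel use
print(g(f(3)))  # 3: decoding an uncorrupted codeword recovers the message
```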
For an (M, N) code with w ∈ W, let x^N(w) = f(w) denote the vector of outputs of the encoder, and let x_k(w) denote the k-th element of x^N(w), k = 1, 2, ..., N. We will use f(w), x^N(w), and x_k(w) in different contexts to refer to the encoder. Let Γ(M, N) denote the set of all (M, N) codes. In the sequel, we will constrain this set of codes in various ways. If the indexes are clear from the context, we drop them and simply denote this set by Γ.

A. Encoding Constraints

Encoding constraints are common in communication systems. Several examples that we discuss in some detail include:

- An input constraint for a subset S ⊆ 𝒳, denoted Γ_S, restricts the encoder so that f : W → S^N.
- An average power constraint, denoted Γ_P, restricts the encoder so that (1/N) Σ_{k=1}^{N} |x_k(w)|² ≤ P for each w ∈ W.
- A repetition coding constraint of order L, where L > 1 is an integer, denoted Γ_RE,L. Repetition coding restricts an encoder so that x_{kL+l+1}(w) = x_{kL+1}(w) for l = 0, 1, ..., L−1, k = 0, 1, 2, ..., and all w ∈ W. That is, symbols occur in runs of length L in the output of the encoder.

More generally, we can consider linear block codes, convolutional codes, and concatenated codes as imposing constraints on the encoder of an (M, N) code. Such involved examples are of course important, but beyond the scope of this paper.

B. Decoding Constraints

Decoding constraints are also common, though we tend to emphasize them less explicitly than encoding constraints. Several examples that we discuss in detail include:

- Maximum a posteriori (MAP) decoding, denoted Γ_MAP, restricts the decoder to the form g_MAP(y^N) = argmax_{w ∈ W} p_{W|Y^N}(w|y^N), which depends upon an a priori distribution p_W(w) on the encoded message, the encoder f, and the channel law p_{Y^N|X^N}(y^N|x^N). It is well known that MAP decoding minimizes the average probability of error.
- Maximum likelihood (ML) decoding, denoted Γ_ML, restricts the decoder to the form g_ML(y^N) = argmax_{w ∈ W} p_{Y^N|X^N}(y^N|f(w)), which depends upon the encoder f and the channel law p_{Y^N|X^N}(y^N|x^N).
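The encoding constraints above are easy to express as predicates on codewords. A small illustrative sketch follows; the helper names are our own, not from the paper.

```python
# Checking whether a codeword satisfies two of the encoding constraints above.

def satisfies_power_constraint(codeword, P):
    """Average power constraint Gamma_P: (1/N) * sum |x_k|^2 <= P."""
    N = len(codeword)
    return sum(x * x for x in codeword) / N <= P

def satisfies_repetition_constraint(codeword, L):
    """Repetition constraint Gamma_RE,L: symbols occur in runs of length L,
    i.e., x_{kL+l+1} = x_{kL+1} for l = 0, ..., L-1."""
    if len(codeword) % L != 0:
        return False
    # codeword[k - k % L] is the first symbol of the run containing position k.
    return all(codeword[k] == codeword[k - k % L] for k in range(len(codeword)))

x = (+1.0, +1.0, -1.0, -1.0)                 # runs of length 2, unit average power
print(satisfies_power_constraint(x, 1.0))    # True
print(satisfies_repetition_constraint(x, 2)) # True
print(satisfies_repetition_constraint(x, 4)) # False: not a single run of length 4
```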
It is well known that ML decoding corresponds to MAP decoding if p_W(w) is a uniform distribution; therefore, ML decoding minimizes the average probability of error in this case.

- Joint typicality (JT) decoding, denoted Γ_JT, restricts the decoder to the form g_JT(y^N) = min{w ∈ W : (f(w), y^N) ∈ T(X^N, Y^N)}, where T(X^N, Y^N) is the jointly typical set for the joint distribution p_{X^N}(x^N) p_{Y^N|X^N}(y^N|x^N) [3]. Here the distribution p_{X^N}(x^N) is arbitrary, at least in principle, and the decoder depends upon it, the encoder f, and the channel law p_{Y^N|X^N}(y^N|x^N).
- Hard-decision decoding (HDD), denoted Γ_HDD, corresponds to marginally estimating X_k as X̂_k from Y_k, i.e., symbol-by-symbol demodulation, and then applying some form of decoding to X̂^N to detect W.

C. Compatible Constraints

From our discussion of encoding and decoding constraints above, it should be clear that we can consider multiple constraints to restrict the class of (M, N) codes. However, it makes sense to ensure that multiple constraints are compatible. This motivates the following definition.

Definition 1: Two constraints Γ₁ and Γ₂ are compatible if the set of (M, N) codes satisfying both constraints is nonempty, i.e., Γ₁ ∩ Γ₂ ≠ ∅.

Unless we state otherwise, two or more constraints imposed on the same code are assumed to be compatible.

III. CAPACITY AND OVERHEAD COST

In this section, we define a notion of overhead cost for a code constraint Γ′ ⊆ Γ. We formulate this definition relative to channel capacity, but emphasize that overhead cost could be formulated in terms of other (fundamental) performance metrics.

Definition 2: A rate R is achievable subject to constraint Γ′ ⊆ Γ if there exists a sequence of (2^{NR}, N) codes satisfying constraint Γ′ with average error probability tending to zero as N tends to infinity.

Definition 3: The channel capacity subject to constraint Γ′ ⊆ Γ, denoted C_{Γ′}, is the supremum of the rates achievable subject to constraint Γ′.
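For a BSC, the ML rule reduces to minimum Hamming-distance decoding, since the likelihood p^d (1−p)^{N−d} is decreasing in the distance d when p < 1/2. A brief sketch with a toy two-codeword codebook (our own example, not from the paper):

```python
# ML decoding over a BSC with crossover probability p.
# The likelihood of receiving y given codeword x is p^d * (1-p)^(N-d),
# where d is the Hamming distance between x and y.

codebook = {1: (0, 0, 0, 0), 2: (1, 1, 1, 1)}
p = 0.1

def likelihood(x, y):
    d = sum(a != b for a, b in zip(x, y))
    return (p ** d) * ((1 - p) ** (len(y) - d))

def g_ml(y):
    """ML decoder: maximize the channel likelihood over messages."""
    return max(codebook, key=lambda w: likelihood(codebook[w], y))

def g_min_dist(y):
    """Minimum Hamming-distance decoder."""
    return min(codebook, key=lambda w: sum(a != b for a, b in zip(codebook[w], y)))

y = (1, 0, 1, 1)                 # one bit flipped from codeword 2
print(g_ml(y))                   # 2
print(g_ml(y) == g_min_dist(y))  # True: ML = minimum distance on the BSC
```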
For Γ′ = Γ, achievability subject to Γ′ and C_{Γ′} correspond to the conventional definitions of achievability and channel capacity, respectively.

Definition 4: The overhead cost of constraint Γ′ ⊆ Γ, denoted O_{Γ′}, is defined as O_{Γ′} := C_Γ − C_{Γ′}.

Clearly, we are treating overhead cost as a rate of information, which may not be appropriate in all settings. This treatment works if the performance metric is channel capacity, or ε-capacity for a given ε > 0 [4], [5]. We stress that we have defined overhead cost in terms of the operational definition of channel capacity. This is important because, depending upon the complexity of the constraint, we may have different representations of the information capacity of the channel. For example, a general formula for channel capacity subject to general constraints is given in [4], [5]. An additive constraint over a memoryless channel leads to a single-letter expression for the channel capacity [4], [5].
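As a quick sanity check on Definition 4, consider the noiseless binary channel discussed in the next section: under a repetition constraint of order L, only every L-th input symbol is freely chosen, so C_{Γ_RE,L} = 1/L and the overhead cost is (L−1)/L bits per channel use. A minimal numerical sketch:

```python
import math

# Overhead cost (Definition 4) worked out for a noiseless binary channel:
# C = log2|X| = 1 bit per channel use, and under repetition of order L
# only a fraction 1/L of the symbols carry a free choice, so C_RE,L = C/L.

X_size = 2
C = math.log2(X_size)

for L in (2, 3, 4):
    C_constrained = C / L
    overhead = C - C_constrained  # O_Gamma = C - C_Gamma
    assert math.isclose(overhead, (L - 1) / L * math.log2(X_size))
    print(L, overhead)
```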
A. Interpretation of Coding Constraints as Overhead Cost

Among the example constraints we have mentioned, it is perhaps easiest to interpret repetition coding Γ_RE,L as inducing overhead cost. On a discrete noiseless channel, i.e., Y^n = X^n, repetition coding of order L > 1 has overhead cost ((L−1)/L) log2 |𝒳|. We will show an example in Section IV for which O_{Γ_RE,L} varies between 0 and this maximal value as a function of the channel parameters.

Consider the AWGN channel with 𝒳 = ℝ, average power constraint Γ_P, and input constraint S = {+√P, −√P}. It is less conventional to interpret the input constraint as inducing overhead cost, but fundamentally there is no difference between this constraint and the type of constraint introduced for repetition or other codes.

For Γ′ ⊆ Γ corresponding to any of the decoding constraints described in Section II-B, if O_{Γ′} > 0, then we can interpret this overhead as additional redundancy in the encoding required to ensure reliable decoding by the constrained decoder Γ′. It is well known, however, that both maximum-likelihood and joint-typicality decoding achieve the capacity [6], [3], so that O_{Γ_ML} = O_{Γ_JT} = 0. We will provide an example in Section IV for which O_{Γ_HDD} > 0.

B. Overhead is Additive

There are many situations in which we may want to impose more than one constraint on a code.

Definition 5: The overhead cost of constraint Γ₂ ⊆ Γ relative to constraint Γ₁ ⊆ Γ, denoted O_{Γ₂|Γ₁}, is defined as O_{Γ₂|Γ₁} := C_{Γ₁} − C_{Γ₁∩Γ₂}.

If Γ₁ ⊆ Γ₂, then O_{Γ₂|Γ₁} = 0, and, in particular, O_{Γ′|Γ′} = 0. On the other hand, if Γ₂ ⊆ Γ₁, then O_{Γ₂|Γ₁} = C_{Γ₁} − C_{Γ₂}, and, in particular, O_{Γ′|Γ} = O_{Γ′}.

With these definitions, we have a notion of additivity for overhead and relative overhead. Specifically, we have the following chain rule.

Proposition 1 (Chain Rule for Overhead): O_{Γ₁∩Γ₂} = O_{Γ₁} + O_{Γ₂|Γ₁} = O_{Γ₂} + O_{Γ₁|Γ₂}.

Proof: From the definition of overhead cost,

O_{Γ₁∩Γ₂} = C_Γ − C_{Γ₁∩Γ₂} = (C_Γ − C_{Γ₁}) + (C_{Γ₁} − C_{Γ₁∩Γ₂}) = O_{Γ₁} + O_{Γ₂|Γ₁}.

Adding and subtracting C_{Γ₂} instead of C_{Γ₁} yields the other direction.

Fig. 1. The binary symmetric channel (BSC) with crossover probability p.

Fig. 2. Illustration of two uses of a binary symmetric channel (BSC) with crossover probability p under repetition coding of order L = 2 as a single use of the binary symmetric erasure channel (BSEC).

IV. EXAMPLES

In this section we give a few simple examples to illustrate how the conceptual framework established in the previous section can be used to compute overhead cost and to highlight how it differs from the non-data viewpoint on overhead.

A. Binary Symmetric Channel with Repetition Coding of Order L = 2

Consider communication over the binary symmetric channel (BSC), with crossover probability p, shown in Figure 1. The capacity of the BSC is given by

C_BSC(p) = 1 − h(p),   (1)

where h(p) := −p log2 p − (1−p) log2(1−p) denotes the binary entropy function. In order to model redundancy that has been added by a higher layer, assume that the encoder must operate subject to a repetition coding constraint in which each pair of consecutive inputs to the channel are two identical symbols, i.e., Γ_RE,2. Two uses of the BSC with the same input symbol are equivalent to a single use of the binary symmetric erasure channel (BSEC) shown in Figure 2. By symmetry of the BSEC, a uniform input distribution is optimal and the constrained capacity can be computed as

C_{Γ_RE,2}(p) = (1/2) [ h^(3)(1/2 − p(1−p), 1/2 − p(1−p)) − h^(3)(p², (1−p)²) ],
Fig. 3. Illustration of the overhead cost of a repetition coding constraint for the binary symmetric channel with crossover probability p. From the non-data viewpoint, the overhead cost is 0.5 bits per channel use for all p.

Fig. 4. Overhead costs of BPSK and BPSK with HDD for the AWGN channel versus P/σ² (dB).

where

h^(3)(p₁, p₂) := −p₁ log2 p₁ − p₂ log2 p₂ − (1 − p₁ − p₂) log2(1 − p₁ − p₂).

The overhead cost of a repetition coding constraint for the BSC as a function of the crossover probability p is therefore given by

O_{Γ_RE,2}(p) = 1 − h(p) − (1/2) [ h^(3)(1/2 − p(1−p), 1/2 − p(1−p)) − h^(3)(p², (1−p)²) ].

The baseline performance C_Γ(p) = C_BSC(p) and the constrained performance C_{Γ_RE,2}(p) are shown in Figure 3. The overhead cost O_{Γ_RE,2}(p) between them tends to zero as p approaches 0.5, which is in contrast to the fixed overhead cost of 0.5 bits per channel use for all p under the non-data viewpoint.

B. Additive White Gaussian Noise Channel with Multiple Constraints

Consider communication over the additive white Gaussian noise (AWGN) channel Y_k = X_k + Z_k, where Z_k is iid Gaussian N(0, σ²). Since capacity is infinite without constraints, we start with an average power constraint Γ_P and the corresponding capacity

C_AWGN = (1/2) log2(1 + P/σ²)

as our baseline. It is relatively easy to treat a repetition coding constraint of order L on the AWGN channel, because an optimal receiver can simply average the L received values Y_{kL+1}, Y_{kL+2}, ..., Y_{kL+L} for each distinct input symbol x_{kL+1}(W), k = 0, 1, .... Producing this sufficient statistic (equivalently, the sum of the L received values) yields an equivalent channel of the form Ỹ_k = L X_k + Z̃_k over a fraction 1/L of the channel uses, with X_k still subject to average power constraint P, and with Z̃_k iid Gaussian N(0, Lσ²).
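The BSC repetition example above can be checked numerically. The following sketch evaluates C_BSC, the constrained capacity C_{Γ_RE,2}, and their difference at the two extreme crossover probabilities:

```python
import math

def h(p):
    """Binary entropy function."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h3(p1, p2):
    """Ternary entropy of the distribution (p1, p2, 1 - p1 - p2)."""
    return sum(-q * math.log2(q) for q in (p1, p2, 1 - p1 - p2) if q > 0)

def C_bsc(p):
    return 1 - h(p)

def C_re2(p):
    """Constrained capacity under repetition of order 2 (BSEC equivalent)."""
    a = 0.5 - p * (1 - p)
    return 0.5 * (h3(a, a) - h3(p * p, (1 - p) ** 2))

def overhead(p):
    return C_bsc(p) - C_re2(p)

print(round(overhead(0.0), 6))  # 0.5: matches (L-1)/L * log2|X| for L = 2
print(round(overhead(0.5), 6))  # 0.0: the constraint costs nothing at p = 1/2
```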
Thus,

C_{Γ_RE,L} = (1/(2L)) log2(1 + L P/σ²).

The overhead O_{Γ_RE,L} = C_AWGN − C_{Γ_RE,L} on the AWGN channel behaves similarly to the BSC case discussed earlier: the overhead tends to zero as P/σ² tends to zero, and increases as P/σ² increases.

An easy pair of constraints to treat on the AWGN channel is the combination of BPSK signaling and HDD, i.e., input constraint Γ_{+√P,−√P} and decoder constraint Γ_HDD. BPSK and HDD convert the AWGN channel into a BSC with crossover probability p = Q(√P/σ), where

Q(x) := ∫_x^{+∞} (1/√(2π)) e^{−t²/2} dt.

Thus,

C_BPSK,HDD = 1 − h(Q(√P/σ)).

Finally, imposing only the BPSK input constraint Γ_{+√P,−√P} leads to the capacity

C_BPSK = (1/2) R(+√P/σ) + (1/2) R(−√P/σ),

where

R(x) := 1 − ∫_{−∞}^{+∞} (1/√(2π)) e^{−(t−x)²/2} log2(1 + e^{−2xt}) dt.

Figure 4 shows the AWGN channel capacity, the BPSK constrained capacity, and the BPSK and HDD constrained capacity. Arrows indicate the BPSK overhead cost O_BPSK = C_AWGN − C_BPSK and the overhead cost of HDD relative to BPSK, O_HDD|BPSK = C_BPSK − C_BPSK,HDD. It is interesting to note that below a certain threshold in P/σ², O_BPSK is the smaller of the two costs, and above that threshold O_HDD|BPSK is the smaller.
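The AWGN example and the chain rule of Proposition 1 can likewise be illustrated numerically. In the sketch below, C_BPSK is evaluated by simple Riemann quadrature of the integral above; the step count and integration limits are our own choices.

```python
import math

# Numerical illustration of the AWGN example and the chain rule for overhead
# (Proposition 1). snr denotes P/sigma^2.

def Q(x):
    """Gaussian tail probability."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def h(p):
    """Binary entropy function."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def C_awgn(snr):
    return 0.5 * math.log2(1 + snr)

def C_bpsk(snr, steps=4000):
    """C_BPSK via midpoint quadrature of
    1 - (1/sqrt(2*pi)) * integral e^{-(t-x)^2/2} log2(1 + e^{-2xt}) dt, x = sqrt(snr)."""
    x = math.sqrt(snr)
    lo, hi = x - 12.0, x + 12.0
    dt = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        t = lo + (i + 0.5) * dt
        total += math.exp(-((t - x) ** 2) / 2) * math.log2(1 + math.exp(-2 * x * t)) * dt
    return 1 - total / math.sqrt(2 * math.pi)

snr = 1.0                                 # i.e., P/sigma^2 = 0 dB
C_hdd = 1 - h(Q(math.sqrt(snr)))          # C_BPSK,HDD: BSC with p = Q(sqrt(P)/sigma)
O_bpsk = C_awgn(snr) - C_bpsk(snr)        # overhead of the BPSK input constraint
O_hdd_given_bpsk = C_bpsk(snr) - C_hdd    # overhead of HDD relative to BPSK
O_total = C_awgn(snr) - C_hdd             # overhead of both constraints together

# Chain rule: O_{BPSK and HDD} = O_BPSK + O_{HDD | BPSK}
print(abs(O_total - (O_bpsk + O_hdd_given_bpsk)) < 1e-9)  # True
```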
This figure illustrates the utility of the chain rule for overhead, as we can isolate which constraints, independently or relative to other constraints, dominate the total overhead cost.

Figure 5 shows the AWGN channel capacity, the BPSK constrained capacity, the repetition coding capacity for L = 3, and the combined BPSK and repetition coding capacity for L = 3. It is interesting to note that for P/σ² less than roughly 3 dB, O_RE|BPSK < O_BPSK|RE, and vice versa for P/σ² greater than roughly 3 dB.

Fig. 5. Overhead costs of BPSK, repetition coding, and combined BPSK and repetition coding for the AWGN channel versus P/σ² (dB).

ACKNOWLEDGMENT

This work has been supported in part by NSF grants CCF and CNS.

REFERENCES

[1] B. P. Dunn, "Overhead in Communication Systems as the Cost of Constraints," Ph.D. dissertation, University of Notre Dame, Notre Dame, IN, Dec. [Online]. Available: jnl/pubs/bdunn-phd-nd-.pdf
[2] R. G. Gallager, "Basic Limits on Protocol Information in Data Communication Networks," IEEE Trans. Inform. Theory, vol. 22, no. 4, Jul. 1976.
[3] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: John Wiley & Sons, Inc., 1991.
[4] S. Verdú and T. S. Han, "A General Formula for Channel Capacity," IEEE Trans. Inform. Theory, vol. 40, no. 4, Jul. 1994.
[5] T. S. Han, Information-Spectrum Methods in Information Theory. Berlin: Springer, 2003.
[6] R. G. Gallager, Information Theory and Reliable Communication. New York: John Wiley & Sons, Inc., 1968.
Send SMS s : ONJntuSpeed To 9870807070 To Recieve Jntu Updates Daily On Your Mobile For Free www.strikingsoon.comjntu ONLINE EXMINTIONS [Mid 2 - dc] http://jntuk.strikingsoon.com 1. Two binary random
More informationSymbol-by-Symbol MAP Decoding of Variable Length Codes
Symbol-by-Symbol MA Decoding of Variable Length Codes Rainer Bauer and Joachim Hagenauer Institute for Communications Engineering (LNT) Munich University of Technology (TUM) e-mail: Rainer.Bauer@ei.tum.de,
More informationSpreading Codes and Characteristics. Error Correction Codes
Spreading Codes and Characteristics and Error Correction Codes Global Navigational Satellite Systems (GNSS-6) Short course, NERTU Prasad Krishnan International Institute of Information Technology, Hyderabad
More informationCooperative Diversity in Wireless Networks: Efficient Protocols and Outage Behavior
IEEE TRANS. INFORM. THEORY Cooperative Diversity in Wireless Networks: Efficient Protocols and Outage Behavior J. Nicholas Laneman, Member, IEEE, David N. C. Tse, Senior Member, IEEE, and Gregory W. Wornell,
More informationarxiv: v2 [eess.sp] 10 Sep 2018
Designing communication systems via iterative improvement: error correction coding with Bayes decoder and codebook optimized for source symbol error arxiv:1805.07429v2 [eess.sp] 10 Sep 2018 Chai Wah Wu
More informationCHAPTER 4 SIGNAL SPACE. Xijun Wang
CHAPTER 4 SIGNAL SPACE Xijun Wang WEEKLY READING 1. Goldsmith, Wireless Communications, Chapters 5 2. Gallager, Principles of Digital Communication, Chapter 5 2 DIGITAL MODULATION AND DEMODULATION n Digital
More informationChannel Coding/Decoding. Hamming Method
Channel Coding/Decoding Hamming Method INFORMATION TRANSFER ACROSS CHANNELS Sent Received messages symbols messages source encoder Source coding Channel coding Channel Channel Source decoder decoding decoding
More informationEFFECTS OF PHASE AND AMPLITUDE ERRORS ON QAM SYSTEMS WITH ERROR- CONTROL CODING AND SOFT DECISION DECODING
Clemson University TigerPrints All Theses Theses 8-2009 EFFECTS OF PHASE AND AMPLITUDE ERRORS ON QAM SYSTEMS WITH ERROR- CONTROL CODING AND SOFT DECISION DECODING Jason Ellis Clemson University, jellis@clemson.edu
More information5984 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010
5984 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Interference Channels With Correlated Receiver Side Information Nan Liu, Member, IEEE, Deniz Gündüz, Member, IEEE, Andrea J.
More informationSNR Scalability, Multiple Descriptions, and Perceptual Distortion Measures
SNR Scalability, Multiple Descriptions, Perceptual Distortion Measures Jerry D. Gibson Department of Electrical & Computer Engineering University of California, Santa Barbara gibson@mat.ucsb.edu Abstract
More informationBasics of Error Correcting Codes
Basics of Error Correcting Codes Drawing from the book Information Theory, Inference, and Learning Algorithms Downloadable or purchasable: http://www.inference.phy.cam.ac.uk/mackay/itila/book.html CSE
More informationMultiple-Bases Belief-Propagation for Decoding of Short Block Codes
Multiple-Bases Belief-Propagation for Decoding of Short Block Codes Thorsten Hehn, Johannes B. Huber, Stefan Laendner, Olgica Milenkovic Institute for Information Transmission, University of Erlangen-Nuremberg,
More informationNEXT generation wireless communications systems are
1 Performance, Complexity, and Receiver Design for Code-Aided Frame Synchronization in Multipath Channels Daniel J. Jakubisin, Student Member, IEEE and R. Michael Buehrer, Senior Member, IEEE Abstract
More informationSymbol-Index-Feedback Polar Coding Schemes for Low-Complexity Devices
Symbol-Index-Feedback Polar Coding Schemes for Low-Complexity Devices Xudong Ma Pattern Technology Lab LLC, U.S.A. Email: xma@ieee.org arxiv:20.462v2 [cs.it] 6 ov 202 Abstract Recently, a new class of
More informationcode V(n,k) := words module
Basic Theory Distance Suppose that you knew that an English word was transmitted and you had received the word SHIP. If you suspected that some errors had occurred in transmission, it would be impossible
More informationProblem Sheet 1 Probability, random processes, and noise
Problem Sheet 1 Probability, random processes, and noise 1. If F X (x) is the distribution function of a random variable X and x 1 x 2, show that F X (x 1 ) F X (x 2 ). 2. Use the definition of the cumulative
More informationDigital Communications I: Modulation and Coding Course. Term Catharina Logothetis Lecture 12
Digital Communications I: Modulation and Coding Course Term 3-8 Catharina Logothetis Lecture Last time, we talked about: How decoding is performed for Convolutional codes? What is a Maximum likelihood
More informationORTHOGONAL space time block codes (OSTBC) from
1104 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 3, MARCH 2009 On Optimal Quasi-Orthogonal Space Time Block Codes With Minimum Decoding Complexity Haiquan Wang, Member, IEEE, Dong Wang, Member,
More informationScheduling in omnidirectional relay wireless networks
Scheduling in omnidirectional relay wireless networks by Shuning Wang A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Master of Applied Science
More informationECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013
ECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013 Lecture 18 Today: (1) da Silva Discussion, (2) Error Correction Coding, (3) Error Detection (CRC) HW 8 due Tue. HW 9 (on Lectures
More informationA Survey of Advanced FEC Systems
A Survey of Advanced FEC Systems Eric Jacobsen Minister of Algorithms, Intel Labs Communication Technology Laboratory/ Radio Communications Laboratory July 29, 2004 With a lot of material from Bo Xia,
More informationNoncoherent Demodulation for Cooperative Diversity in Wireless Systems
Noncoherent Demodulation for Cooperative Diversity in Wireless Systems Deqiang Chen and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame IN 46556 Email: {dchen
More informationENERGY EFFICIENT RELAY SELECTION SCHEMES FOR COOPERATIVE UNIFORMLY DISTRIBUTED WIRELESS SENSOR NETWORKS
ENERGY EFFICIENT RELAY SELECTION SCHEMES FOR COOPERATIVE UNIFORMLY DISTRIBUTED WIRELESS SENSOR NETWORKS WAFIC W. ALAMEDDINE A THESIS IN THE DEPARTMENT OF ELECTRICAL AND COMPUTER ENGINEERING PRESENTED IN
More informationPairwise Optimization of Modulation Constellations for Non-Uniform Sources
Pairwise Optimization of Modulation Constellations for Non-Uniform Sources by Brendan F.D. Moore A thesis submitted to the Department of Mathematics and Statistics in conformity with the requirements for
More informationCOMBINED TRELLIS CODED QUANTIZATION/CONTINUOUS PHASE MODULATION (TCQ/TCCPM)
COMBINED TRELLIS CODED QUANTIZATION/CONTINUOUS PHASE MODULATION (TCQ/TCCPM) Niyazi ODABASIOGLU 1, OnurOSMAN 2, Osman Nuri UCAN 3 Abstract In this paper, we applied Continuous Phase Frequency Shift Keying
More informationCommunication Theory II
Communication Theory II Lecture 14: Information Theory (cont d) Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 25 th, 2015 1 Previous Lecture: Source Code Generation: Lossless
More informationMultirate DSP, part 3: ADC oversampling
Multirate DSP, part 3: ADC oversampling Li Tan - May 04, 2008 Order this book today at www.elsevierdirect.com or by calling 1-800-545-2522 and receive an additional 20% discount. Use promotion code 92562
More informationARQ strategies for MIMO eigenmode transmission with adaptive modulation and coding
ARQ strategies for MIMO eigenmode transmission with adaptive modulation and coding Elisabeth de Carvalho and Petar Popovski Aalborg University, Niels Jernes Vej 2 9220 Aalborg, Denmark email: {edc,petarp}@es.aau.dk
More informationVolume 2, Issue 9, September 2014 International Journal of Advance Research in Computer Science and Management Studies
Volume 2, Issue 9, September 2014 International Journal of Advance Research in Computer Science and Management Studies Research Article / Survey Paper / Case Study Available online at: www.ijarcsms.com
More informationCoding Schemes for an Erasure Relay Channel
Coding Schemes for an Erasure Relay Channel Srinath Puducheri, Jörg Kliewer, and Thomas E. Fuja Department of Electrical Engineering, University of Notre Dame, Notre Dame, IN 46556, USA Email: {spuduche,
More informationWIRELESS or wired link failures are of a nonergodic nature
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 7, JULY 2011 4187 Robust Communication via Decentralized Processing With Unreliable Backhaul Links Osvaldo Simeone, Member, IEEE, Oren Somekh, Member,
More informationFrequency-Hopped Spread-Spectrum
Chapter Frequency-Hopped Spread-Spectrum In this chapter we discuss frequency-hopped spread-spectrum. We first describe the antijam capability, then the multiple-access capability and finally the fading
More informationProtocol Coding for Two-Way Communications with Half-Duplex Constraints
Protocol Coding for Two-Way Communications with Half-Duplex Constraints Petar Popovski and Osvaldo Simeone Department of Electronic Systems, Aalborg University, Denmark CWCSPR, ECE Dept., NJIT, USA Email:
More informationPerformance Optimization of Hybrid Combination of LDPC and RS Codes Using Image Transmission System Over Fading Channels
European Journal of Scientific Research ISSN 1450-216X Vol.35 No.1 (2009), pp 34-42 EuroJournals Publishing, Inc. 2009 http://www.eurojournals.com/ejsr.htm Performance Optimization of Hybrid Combination
More informationOn the Design of Finite-State Shaping Encoders for Partial-Response Channels
On the Design of Finite-State Shaping Encoders for Partial-Response Channels Joseph B. Soriaga 2 and Paul H. Siegel Center for Magnetic Recording Research University of California, San Diego Information
More informationOn Iterative Multistage Decoding of Multilevel Codes for Frequency Selective Channels
On terative Multistage Decoding of Multilevel Codes for Frequency Selective Channels B.Baumgartner, H-Griesser, M.Bossert Department of nformation Technology, University of Ulm, Albert-Einstein-Allee 43,
More informationPrecoding and Signal Shaping for Digital Transmission
Precoding and Signal Shaping for Digital Transmission Robert F. H. Fischer The Institute of Electrical and Electronics Engineers, Inc., New York WILEY- INTERSCIENCE A JOHN WILEY & SONS, INC., PUBLICATION
More informationNONCOHERENT COMMUNICATION THEORY FOR COOPERATIVE DIVERSITY IN WIRELESS NETWORKS. A Thesis. Submitted to the Graduate School
NONCOHERENT COMMUNICATION THEORY FOR COOPERATIVE DIVERSITY IN WIRELESS NETWORKS A Thesis Submitted to the Graduate School of the University of Notre Dame in Partial Fulfillment of the Requirements for
More informationOptimal Coded Information Network Design and Management via Improved Characterizations of the Binary Entropy Function
Optimal Coded Information Network Design and Management via Improved Characterizations of the Binary Entropy Function John MacLaren Walsh & Steven Weber Department of Electrical and Computer Engineering
More informationMIMO Channel Capacity in Co-Channel Interference
MIMO Channel Capacity in Co-Channel Interference Yi Song and Steven D. Blostein Department of Electrical and Computer Engineering Queen s University Kingston, Ontario, Canada, K7L 3N6 E-mail: {songy, sdb}@ee.queensu.ca
More informationAntennas and Propagation. Chapter 6b: Path Models Rayleigh, Rician Fading, MIMO
Antennas and Propagation b: Path Models Rayleigh, Rician Fading, MIMO Introduction From last lecture How do we model H p? Discrete path model (physical, plane waves) Random matrix models (forget H p and
More information