Computing and Communications 2. Information Theory - Channel Capacity


Computing and Communications 2. Information Theory - Channel Capacity
Ying Cui
Department of Electronic Engineering
Shanghai Jiao Tong University, China
2017, Autumn

Outline
- Communication system
- Examples of channel capacity
- Symmetric channels
- Properties of channel capacity
- Definitions
- Channel coding theorem
- Source-channel coding theorem

Reference: T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley.

CHANNEL CAPACITY

Communication System
- Map source symbols from a finite alphabet into some sequence of channel symbols, i.e., the input sequence of the channel.
- The output sequence of the channel is random but has a distribution that depends on the input sequence.
- Two different input sequences may give rise to the same output sequence, i.e., the inputs are confusable.
- Choose a nonconfusable subset of input sequences so that, with high probability, there is only one highly likely input that could have caused the particular output.
- Attempt to recover the transmitted message from the output sequence of the channel, i.e., reconstruct the input sequence with a negligible probability of error.

Channel Capacity
The (information) channel capacity of a discrete memoryless channel is C = max I(X; Y), where the maximum is taken over all possible input distributions p(x).

EXAMPLES OF CHANNEL CAPACITY

Noiseless Binary Channel
The binary input is reproduced exactly at the output.
C = max I(X; Y) = 1 bit, achieved using p(x) = (1/2, 1/2).
One error-free bit can be transmitted per channel use.

Noisy Channel with Nonoverlapping Outputs
Two possible outputs correspond to each of the two inputs; the channel appears to be noisy, but really is not.
C = max I(X; Y) = 1 bit, achieved using p(x) = (1/2, 1/2).
The input can be determined from the output, so every transmitted bit can be recovered without error.

Noisy Typewriter
The channel input is either unchanged with probability 1/2 or transformed into the next letter with probability 1/2.
If the input has 26 symbols and we use every alternate input symbol, we can transmit one of 13 symbols without error with each transmission.
C = max I(X; Y) = max (H(Y) - H(Y|X)) = max H(Y) - 1 = log 26 - 1 = log 13, achieved using p(x) = (1/26, ..., 1/26).
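A quick numerical check, not from the slides: a minimal Python sketch (the helper name mutual_information is our own) that builds the noisy typewriter's 26 x 26 transition matrix and confirms that the uniform input gives I(X; Y) = log 13, about 3.70 bits.

```python
import numpy as np

def mutual_information(p_x, P):
    """I(X; Y) in bits for input distribution p_x and channel matrix P,
    where P[x, y] = p(y | x)."""
    p_xy = p_x[:, None] * P                # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                 # output marginal p(y)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] /
                  (p_x[:, None] * p_y[None, :])[mask])).sum())

# Noisy typewriter: each letter stays put w.p. 1/2 or moves to the
# next letter (cyclically) w.p. 1/2.
n = 26
P = np.zeros((n, n))
for x in range(n):
    P[x, x] = 0.5
    P[x, (x + 1) % n] = 0.5

uniform = np.full(n, 1 / n)
print(mutual_information(uniform, P))   # 3.7004... bits
print(np.log2(13))                      # matches the closed form
```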

Binary Symmetric Channel
Input symbols are complemented with probability p.
I(X; Y) = H(Y) - H(Y|X) = H(Y) - H(p) <= 1 - H(p), so C = max I(X; Y) = 1 - H(p) bits.
Equality is achieved when the input distribution is uniform.

Binary Erasure Channel
Two inputs and three outputs; a fraction α of the transmitted bits are erased.
Writing π = Pr(X = 1), I(X; Y) = H(X) - H(X|Y) = (1 - α) H(π), so C = max I(X; Y) = 1 - α, achieved when π = 1/2.
We can recover at most a fraction 1 - α of the bits, as a fraction α of the bits are lost.
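The closed forms for the last two examples are easy to evaluate; here is a small Python check (the crossover probability p = 0.1 and erasure probability α = 0.25 are illustrative choices, not values from the lecture):

```python
import numpy as np

def binary_entropy(p):
    """H(p) in bits, with the convention 0 log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p, alpha = 0.1, 0.25          # illustrative crossover / erasure probabilities
print(1 - binary_entropy(p))  # BSC capacity: 1 - H(0.1) = 0.531 bits
print(1 - alpha)              # BEC capacity: 1 - 0.25  = 0.75 bits
```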

SYMMETRIC CHANNELS

Symmetric Channels
A channel is symmetric if the rows of the channel transition matrix p(y|x) are permutations of each other and the columns are permutations of each other (the binary symmetric channel is an example).
For a symmetric channel, C = log |Y| - H(row of transition matrix), achieved by the uniform input distribution.

Proof
I(X; Y) = H(Y) - H(Y|X) = H(Y) - H(row of transition matrix) <= log |Y| - H(row of transition matrix), with equality if the output distribution is uniform; for a symmetric channel, the uniform input distribution produces a uniform output distribution.
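To make the formula concrete, a hedged Python sketch; the 3 x 3 transition matrix below is our own example of a symmetric channel (every row and every column is a permutation of (0.3, 0.2, 0.5)), not one taken from the slide:

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_information(p_x, P):
    """I(X; Y) in bits for input distribution p_x and channel matrix P."""
    p_xy = p_x[:, None] * P
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] /
                  (p_x[:, None] * p_y[None, :])[mask])).sum())

# A symmetric channel: rows and columns are permutations of one another.
P = np.array([[0.3, 0.2, 0.5],
              [0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3]])

closed_form = np.log2(3) - entropy(P[0])            # log|Y| - H(row)
uniform = np.full(3, 1 / 3)
print(closed_form, mutual_information(uniform, P))  # both ~0.0995 bits
```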

PROPERTIES OF CHANNEL CAPACITY

Properties of Channel Capacity
- C >= 0, since I(X; Y) >= 0.
- C <= log |X|, since C = max I(X; Y) <= max H(X) <= log |X|.
- C <= log |Y|, since C = max I(X; Y) <= max H(Y) <= log |Y|.
- I(X; Y) is a continuous function of p(x).
- I(X; Y) is a concave function of p(x).
- The problem of computing channel capacity is therefore a convex problem: the maximization of a bounded concave function over a closed convex set. The maximum can be found by standard nonlinear optimization techniques such as gradient search.
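The classical specialized solver for this convex problem is the Blahut-Arimoto alternating-maximization algorithm. The slide mentions only generic techniques such as gradient search, so the following Python sketch is an illustration of ours, not part of the lecture:

```python
import numpy as np

def blahut_arimoto(P, iters=200):
    """Capacity (bits) of a DMC with transition matrix P[x, y] = p(y|x),
    via Blahut-Arimoto alternating maximization."""
    n_x = P.shape[0]
    p = np.full(n_x, 1 / n_x)              # start from the uniform input
    for _ in range(iters):
        q = p[:, None] * P                 # unnormalized posterior q(x|y)
        q /= q.sum(axis=0, keepdims=True)
        # update p(x) proportional to exp(sum_y p(y|x) log q(x|y))
        log_r = (P * np.log(q + 1e-300)).sum(axis=1)
        p = np.exp(log_r - log_r.max())
        p /= p.sum()
    # capacity = I(X; Y) at the final input distribution
    p_xy = p[:, None] * P
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] /
                  (p[:, None] * p_y[None, :])[mask])).sum())

# BSC with crossover 0.1: should return ~0.531 = 1 - H(0.1)
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(blahut_arimoto(P))
```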

DEFINITIONS

Discrete Memoryless Channel (DMC)
A discrete channel is a system (X, p(y|x), Y) consisting of an input alphabet X, an output alphabet Y, and a transition probability matrix p(y|x).
The channel is memoryless if the distribution of the output depends only on the current input and is conditionally independent of previous channel inputs and outputs.

Code
An (M, n) code for the channel (X, p(y|x), Y) consists of an index set {1, 2, ..., M}, an encoding function X^n: {1, ..., M} → X^n yielding codewords x^n(1), ..., x^n(M) (the codebook), and a decoding function g: Y^n → {1, ..., M}.

Probability of Error
λ_i = Pr(g(Y^n) ≠ i | X^n = x^n(i)) is the conditional probability of error given that index i was sent.
The maximal probability of error is λ^(n) = max_i λ_i; the average probability of error is P_e^(n) = (1/M) Σ_i λ_i.

Rate and Capacity
The rate of an (M, n) code is R = (log M)/n bits per transmission.
A rate R is achievable if there exists a sequence of (⌈2^{nR}⌉, n) codes such that the maximal probability of error λ^(n) tends to 0 as n → ∞; the (operational) capacity of a channel is the supremum of all achievable rates.
We write (2^{nR}, n) codes to mean (⌈2^{nR}⌉, n) codes, to simplify the notation.

CHANNEL CODING THEOREM (SHANNON'S SECOND THEOREM)

Basic Idea
For large block lengths, every channel has a subset of inputs that produce essentially disjoint sequences at the output.
Ensure that no two input X sequences produce the same output Y sequence; then the receiver can determine which X sequence was sent.

Basic Idea
The total number of possible (typical) output Y sequences is about 2^{nH(Y)}.
Divide them into sets of size 2^{nH(Y|X)}, corresponding to the different input X sequences.
The total number of disjoint sets is at most 2^{n(H(Y) - H(Y|X))} = 2^{nI(X;Y)}.
Hence we can send at most about 2^{nI(X;Y)} distinguishable sequences of length n.
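As a worked instance of this counting argument (our addition, using the noisy typewriter from earlier, where H(Y) = log 26 and H(Y|X) = 1):

```latex
% Counting for the noisy typewriter with uniform input:
% H(Y) = \log 26, \quad H(Y \mid X) = 1 (one fair coin flip per symbol)
\[
\frac{2^{nH(Y)}}{2^{nH(Y\mid X)}} = \frac{26^{\,n}}{2^{\,n}} = 13^{\,n}
= 2^{\,n \log 13} = 2^{\,n I(X;Y)},
\]
% i.e., about 13^n distinguishable input sequences, matching C = log 13.
```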

Channel Coding Theorem
All rates below capacity are achievable: for every rate R < C, there exists a sequence of (2^{nR}, n) codes with maximal probability of error λ^(n) → 0.
Conversely, any sequence of (2^{nR}, n) codes with λ^(n) → 0 must have R ≤ C.

New Ideas in Shannon's Proof
- Allowing an arbitrarily small but nonzero probability of error.
- Using the channel many times in succession, so that the law of large numbers comes into effect.
- Calculating the average of the probability of error over a random choice of codebooks, which symmetrizes the probability and can then be used to show the existence of at least one good code.
Shannon's proof outline was based on the idea of typical sequences, but was not made rigorous until much later.

Current Proof
Use the same essential ideas: random code selection, calculation of the average probability of error for a random choice of codewords, and so on.
The main difference is in the decoding rule: decode by joint typicality.
- Look for a codeword that is jointly typical with the received sequence.
- If we find a unique codeword satisfying this property, declare that word to be the transmitted codeword.
Properties of joint typicality:
- With high probability the transmitted codeword and the received sequence are jointly typical, since they are probabilistically related.
- The probability that any other codeword looks jointly typical with the received sequence is about 2^{-nI(X;Y)}.
- Thus, if we have fewer than 2^{nI(X;Y)} codewords, then with high probability there will be no other codeword that can be confused with the transmitted codeword, and the probability of error is small.
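The random-coding argument can be watched in action. The following Python sketch is entirely our own construction (the block length, rate, and typicality tolerance eps are arbitrary illustrative choices): it draws a random codebook for a BSC(0.1) and decodes by the joint-typicality rule, which for uniform inputs on a BSC reduces to checking that the fraction of disagreements between a codeword and the received word is within eps of p.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100, R=0.1, p=0.1, eps=0.07, trials=500):
    """Random coding with a joint-typicality decoder on a BSC(p)."""
    M = int(round(2 ** (n * R)))                  # number of codewords
    errors = 0
    for _ in range(trials):
        code = rng.integers(0, 2, size=(M, n))    # random codebook
        noise = (rng.random(n) < p).astype(int)   # BSC flips
        y = code[0] ^ noise                       # send codeword 0
        frac = (code != y).mean(axis=1)           # disagreement fractions
        typical = np.abs(frac - p) < eps          # joint-typicality test
        # error unless codeword 0 is the unique jointly typical codeword
        if not (typical[0] and typical.sum() == 1):
            errors += 1
    return errors / trials

# R = 0.1 is well below C = 1 - H(0.1) = 0.531, so the error rate
# should be small (typically a few percent at this block length)
print(simulate())
```

Since R = 0.1 sits well below C ≈ 0.531 here, the observed error rate is small; pushing R toward C requires larger block lengths, exactly as the theorem's asymptotics suggest.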

SOURCE-CHANNEL SEPARATION THEOREM (SHANNON'S THIRD THEOREM)

The Two Basic Theorems
- Data compression: R > H
- Data transmission: R < C
Is the condition H < C necessary and sufficient for sending a source over a channel?

Example
Consider two methods for sending digitized speech over a discrete memoryless channel:
- One-stage method: design a code that maps the sequence of speech samples directly into the input of the channel.
- Two-stage method: compress the speech into its most efficient representation, then use an appropriate channel code to send it over the channel.
Do we lose something by using the two-stage method?
- Data compression does not depend on the channel.
- Channel coding does not depend on the source distribution.

Joint vs. Separate Source and Channel Coding
- Joint source and channel coding
- Separate source and channel coding

Source-Channel Coding Theorem
Consider the design of a communication system as a combination of two parts:
- Source coding: design source codes for the most efficient representation of the data.
- Channel coding: design channel codes appropriate for the channel (to combat the noise and errors introduced by the channel).
The separate encoders can achieve the same rates as the joint encoder.
This holds for the situation where one transmitter communicates with one receiver.
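Stated precisely, following the formulation in the Cover and Thomas reference, the theorem reads:

```latex
% Source-channel coding theorem (following Cover & Thomas).
% Let V_1, V_2, ... be a finite-alphabet stochastic process satisfying
% the AEP with entropy rate H(V). Then:
\[
H(\mathcal{V}) < C \;\Longrightarrow\; \exists \text{ source--channel codes with }
\Pr\bigl(\hat{V}^n \neq V^n\bigr) \to 0;
\]
\[
H(\mathcal{V}) > C \;\Longrightarrow\; \Pr\bigl(\hat{V}^n \neq V^n\bigr)
\text{ is bounded away from } 0.
\]
```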

Summary

cuiying@sjtu.edu.cn
iwct.sjtu.edu.cn/personal/yingcui