Information Theory and Communication: Optimal Codes
Ritwik Banerjee
Roadmap

- Examples and Types of Codes
- Kraft Inequality
- McMillan Inequality
- Entropy bound on data compression
- Shannon Code
- Huffman Code
- Wrong Code
- Stochastic and Stationary Processes
Wrong Code

We have seen that the minimum expected codeword length per symbol can be made arbitrarily close to the entropy. For stationary stochastic processes, this means that the entropy rate is the expected number of bits per symbol required to describe the process.

But what happens if the code is designed for the wrong distribution? The true distribution may be unknown to us, and the wrong distribution may happen to be the best known approximation of it. Let p(x) and q(x) denote the true and the wrong distributions, respectively. Consider the Shannon code assignment, whose minimum expected codeword length is within 1 bit of the theoretical optimum for q(x):

l(x) = ⌈log(1/q(x))⌉

With this code we will not achieve expected length L ≈ H(p).
What is the increase in the expected description length due to the wrong distribution? It is given by the KL divergence D(p‖q).

Theorem. The expected length under p(x) of the code assignment l(x) = ⌈log(1/q(x))⌉ satisfies the inequalities

H(p) + D(p‖q) ≤ E[l(X)] < H(p) + D(p‖q) + 1,

where the expectation is taken with respect to the distribution p. That is, the divergence measures the penalty of using an approximation in the coding process.
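The theorem is easy to check numerically. Below is a small sketch in Python; the distributions p and q are made-up illustrative values, not taken from the slides:

    import math

    # True distribution p and wrong (assumed) distribution q.
    p = [0.5, 0.25, 0.125, 0.125]
    q = [0.25, 0.25, 0.25, 0.25]

    # Shannon code lengths designed for q: l(x) = ceil(log2(1/q(x)))
    lengths = [math.ceil(math.log2(1 / qx)) for qx in q]

    H_p = sum(px * math.log2(1 / px) for px in p)                # entropy H(p)
    D_pq = sum(px * math.log2(px / qx) for px, qx in zip(p, q))  # divergence D(p||q)
    E_l = sum(px * lx for px, lx in zip(p, lengths))             # E_p[l(X)]

    print(f"H(p) = {H_p:.3f}, D(p||q) = {D_pq:.3f}, E[l(X)] = {E_l:.3f}")
    assert H_p + D_pq <= E_l < H_p + D_pq + 1   # the theorem's inequalities

Here the code designed for the uniform q costs E[l(X)] = 2 bits per symbol, while H(p) = 1.75 bits: the quarter-bit gap is exactly D(p‖q) = 0.25.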
Code Classes and the McMillan Inequality

We have shown that instantaneous codes satisfy the Kraft inequality. Six years after Kraft's proof, McMillan showed that the same "if and only if" result also holds for the larger class of uniquely decodable codes. We will skip the proof of this version of the inequality:

The codeword lengths l_i of a uniquely decodable D-ary code satisfy Σ_i D^(−l_i) ≤ 1. Conversely, given a set of integers l_i satisfying this inequality, there exists a uniquely decodable code with these integers as its codeword lengths.
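The inequality itself is mechanical to test. A minimal sketch (the example length sets below are arbitrary):

    # Kraft-McMillan sum for candidate codeword lengths over a D-ary alphabet.
    def kraft_sum(lengths, D=2):
        return sum(D ** -l for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))    # 1.0  -> a uniquely decodable binary code exists
    print(kraft_sum([1, 1, 2]))       # 1.25 -> no uniquely decodable binary code
    print(kraft_sum([1, 2, 2], D=3))  # ~0.56 -> a ternary code with these lengths exists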
Huffman Codes

Given a distribution, a prefix code with the shortest expected length (i.e., optimal compression) can be constructed by an algorithm discovered by Huffman in 1952. It is commonly used for lossless data compression.

Let the alphabet be X = {1, 2, 3, 4, 5}, and let X be a random variable taking values from this set with the following distribution:

p(1) = 0.25, p(2) = 0.25, p(3) = 0.2, p(4) = 0.15, p(5) = 0.15

- The least frequent values should have the longest codewords.
- The codeword lengths for 4 and 5 must be equal. Otherwise, we could delete a bit from the longer codeword and still retain a prefix-free code, which would imply that the original code was not optimal.
- So we construct a code in which the two longest codewords differ only in the last bit: combine 4 and 5 into a single source symbol with probability 0.3, then keep combining the two least probable items into a single source symbol until only one item is left.
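The merging procedure translates directly into code. Below is a sketch of binary Huffman coding using a min-heap; ties among equal probabilities are broken by creation order, which is one of several valid choices, so the individual codewords (though not their lengths) may differ from a hand-built tree:

    import heapq
    import itertools

    def huffman(probs):
        """Return a dict symbol -> binary codeword for the given distribution."""
        counter = itertools.count()  # tie-breaker so dicts are never compared
        heap = [(p, next(counter), {sym: ""}) for sym, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(counter), merged))
        return heap[0][2]

    probs = {1: 0.25, 2: 0.25, 3: 0.2, 4: 0.15, 5: 0.15}
    code = huffman(probs)
    avg = sum(probs[s] * len(w) for s, w in code.items())
    print(code)
    print(f"expected length = {avg:.2f} bits")  # 2.30 bits for this distribution

For this distribution the expected length comes out to 2.30 bits, against an entropy of about 2.29 bits.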
Huffman Codes

There is a binary tree structure corresponding to this process! What about other alphabets? In general, for a D-ary alphabet, the algorithm combines the D least probable items into a single source symbol at each step. For example, if the codewords are over the ternary alphabet {1, 2, 3}, we get a ternary tree.
Huffman Codes

Insufficient number of symbols. Not enough symbols to combine D at a time? Create dummy symbols with probability 0! At each stage the number of symbols is reduced by D − 1. Therefore, after k merges we want the total number of symbols to be 1 + k(D − 1). (A sketch of this padding rule follows the theorem below.)

Optimality of Binary Huffman Codes

Theorem. For any distribution, there exists an instantaneous code with minimum expected length such that the following properties hold:
1. The codeword lengths are ordered inversely to the probabilities.
2. The two longest codewords have the same length.
3. The two longest codewords, which correspond to the two least likely symbols, differ only in the last bit.
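The dummy-symbol count from the first half of this slide is a one-liner. A minimal sketch:

    # Number of zero-probability dummy symbols needed so that n symbols
    # reduce to exactly one after repeated D-way merges: the padded count
    # must have the form 1 + k(D - 1), i.e. (n - 1) mod (D - 1) == 0.
    def num_dummies(n, D):
        r = (n - 1) % (D - 1)
        return 0 if r == 0 else (D - 1) - r

    print(num_dummies(6, 3))  # 1: pad 6 symbols to 7 = 1 + 3*(3 - 1)
    print(num_dummies(5, 3))  # 0: 5 = 1 + 2*(3 - 1) already works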
Huffman Code is Not Unique

At each split, we have two choices for assigning the bits: 01 or 10. And when multiple items are tied with identical probabilities, there are multiple ways of ordering their codewords.
Source Coding and 20 Questions

We want the optimal sequence of yes/no questions to determine an object from a set of objects, assuming that we know the probability distribution on these objects.

- A sequence of such questions is equivalent to a code for the object.
- A question may depend only on the answers to the questions asked previously.
- The sequence of answers uniquely determines the object; therefore, if we model the yes/no answers by 0s and 1s, we have a unique binary encoding for each object in the set.
- The average length of this code is simply the average number of questions asked, and it can be minimized by a Huffman code. That is, the Huffman code determines the optimal sequence of questions for identifying the object.

The expected number of questions in this process, E(Q), satisfies

H(X) ≤ E(Q) < H(X) + 1.
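For the five-symbol distribution used earlier, these bounds can be checked directly. A sketch reusing the Huffman codeword lengths (2, 2, 2, 3, 3) computed above:

    import math

    p = [0.25, 0.25, 0.2, 0.15, 0.15]
    lengths = [2, 2, 2, 3, 3]  # Huffman lengths for this distribution

    H = sum(px * math.log2(1 / px) for px in p)     # entropy H(X)
    EQ = sum(px * l for px, l in zip(p, lengths))   # expected number of questions
    print(f"H(X) = {H:.3f} <= E(Q) = {EQ:.2f} < H(X) + 1 = {H + 1:.3f}")
    assert H <= EQ < H + 1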
Alphabetic Codes

Consider a special case of the 20-questions game:
- the elements of X = {1, 2, ..., m} are in decreasing order of probability (i.e., p_1 ≥ p_2 ≥ ... ≥ p_m), and
- the only questions allowed are of the form "is X > a?" for some a.

The code constructed by the Huffman algorithm may not correspond to slice sets of the form {x : x < a}. But we can do the following: take the optimal codeword lengths found by the Huffman algorithm, and use these lengths to assign symbols to the tree by taking the first available node at each level. The result is not a Huffman code, but it is another optimal code, and at each internal node it splits the set into two subsets {x : x ≤ a} and {x : x > a}. These are also called alphabetic codes because the tree construction process leads to an alphabetical ordering of the codewords.
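The "first available node at the current level" assignment is what is usually called canonical coding. A minimal sketch, assuming the symbols are already in alphabetical order and their optimal lengths are non-decreasing:

    # Assign codewords in lexicographic order from a non-decreasing list of
    # codeword lengths: each codeword is the previous one plus 1, shifted
    # left whenever the length grows. The result is prefix-free by Kraft.
    def canonical_code(lengths):
        codewords, code, prev = [], 0, lengths[0]
        for l in lengths:
            code <<= (l - prev)  # descend to the deeper level
            codewords.append(format(code, f"0{l}b"))
            code += 1
            prev = l
        return codewords

    print(canonical_code([2, 2, 2, 3, 3]))  # ['00', '01', '10', '110', '111']

The codewords come out in increasing lexicographic order, so each branch of the corresponding tree does split the symbols into {x : x ≤ a} and {x : x > a}.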
Code Redundancy

The Huffman code has the shortest average codeword length, i.e., L_Huffman ≤ L for any prefix code. The redundancy of the Huffman code for a random variable X is defined as the difference between the average Huffman codeword length and the entropy H(X). The redundancy of Huffman coding is bounded above by p + 0.086, where p is the probability of the most likely symbol (Gallager's bound). For the running five-symbol example, the redundancy is 2.30 − 2.29 ≈ 0.01 bits, well below 0.25 + 0.086.
Huffman Coding is Optimal

Theorem. If C is a Huffman code and C′ is any other uniquely decodable code, then L(C) ≤ L(C′).

The proof for binary alphabets can be extended to general D-ary alphabets. Huffman coding is a greedy algorithm: it works by combining the least likely symbols at each step. In general, greedy approaches do not lead to globally optimal solutions, but in the case of Huffman coding, the greedy strategy does.
More information