Solutions to Assignment-2 MOOC-Information Theory

1. Which of the following is a prefix-free code?
a) 01, 10, 101, 00, 11
b) 0, 11, 01
c) 01, 10, 11, 00

Solution: The codewords of (a) are not prefix-free, since the codeword 10 is clearly a prefix of 101. Similarly, in (b) the codeword 0 is a prefix of the codeword 01. The codewords of (c) are prefix-free, as no codeword is a prefix of another codeword. Hence the correct answer is (c).

2. Which of the following is a uniquely decodable code but not prefix-free?
a) 01, 10, 101, 00, 11
b) 0, 11, 01
c) 01, 10, 11, 00

Solution: As explained in the solution to Problem 1, codes (a) and (b) are not prefix-free. Unique decodability can be tested with the Sardinas-Patterson sets S_0, S_1, S_2, ..., where S_0 is the set of codewords and each subsequent set collects the dangling suffixes obtained by comparing the strings of the previous set with the codewords.

(a) 01, 10, 101, 00, 11: S_0 = {01, 10, 101, 00, 11}, S_1 = {1}, S_2 = {0, 01, 1}, and so on. Since the codeword 01 appears in both S_0 and S_2 (and again in S_3, S_4, and so on), this code is not uniquely decodable.

(b) 0, 11, 01: S_0 = {0, 11, 01}, S_1 = {1}, S_2 = {1}, S_3 = {1}, ... Here none of the codewords appears in any of the later columns, so the code is uniquely decodable.

The correct answer is (b).
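The two tests used above are easy to mechanize. The following Python sketch (an illustration added here, not part of the original solutions) checks prefix-freeness directly and runs the Sardinas-Patterson construction behind the S_0, S_1, S_2, ... sets.

```python
def is_prefix_free(code):
    """True if no codeword is a proper prefix of another codeword."""
    return not any(a != b and b.startswith(a) for a in code for b in code)

def is_uniquely_decodable(code):
    """Sardinas-Patterson test: build the dangling-suffix sets S_1, S_2, ...
    and report failure as soon as one of them contains a codeword."""
    code = set(code)
    # S_1: dangling suffixes from comparing codewords with each other
    s = {b[len(a):] for a in code for b in code if a != b and b.startswith(a)}
    seen = set()
    while s and not (s & code):
        seen |= s
        nxt = set()
        for x in s:
            for c in code:
                if c.startswith(x) and c != x:   # codeword extends a suffix
                    nxt.add(c[len(x):])
                if x.startswith(c) and c != x:   # suffix extends a codeword
                    nxt.add(x[len(c):])
        s = nxt - seen                 # do not revisit old suffixes
    return not (s & code)              # uniquely decodable iff no codeword appears

for c in (["01", "10", "101", "00", "11"], ["0", "11", "01"], ["01", "10", "11", "00"]):
    print(c, is_prefix_free(c), is_uniquely_decodable(c))
# Expected: (a) False False, (b) False True, (c) True True
```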

3. Given a binary variable-length code consisting of codewords of the following lengths: w_1 = 1, w_2 = 2, w_3 = 2, w_4 = 3. Which of the following statements is correct?
a) There exists a prefix-free code of the given lengths
b) There exist many prefix-free codes of the given lengths
c) There exists a uniquely decodable code of the given lengths
d) None of the above

Solution: The criterion for the existence of a prefix-free code (and, by McMillan's theorem, of any uniquely decodable code) of given lengths is Kraft's inequality

\sum_{i=1}^{K} D^{-w_i} \le 1.

Here K = 4 and D = 2. Substituting the values of the w_i, we get

\sum_{i=1}^{4} 2^{-w_i} = 2^{-1} + 2^{-2} + 2^{-2} + 2^{-3} = 1.125,

which is greater than 1, and hence no prefix-free or uniquely decodable code of these lengths exists. So the correct answer is (d).
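Kraft's inequality is a one-line computation; the short sketch below (my addition, not from the assignment) evaluates the Kraft sum for the lengths above and, for comparison, for the lengths found in Problem 4.

```python
def kraft_sum(lengths, D=2):
    """Kraft sum of a set of codeword lengths for a D-ary alphabet.
    A prefix-free (or uniquely decodable) code with these lengths
    exists iff the sum is at most 1."""
    return sum(D ** (-w) for w in lengths)

print(kraft_sum([1, 2, 2, 3]))      # 1.125 > 1, so no such code exists (Problem 3)
print(kraft_sum([2, 2, 2, 3, 3]))   # 0.875 <= 1, so such a code exists (Problem 4)
```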

4. A discrete memoryless source emits a symbol U that takes five different values u_1, u_2, u_3, u_4, u_5 with probabilities 0.25, 0.25, 0.25, 0.125, 0.125, respectively. A binary Shannon-Fano code for this source consists of codewords of which of the following lengths?
a) 3, 3, 3, 4, 4
b) 2, 2, 2, 3, 3
c) 1, 1, 1, 2, 2

Solution: The length w_i of the codeword for u_i is given by

w_i = \lceil -\log_D P_U(u_i) \rceil.

Here D = 2 (binary code). Substituting the respective probabilities into this formula, we get the codeword lengths 2, 2, 2, 3, 3. Hence, the correct answer is (b).

5. Given a discrete memoryless source that emits a symbol U taking five different values u_1, u_2, u_3, u_4, u_5 with probabilities 0.25, 0.25, 0.25, 0.125, 0.125, respectively. An optimal ternary prefix-free code is designed to represent U. What is the expected length of the code?
a) 1.4
b) 1.5
c) 2.25

Solution: Here D = 3 (ternary code) and K = 5, so the number of unused leaves is the remainder of (K - D)(D - 2) divided by (D - 1), i.e. R_{D-1}[(K - D)(D - 2)] = R_2(2 \cdot 1) = 0. The optimal ternary code tree therefore uses every leaf: the three least likely symbols (probabilities 0.125, 0.125 and 0.25) are merged into an intermediate node of probability 0.5, which is then merged with the two remaining symbols (0.25 and 0.25) at the root. By the path length lemma, the expected length equals the sum of the probabilities of the intermediate nodes (including the root):

E[W] = 1 + 0.5 = 1.5 ternary code symbols.

Thus, the correct answer is (b).
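Both results are easy to reproduce numerically. The sketch below (an illustration, not part of the original solutions) computes the Shannon-Fano lengths of Problem 4 and the expected length of an optimal D-ary code via Huffman merging, which by the path length lemma equals the sum of the probabilities of the merged nodes.

```python
import heapq
from math import ceil, log

def shannon_fano_lengths(probs, D=2):
    """Shannon-Fano codeword lengths w_i = ceil(-log_D p_i).
    The tiny epsilon guards against floating-point artefacts such as
    3.0000000000000004 when p_i is an exact power of 1/D."""
    return [ceil(-log(p, D) - 1e-12) for p in probs]

def dary_huffman_expected_length(probs, D=2):
    """Expected codeword length of an optimal D-ary prefix-free code,
    computed by D-ary Huffman merging.  Each merge combines the D least
    likely nodes; the running sum of merged-node probabilities is the
    expected length (path length lemma)."""
    heap = list(probs)
    # Pad with zero-probability dummy symbols so that every merge takes
    # exactly D nodes and the last merge produces the root.
    while (len(heap) - 1) % (D - 1) != 0:
        heap.append(0.0)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        merged = sum(heapq.heappop(heap) for _ in range(D))
        total += merged
        heapq.heappush(heap, merged)
    return total

p = [0.25, 0.25, 0.25, 0.125, 0.125]
print(shannon_fano_lengths(p))                # [2, 2, 2, 3, 3]  (Problem 4)
print(dary_huffman_expected_length(p, D=3))   # 1.5 ternary symbols (Problem 5)
print(dary_huffman_expected_length(p, D=2))   # 2.25 bits, the binary optimum (distractor (c))
```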

6. Given a discrete memoryless source that emits a 6-ary random variable U taking values {u_1, u_2, u_3, u_4, u_5, u_6} with probabilities {0.25, 0.25, 0.125, 0.125, 0.125, 0.125}, respectively. Which of the following is not an optimal binary prefix-free code for this source? (Each prefix-free code is represented by a binary tree.)

Figure 1: 6(a)   Figure 2: 6(b)   Figure 3: 6(c)   Figure 4: 6(d)

Solution: It is easy to check that all four codes are prefix-free. An optimal code tree is one in which, at each step, only the least likely leaves/nodes are joined. In (a) we observe that at each depth only the least likely leaves/nodes are joined, so (a) is an optimal code. Similarly, in (c) and (d) only the least likely leaves/nodes are connected at each depth of the tree. In (b), however, the code tree joins a node of probability 0.5 with a node of probability 0.25 at depth 1, while another node of probability 0.25 is left to be connected at the root. Hence, the code in (b) is not optimal. Thus, the correct answer is (b).
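As a numerical cross-check (not in the original solution, and independent of the code trees, which are not reproduced here): because every probability of this source is a power of 1/2, an optimal binary prefix-free code attains the source entropy exactly, with the codeword lengths forced to be two codewords of length 2 and four of length 3. The sketch below, under that assumption, verifies the target value that the optimal codes in (a), (c) and (d) must achieve.

```python
from math import log2

p6 = [0.25, 0.25, 0.125, 0.125, 0.125, 0.125]

# Source entropy; for dyadic probabilities an optimal binary code meets it exactly.
H = -sum(p * log2(p) for p in p6)

# Codeword lengths of an optimal binary code for this source and its expected length.
lengths = [2, 2, 3, 3, 3, 3]
E = sum(p * w for p, w in zip(p6, lengths))

print(H, E)   # 2.5 2.5 -- the expected length every optimal tree must attain
```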

7. Given a discrete memoryless source that emits a 6-ary random variable U taking values {u_1, u_2, u_3, u_4, u_5, u_6} with probabilities {0.25, 0.25, 0.125, 0.125, 0.125, 0.125}, respectively. Which of the following is an optimal ternary prefix-free code for this source?

Figure 5: 7(a)   Figure 6: 7(b)   Figure 7: 7(c)

Solution: It is easy to check that all the above codes are prefix-free. Here D = 3 (ternary code) and K = 6, so the number of unused leaves is R_{D-1}[(K - D)(D - 2)] = R_2(3 \cdot 1) = 1. An optimal code should have its unused leaves only at the maximum depth of the D-ary tree. The code in (b) has depth 2 but a missing leaf at depth 1, so (b) cannot be an optimal code. Similarly, the code in (c) has depth 3 but also has a missing leaf at depth 1 from the root, so (c) cannot be an optimal code. The code in (a) has its missing leaf at the maximum depth. Hence, the code shown in (a) is optimal. Thus, the correct answer is (a).
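The unused-leaf count quoted in Problems 5 and 7 comes from a small remainder formula; the helper below is an illustrative sketch (my addition) that evaluates it for the two cases used here.

```python
def unused_leaves(K, D):
    """Number of unused leaves in an optimal D-ary code tree with K codewords:
    the remainder of (K - D)(D - 2) on division by (D - 1)."""
    return ((K - D) * (D - 2)) % (D - 1)

print(unused_leaves(5, 3))   # 0 -- Problem 5 (ternary code, 5 symbols)
print(unused_leaves(6, 3))   # 1 -- Problem 7 (ternary code, 6 symbols)
```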

8. Given a discrete memoryless source that emits a symbol U taking four different values u_1, u_2, u_3, u_4 with probabilities 0.2, 0.2, 0.3, 0.3, respectively. Which of the following is an optimal binary block-to-variable-length code for this source but not a Huffman code? The codewords of u_1, u_2, u_3, u_4 are, respectively:
a) 00, 01, 10, 11
b) 00, 11, 01, 10
c) 0, 10, 110, 111
d) 111, 110, 10, 0

Figure 8: 8(a)   Figure 9: 8(b)   Figure 10: 8(c)   Figure 11: 8(d)

Solution: By the path length lemma, the expected lengths of the codes are (a) = 2, (b) = 2, (c) = 2.4 and (d) = 2.1. Thus, the codes in (c) and (d) are not optimal. The code in (a) is a Huffman code, since its two least likely codewords (00 and 01) differ only in the last bit. The code in (b), although optimal, has its two least likely codewords (00 and 11) differing in two bits, so it cannot arise from the Huffman procedure. Hence, the code in (b) is an optimal, non-Huffman code. Thus, the correct answer is (b).
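The expected lengths quoted above can be recomputed directly from the codeword lengths; the short sketch below (my addition, not from the assignment) does exactly that.

```python
probs = [0.2, 0.2, 0.3, 0.3]          # P(u1), P(u2), P(u3), P(u4)
codes = {
    "a": ["00", "01", "10", "11"],
    "b": ["00", "11", "01", "10"],
    "c": ["0", "10", "110", "111"],
    "d": ["111", "110", "10", "0"],
}

for name, cw in codes.items():
    E = sum(p * len(w) for p, w in zip(probs, cw))   # expected codeword length
    print(name, E)   # a 2.0, b 2.0, c 2.4, d 2.1
```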

9. Consider a bag containing 100 balls of red and blue colors. Let X_i = 1 or 0 depending on whether the i-th ball is red or blue. The X_i are independent with Pr{X_i = 1} = 0.25. You are asked to find the set of all red balls, and any yes/no question is admissible. What is the minimum average number of questions required to identify the set of red balls?
a) 80
b) 81
c) 82

Solution: Since each ball is either red or blue, Pr{X_i = 0} = 0.75. The minimum average number of questions required to identify the set of red balls is given by the joint entropy

N = H(X_1, X_2, \ldots, X_{100}).

Since the X_i are independent, we have

N = \sum_{i=1}^{100} H(X_i),

where

H(X_i) = -\Pr\{X_i = 0\} \log_2 \Pr\{X_i = 0\} - \Pr\{X_i = 1\} \log_2 \Pr\{X_i = 1\} = \tfrac{1}{4}\log_2 4 + \tfrac{3}{4}\log_2 \tfrac{4}{3} \approx 0.8113.

Hence N \approx 100 \times 0.8113 = 81.13. Since no questioning strategy can use fewer questions than this on average, options (a) and (b) are ruled out. Hence, the correct answer is (c).
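The per-ball entropy and the resulting bound are a two-line computation; the sketch below (my addition) reproduces the numbers used in the solution.

```python
from math import log2

p_red = 0.25                       # Pr{X_i = 1}
H_per_ball = -(p_red * log2(p_red) + (1 - p_red) * log2(1 - p_red))

print(H_per_ball)        # about 0.8113 bits per ball
print(100 * H_per_ball)  # about 81.13 -- lower bound on the average number of questions
```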

10. Which of the following cannot be a valid Huffman code?
a) 01, 10
b) 0, 10, 11
c) 01, 10, 11, 00

Figure 12: 10(a)   Figure 13: 10(b)   Figure 14: 10(c)

Solution: The binary code trees are shown in Figures 12-14; the unused leaves of each tree are drawn as empty circles, while the used leaves are drawn as dotted circles. The tree of (a) has two unused leaves, so its codewords can be shortened (to 0 and 1, as indicated by the red arrow in the tree diagram). Hence the codewords of (a) cannot belong to a Huffman code, since the code is not optimal. The codes in (b) and (c) use all the leaves of their trees and are therefore valid Huffman codes. Hence, the correct answer is (a).
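A binary Huffman code uses every leaf of its code tree, which is equivalent to its Kraft sum being exactly 1. The sketch below (an illustration added here, not part of the original solutions) applies this necessary condition to the three candidate codes; note that it is only necessary, not sufficient — code (b) of Problem 8 also has Kraft sum 1 yet is not a Huffman code.

```python
codes = {"a": ["01", "10"], "b": ["0", "10", "11"], "c": ["01", "10", "11", "00"]}

for name, cw in codes.items():
    s = sum(2 ** -len(w) for w in cw)   # Kraft sum of the code
    print(name, s, "complete tree" if s == 1 else "has unused leaves")
# a 0.5 has unused leaves  -> cannot be a Huffman code
# b 1.0 complete tree, c 1.0 complete tree
```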
