UCSD ECE154C Handout #21 Prof. Young-Han Kim Thursday, April 28, 2016 Midterm Solutions (Prepared by TA Shouvik Ganguly)

UCSD ECE154C Handout #21 Prof. Young-Han Kim Thursday, April 28, 2016
Midterm Solutions (Prepared by TA Shouvik Ganguly)

There are 3 problems, each problem with multiple parts, each part worth 10 points. Your answer should be as clear and readable as possible. Please justify any claim that you make.

1. Binary codes (30 points). Consider a source that emits five symbols {A, B, C, D, E} with probabilities 0.3, 0.3, 0.2, 0.1, and 0.1, respectively.

(a) Construct a binary Huffman code for this source, taking one source symbol at a time. What is the average codeword length for this code?

(b) Repeat part (a) for a binary Shannon-Fano code taking one source symbol at a time.

(c) Construct a probability distribution p = (p_A, p_B, p_C, p_D, p_E) on {A, B, C, D, E} for which the code that you constructed in part (a) has an average length equal to the binary entropy H(p).

Solution:

(a) One possible way to do the Huffman coding is shown below.

[Huffman coding tree: D (0.1) and E (0.1) merge into a node of weight 0.2; that node merges with C (0.2) into 0.4; A (0.3) and B (0.3) merge into 0.6; finally 0.4 and 0.6 merge into the root 1.0.]

This gives the encoding A → 00, B → 01, C → 10, D → 110, E → 111 (up to relabeling of the tree branches). The average codeword length is given by

L = 2 × (0.3 + 0.3 + 0.2) + 3 × (0.1 + 0.1) = 2.2 bits per symbol.

Alternatively, the average length can be computed by adding the probabilities at the internal nodes, so we have

L = 0.2 + 0.4 + 0.6 + 1 = 2.2 bits per symbol.
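The construction above can be checked with a short program. The sketch below is not part of the original handout; the helper name huffman_code and the use of Python's heapq are my own choices. Depending on how ties are broken it may print different codewords than the tree above, but the average length is still 2.2 bits per symbol.

import heapq

def huffman_code(weights):
    # Build a binary Huffman code for {symbol: weight}; weights need not sum to 1.
    # Heap entries are (weight, tie-breaker, {symbol: partial codeword}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(weights.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w0, _, c0 = heapq.heappop(heap)  # smallest remaining weight -> branch "0"
        w1, _, c1 = heapq.heappop(heap)  # next smallest weight      -> branch "1"
        merged = {s: "0" + cw for s, cw in c0.items()}
        merged.update({s: "1" + cw for s, cw in c1.items()})
        heapq.heappush(heap, (w0 + w1, count, merged))
        count += 1
    return heap[0][2]

p = {"A": 0.3, "B": 0.3, "C": 0.2, "D": 0.1, "E": 0.1}
code = huffman_code(p)
print(code)
print(sum(p[s] * len(cw) for s, cw in code.items()))  # 2.2 bits per symbol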

(b) The procedure for constructing a binary Shannon-Fano code is shown below.

[Shannon-Fano splits: {A, B, C, D, E} (1.0) is split into {A, B} (0.6) and {C, D, E} (0.4); {A, B} is split into A (0.3) and B (0.3); {C, D, E} is split into C (0.2) and {D, E} (0.2); finally {D, E} is split into D (0.1) and E (0.1).]

This gives the encoding A → 00, B → 01, C → 10, D → 110, E → 111, which is the same as the Huffman code. Thus, the average length is the same. Similar to part (a), the average length can also be computed by adding the probabilities at the internal nodes, giving

L = 1 + 0.6 + 0.4 + 0.2 = 2.2 bits per symbol.
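The Shannon-Fano construction can be checked the same way. The sketch below is likewise not part of the handout and assumes the common top-down rule of splitting at the point that balances the two groups' total probabilities as closely as possible; for this source it reproduces the codeword lengths found above.

def shannon_fano(symbols):
    # symbols: list of (symbol, probability) pairs sorted by decreasing probability.
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(prob for _, prob in symbols)
    # Pick the split point that makes the two groups' probabilities as equal as possible.
    best_k, best_gap, running = 1, float("inf"), 0.0
    for k in range(1, len(symbols)):
        running += symbols[k - 1][1]
        gap = abs(total - 2 * running)
        if gap < best_gap:
            best_k, best_gap = k, gap
    code = {s: "0" + cw for s, cw in shannon_fano(symbols[:best_k]).items()}
    code.update({s: "1" + cw for s, cw in shannon_fano(symbols[best_k:]).items()})
    return code

src = [("A", 0.3), ("B", 0.3), ("C", 0.2), ("D", 0.1), ("E", 0.1)]
code = shannon_fano(src)
print(code)
print(sum(prob * len(code[s]) for s, prob in src))  # 2.2 bits per symbol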

(c) We see that for the Huffman code we constructed, the codeword lengths satisfy ∑_i 2^(−l_i) = 1. Hence, if we have a probability distribution with p_i = 2^(−l_i), our code will have average codeword length given by

L = ∑_i p_i l_i = −∑_i p_i log₂ p_i (since p_i = 2^(−l_i)) = H(p),

i.e., the entropy of the distribution. This shows that the required probabilities are p_A = p_B = p_C = 1/4 and p_D = p_E = 1/8. This gives an average codeword length of L = 2.25 bits per symbol, which is the same as H(p_A, p_B, p_C, p_D, p_E).

2. One-bit quantizer (30 points). Let X be drawn according to the pdf

f_X(x) = 2x for 0 ≤ x ≤ 1, and f_X(x) = 0 otherwise.

[Figure: plot of the pdf f_X(x), rising linearly from 0 at x = 0 to 2 at x = 1.]

(a) Given the quantization regions R_1 = {x : 0 ≤ x < b} and R_2 = {x : b ≤ x ≤ 1}, find the quantization points a_1 ∈ R_1 and a_2 ∈ R_2 that minimize the mean squared error (MSE) in terms of b.

(b) Given the quantization points a_1 < a_2, find the quantization regions that minimize the MSE in terms of a_1 and a_2.

(c) Using parts (a) and (b), find the optimal quantizer by specifying a_1, a_2, and b that minimize the MSE.

Solution:

(a) Using the centroid condition, we have

a_1 = (∫_0^b 2x^2 dx) / (∫_0^b 2x dx) = ((2/3) b^3) / b^2 = (2/3) b

and

a_2 = (∫_b^1 2x^2 dx) / (∫_b^1 2x dx) = ((2/3)(1 − b^3)) / (1 − b^2) = 2(b^2 + b + 1) / (3(b + 1)).

(b) Using the Voronoi condition, the quantization regions are given by R_1 = [0, b) and R_2 = [b, 1], where

b = (a_1 + a_2) / 2.

(c) Plugging the expressions for a_1 and a_2 computed in part (a) into the expression in part (b), we have

2b = (2/3) b + 2(b^2 + b + 1) / (3(b + 1))
⟹ (4/3) b = 2(b^2 + b + 1) / (3(b + 1))
⟹ 2b = (b^2 + b + 1) / (b + 1)
⟹ 2b^2 + 2b = b^2 + b + 1
⟹ b^2 + b − 1 = 0
⟹ b = (√5 − 1) / 2.

Plugging this value into the expressions in part (a), we have

a_1 = (√5 − 1) / 3 and a_2 = 2(√5 − 1) / 3.
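As a numerical sanity check (again not from the handout), one can alternate the two optimality conditions in a Lloyd-Max style iteration; starting from an arbitrary boundary it converges to the closed-form values derived above. The function name centroids and the iteration count are my own choices.

from math import sqrt

def centroids(b):
    # Centroid condition for f(x) = 2x on [0, 1] with regions [0, b) and [b, 1].
    a1 = (2.0 / 3.0) * b
    a2 = 2.0 * (b * b + b + 1.0) / (3.0 * (b + 1.0))
    return a1, a2

b = 0.5  # arbitrary starting boundary
for _ in range(60):
    a1, a2 = centroids(b)  # centroid (conditional mean) step
    b = (a1 + a2) / 2.0    # nearest-neighbor (midpoint) step
a1, a2 = centroids(b)

print(b, (sqrt(5) - 1) / 2)       # both ~0.6180
print(a1, (sqrt(5) - 1) / 3)      # both ~0.4120
print(a2, 2 * (sqrt(5) - 1) / 3)  # both ~0.8241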

3. Source coding for emergency (20 points). Consider a source that emits six symbols {A, B, C, D, E, F} with probabilities 0.3, 0.2, 0.2, 0.1, 0.1, and 0.1, respectively. Here the symbol F only has probability 0.1, but it is meant for an emergency message, say, "Fire!", and hence we would like to spend few bits in encoding F.

(a) Design a binary instantaneous code that takes one source symbol at a time with the shortest possible average codeword length, under the constraint that F should be mapped to a codeword of length 1.

(b) (Difficult.) Now assume that each source symbol i has some associated cost per letter c_i of the codeword. The normal symbols A, B, C, D, E have a cost of c_i = 1 per codeword letter, while the emergency symbol F has a cost of c_F = 4 per code letter. Hence, if the codeword for symbol i has length l_i, then the cost spent for the symbol is c_i l_i. For example, a code that maps A, B, C, D, E, F to codewords of lengths 4, 5, 4, 3, 2, and 2 spends the costs 4, 5, 4, 3, 2, and 8, respectively, for the symbols A, B, C, D, E, and F. Design a binary instantaneous code that minimizes the average cost ∑_i p_i c_i l_i of sending a symbol.

Solution:

(a) Since F is mapped to a single code letter (say, the codeword 0) and the code has to be instantaneous, all other codewords must start with 1. In order to ensure that we have the shortest possible average codeword length, we can perform Huffman coding on the source symbols other than F, and finally add a 1 at the beginning of these other codewords. Note that the total probability of the symbols other than F is not 1; however, the Huffman coding algorithm does not require the numbers associated with the symbols to sum up to 1; any set of non-negative weights can be used, and the resulting encoding will then give the code with the shortest average codeword length, where the average is computed using the corresponding weights.

One way to perform the Huffman coding is as follows.

[Huffman coding tree on the weights A (0.3), B (0.2), C (0.2), D (0.1), E (0.1): D and E merge into 0.2; that node merges with C (0.2) into 0.4; A (0.3) and B (0.2) merge into 0.5; finally 0.4 and 0.5 merge into 0.9.]

This, together with the preceding discussion, gives the encoding A → 100, B → 101, C → 110, D → 1110, E → 1111, F → 0 (up to relabeling of the tree branches). The average codeword length is given by

L = 3 × (0.3 + 0.2 + 0.2) + 4 × (0.1 + 0.1) + 1 × 0.1 = 3.0 bits per symbol.

Alternatively, the average codeword length can also be computed as

L = 1 + (sum of probabilities at internal nodes of the Huffman coding tree) = 1 + 0.2 + 0.4 + 0.5 + 0.9 = 3.0 bits per symbol.

(b) We are required to minimize ∑_i p_i c_i l_i = ∑_i q_i l_i, where q_i = p_i c_i. As a consequence of the argument provided in part (a), in order to minimize the average cost, we can use Huffman coding with the probability of a symbol replaced by the corresponding weight q_i.

The procedure is shown below.

[Huffman coding tree on the weights A (0.3), B (0.2), C (0.2), D (0.1), E (0.1), F (0.4): D and E merge into 0.2; that node merges with C (0.2) into 0.4; A (0.3) and B (0.2) merge into 0.5; F (0.4) and the 0.4 node merge into 0.8; finally 0.5 and 0.8 merge into the root 1.3.]

This gives the encoding A → 00, B → 01, C → 110, D → 1110, E → 1111, F → 10 (one consistent labeling of the tree branches). The average cost is given by

C = 2 × (0.3 + 0.2 + 0.4) + 3 × 0.2 + 4 × (0.1 + 0.1) = 3.2 per symbol.

Alternatively, the average cost can be computed as the sum of the weights at the internal nodes, and is thus given by

C = 0.2 + 0.4 + 0.5 + 0.8 + 1.3 = 3.2 per symbol.
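Both parts of this problem reduce to running Huffman coding on suitable weights, which the following sketch makes concrete. It is not part of the handout; it reuses the same toy huffman_code routine as the earlier sketch. Tie-breaking may yield codewords different from the trees above, but the average length for part (a) is still 3.0 bits per symbol and the average cost for part (b) is still 3.2.

import heapq

def huffman_code(weights):
    # Binary Huffman code for {symbol: weight}; weights need not sum to 1.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(weights.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w0, _, c0 = heapq.heappop(heap)
        w1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + cw for s, cw in c0.items()}
        merged.update({s: "1" + cw for s, cw in c1.items()})
        heapq.heappush(heap, (w0 + w1, count, merged))
        count += 1
    return heap[0][2]

p = {"A": 0.3, "B": 0.2, "C": 0.2, "D": 0.1, "E": 0.1, "F": 0.1}
c = {"A": 1, "B": 1, "C": 1, "D": 1, "E": 1, "F": 4}

# Part (a): map F to "0", Huffman-code the remaining weights, then prefix those codewords with "1".
rest = huffman_code({s: w for s, w in p.items() if s != "F"})
code_a = {"F": "0", **{s: "1" + cw for s, cw in rest.items()}}
print(code_a, sum(p[s] * len(cw) for s, cw in code_a.items()))         # 3.0 bits per symbol

# Part (b): ordinary Huffman coding on the weights q_i = p_i * c_i.
code_b = huffman_code({s: p[s] * c[s] for s in p})
print(code_b, sum(p[s] * c[s] * len(cw) for s, cw in code_b.items()))  # average cost 3.2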
