Introduction to Source Coding
Comm. 502: Communication Theory, Lecture 7: Introduction to Source Coding. Topics: requirements of source codes; the Huffman code.
Source code properties:
- Length: a source code may be fixed-length or variable-length.
- Uniquely decodable: the user can invert the mapping to recover the original symbol sequence.
- Prefix (prefix-free): no codeword is the beginning of any other codeword. A prefix-free encoding is useful because it is self-defining: for any string of code symbols, there cannot be two ways to decode it.
Examples: codes that are not uniquely decodable.
Code 1: symbols A and C are assigned the same binary sequence, so the mapping cannot be inverted. The first requirement of a useful code is therefore that each symbol be assigned a distinct binary sequence.
Code 2: even with distinct codewords, a code can fail to be uniquely decodable because a received bit string may parse in more than one way. For example, a code with A = 0, B = 1, C = 00, D = 11 has distinct codewords, yet the string 00 decodes as either AA or C.
Prefix Coding. A prefix code is defined as a code in which no codeword is the beginning of another codeword. A prefix code is uniquely decodable, but the converse is not true.

Source symbol | Code A | Code B | Code C
s0            | 0      | 0      | 0
s1            | 1      | 10     | 01
s2            | 00     | 110    | 011
s3            | 11     | 111    | 0111

Code A is neither uniquely decodable nor a prefix code. Code B is a prefix code, hence uniquely decodable. Code C is not a prefix code but is still uniquely decodable, since the bit 0 indicates the beginning of each codeword.
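The prefix condition can be checked mechanically from the definition. A minimal sketch follows; the two codeword sets are illustrative examples (a prefix-free set, and a set in which every codeword begins with 0, like Code C above):

```python
def is_prefix_free(codewords):
    """Return True if no codeword is the beginning of another codeword."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

code_b = ["0", "10", "110", "111"]   # a prefix-free set
code_c = ["0", "01", "011", "0111"]  # every codeword starts with 0

print(is_prefix_free(code_b))  # True
print(is_prefix_free(code_c))  # False: "0" is the beginning of "01"
```

Note that `is_prefix_free` tests only the prefix condition; a code that fails it may still be uniquely decodable, as Code C shows.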
Example: prefix vs. uniquely decodable codes. Five symbols with probabilities P[A] = 1/2, P[B] = 1/4, P[C] = 1/8, P[D] = 1/16, P[E] = 1/16 are encoded with four candidate codes (Codes 1-4, tabulated on the slide). Codes 1, 2 and 3 each have average length 30/16 bits; Code 4 has average length 33/16 bits.
HW: determine the prefix codes and the uniquely decodable codes among the codes above. What about Code 4?
- Uniquely decodable: Codes 1, 2, 3 (no confusion in decoding).
- Instantaneous (prefix) decodable: Codes 1, 3 (no need to look ahead while decoding).
Prefix codes are uniquely decodable, but the converse is not true.
Example: decoding of a prefix code. The decision tree for Code B starts at an initial state and branches on each received bit; reaching a leaf (s0, s1, s2 or s3) emits that symbol and returns the decoder to the initial state.
Example: decode 1011111000. Answer: s1 s3 s2 s0 s0.
A prefix-free encoding is useful because it is self-defining: for any string of symbols, there cannot be two ways to decode it.
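The decision-tree decoder can be sketched with a lookup table standing in for the tree: the buffer plays the role of the current tree state, and emptying it returns the decoder to the initial state. The Code B codewords (s0 = 0, s1 = 10, s2 = 110, s3 = 111) are assumed from the table above:

```python
def decode_prefix(bits, codebook):
    """Decode a bit string with a prefix code, restarting at the
    initial state (an empty buffer) after each completed codeword."""
    inverse = {cw: sym for sym, cw in codebook.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:           # a codeword is complete: emit symbol
            out.append(inverse[buf])
            buf = ""                 # back to the initial state
    if buf:
        raise ValueError("leftover bits: " + buf)
    return out

code_b = {"s0": "0", "s1": "10", "s2": "110", "s3": "111"}
print(decode_prefix("1011111000", code_b))  # ['s1', 's3', 's2', 's0', 's0']
```

Because the code is prefix-free, the first match in the buffer is always the right one; no look-ahead is needed.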
Example: using a prefix decoding tree over the alphabet {a, b, c, d, r}, decode the bit string for the word abracadabra. (The decoding tree and the symbol/codeword table are shown on the slide.)
Kraft-McMillan Inequality. A prefix code must satisfy the Kraft-McMillan inequality: its codeword lengths v_k obey
  sum_k 2^(-v_k) <= 1.
If a code satisfies this inequality, it does not mean that the code is a prefix code.
Code D assigns symbols s0, s1, s2, s3 codewords of lengths v_k = 1, 2, 3, 2, so
  sum_k 2^(-v_k) = 1/2 + 1/4 + 1/8 + 1/4 = 9/8 > 1.
This means that Code D IS NOT A PREFIX CODE. (Proof in the book; not required.)
Use of the Kraft-McMillan inequality. We may use it when the number of symbols is so large that we cannot judge by inspection whether a given code is a prefix code.
What the Kraft-McMillan inequality can do: it can determine that a given code IS NOT a prefix code (when the sum exceeds 1).
What the Kraft-McMillan inequality cannot do: it cannot guarantee that a given code is indeed a prefix code.
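For a large symbol set this test is easy to automate. A minimal sketch, using exact rational arithmetic to avoid rounding; the lengths (1, 2, 3, 2) are those assumed for Code D above, reproducing its sum of 9/8:

```python
from fractions import Fraction

def kraft_sum(lengths):
    """Exact value of sum_k 2^(-v_k) for the given codeword lengths v_k."""
    return sum(Fraction(1, 2 ** v) for v in lengths)

s = kraft_sum([1, 2, 3, 2])   # Code D's codeword lengths
print(s)                      # 9/8
print(s <= 1)                 # False: Code D cannot be a prefix code
```

A sum of at most 1 only tells us that *some* prefix code with those lengths exists; it says nothing about the particular codewords, which is exactly the limitation stated above.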
Example: Code E assigns symbols s0, s1, s2, s3 codewords of lengths v_k = 1, 3, 3, 2, so sum_k 2^(-v_k) = 1/2 + 1/8 + 1/8 + 1/4 = 1 <= 1. Is Code E a prefix code? NO. Why? The codeword of s3 is a beginning of the codeword of s2.
Shannon's First Theorem. For a prefix code,
  H(S) <= L_bar < H(S) + 1,
with L_bar = H(S) if p_k = 2^(-v_k) for every k.
What is the efficiency? eta = H(S)/L_bar: eta = 1 if p_k = 2^(-v_k) for every k, and eta < 1 if p_k differs from 2^(-v_k) for some k.
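The eta = 1 case can be checked numerically. A short sketch using the dyadic probabilities of the earlier five-symbol example (1/2, 1/4, 1/8, 1/16, 1/16) with the matching lengths v_k = -log2(p_k), which are assumed here:

```python
from math import log2

def entropy(probs):
    """H(S) = sum_k p_k * log2(1/p_k), in bits per symbol."""
    return sum(p * log2(1 / p) for p in probs)

def avg_length(probs, lengths):
    """L_bar = sum_k p_k * v_k, in bits per symbol."""
    return sum(p * v for p, v in zip(probs, lengths))

p = [1/2, 1/4, 1/8, 1/16, 1/16]   # dyadic source: p_k = 2^(-v_k)
v = [1, 2, 3, 4, 4]               # assumed matching lengths
H, L = entropy(p), avg_length(p, v)
print(H, L, H / L)                # 1.875 1.875 1.0 -> eta = 1
```

Note that 1.875 bits = 30/16 bits, matching the average length of Codes 1-3 in the earlier example: those codes already operate at the entropy limit.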
Huffman Coding. The Huffman code is a prefix, variable-length code that achieves the shortest average code length for a given input alphabet. Huffman coding constructs a binary tree starting from the probabilities of the symbols in the alphabet. The tree is built in a bottom-up manner and is then used to find the codeword for each symbol. An algorithm for finding the Huffman code for a given alphabet with associated probabilities is given below.
Huffman Encoding Algorithm.
1. List the source symbols in order of decreasing probability. Assign a 0 and a 1 to the two source symbols of lowest probability. This part of the step is referred to as a splitting stage.
2. Regard these two source symbols as combined into a new source symbol with probability equal to the sum of the two original probabilities (the list of source symbols, and therefore of source statistics, is thereby reduced in size by one). Place the probability of the new symbol in the list in accordance with its value.
3. Repeat the steps until only two source statistics remain, to which a 0 and a 1 are assigned.
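The three steps above can be sketched with a priority queue: each heap entry carries a partial codebook, and every merge of the two least-probable entries prepends a 0 to one side and a 1 to the other. The probabilities are those of Example 2 below:

```python
import heapq
from itertools import count

def huffman(probs):
    """Build a Huffman code by repeatedly merging the two least-probable
    entries (steps 1-3 of the algorithm above)."""
    tiebreak = count()   # breaks probability ties; never compares the dicts
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)     # lowest probability -> bit 0
        p1, _, c1 = heapq.heappop(heap)     # next lowest        -> bit 1
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

code = huffman({"s0": 0.1, "s1": 0.2, "s2": 0.4, "s3": 0.2, "s4": 0.1})
print(sorted(len(w) for w in code.values()))  # [2, 2, 2, 3, 3]
```

Ties can be broken in different ways, so the exact codewords vary from run to run and from textbook to textbook, but the average length is the same for every optimal assignment (2.2 bits here).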
Huffman encoding: example. Use the probabilities to order the coding priorities of the letters: low-probability letters get codewords first, and hence more bits. This smooths out the information per bit. The alphabet is the seven letters X1 through X7 with their associated probabilities (tabulated on the slide).
Huffman encoding. Use a code tree to construct the code:
- Combine the two symbols with lowest probability into a new block symbol.
- Assign a 0 to one of the old symbols' codewords and a 1 to the other.
- Reorder, then combine the two lowest-probability symbols of the new set.
The earlier a symbol is absorbed into a block (because it keeps having the lowest probability), the longer its codeword becomes. (The slide shows the reduction stages of the tree for x1 through x7.)
Huffman encoding: result.
Entropy: H(X) = 2.11 bits (the best possible average number of bits per symbol).
Average codeword length: L_bar = sum_k v_k P(x_k) = 2.21 bits, where v_k is the number of bits assigned to symbol x_k.
Compression ratio: CR = 3/2.21 = 1.357 (a fixed-length code for 7 letters needs 3 bits per symbol).
Note: the average codeword length approaches the entropy (the fundamental limit) and satisfies H(X) <= L_bar < H(X) + 1, so the efficiency is eta = H(X)/L_bar = 95.5%.
Huffman coding, example 2: compute the Huffman code for the source shown.
Source symbols and probabilities: s0: 0.1, s1: 0.2, s2: 0.4, s3: 0.2, s4: 0.1.
H(S) = 0.4 log2(1/0.4) + 2 x 0.2 log2(1/0.2) + 2 x 0.1 log2(1/0.1) = 2.12 bits/symbol.
Solution A (each combined symbol is placed as high as possible in the list):
Order the symbols by decreasing probability: s2 (0.4), s1 (0.2), s3 (0.2), s0 (0.1), s4 (0.1).
Stage I: 0.4, 0.2, 0.2, 0.1, 0.1.
Stage II: combine s0 and s4 (0.1 + 0.1 = 0.2): 0.4, 0.2, 0.2, 0.2.
Stage III: combine the two lowest entries (0.2 + 0.2 = 0.4): 0.4, 0.4, 0.2.
Stage IV: combine the two lowest entries (0.4 + 0.2 = 0.6): 0.6, 0.4.
Assign a 0 and a 1 at each splitting stage and trace back through the merges to read off the codewords.
Solution A, cont'd. The resulting codewords:
s0: p = 0.1, codeword 010
s1: p = 0.2, codeword 10
s2: p = 0.4, codeword 00
s3: p = 0.2, codeword 11
s4: p = 0.1, codeword 011
L_bar = 0.4(2) + 0.2(2) + 0.2(2) + 0.1(3) + 0.1(3) = 2.2 bits, and H(S) = 2.12 bits, so H(S) <= L_bar < H(S) + 1 holds. CR = 3/2.2 = 1.364. THIS IS NOT THE ONLY SOLUTION!
Alternate Solution B (each combined symbol is placed as low as possible in the list):
Stage I: 0.4, 0.2, 0.2, 0.1, 0.1.
Stage II: combine s0 and s4 (0.1 + 0.1 = 0.2): 0.4, 0.2, 0.2, 0.2.
Stage III: combine the two lowest entries (0.2 + 0.2 = 0.4): 0.4, 0.4, 0.2.
Stage IV: combine the two lowest entries (0.4 + 0.2 = 0.6): 0.6, 0.4.
Because the combined symbols sink to the bottom of the list, they keep being merged, and the codeword lengths spread out.
Alternative Solution B, cont'd. The resulting codewords:
s0: p = 0.1, codeword 0000
s1: p = 0.2, codeword 01
s2: p = 0.4, codeword 1
s3: p = 0.2, codeword 001
s4: p = 0.1, codeword 0001
L_bar = 0.4(1) + 0.2(2) + 0.2(3) + 0.1(4) + 0.1(4) = 2.2 bits, the same as Solution A; H(S) = 2.12 bits, so H(S) <= L_bar < H(S) + 1 again holds. CR = 3/2.2 = 1.364.
What is the difference between the two solutions? They have the same average code length; they differ in the variance of the code length,
  sigma^2 = sum_k p_k (v_k - L_bar)^2.
Solution A: sigma^2 = 0.16. Solution B: sigma^2 = 1.36. Both have the same compression ratio, CR = 1.364.
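The variance formula can be checked directly. A short sketch, assuming the codeword lengths read off the two solutions, (2, 2, 2, 3, 3) for A and (1, 2, 3, 4, 4) for B, both consistent with L_bar = 2.2:

```python
def code_stats(probs, lengths):
    """Average codeword length L_bar and variance sum_k p_k (v_k - L_bar)^2."""
    L = sum(p * v for p, v in zip(probs, lengths))
    var = sum(p * (v - L) ** 2 for p, v in zip(probs, lengths))
    return round(L, 2), round(var, 2)

p = [0.4, 0.2, 0.2, 0.1, 0.1]
print(code_stats(p, [2, 2, 2, 3, 3]))  # Solution A: (2.2, 0.16)
print(code_stats(p, [1, 2, 3, 4, 4]))  # Solution B: (2.2, 1.36)
```

A low variance means a near-constant output bit rate, which is why the low-variance code is usually preferred in practice.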
Exercise: for the source with probabilities A: 0.1, B: 0.2, C: 0.4, D: 0.2, E: 0.1, compute the entropy H, build the Huffman tree, compute the average code length, and compute the compression ratio CR. Then encode the message BCCADE.
Solution.
Entropy: H = 2.12 bits.
Huffman tree: as in Example 2 (the probabilities are the same).
Average code length: L_bar = 2.2 bits.
CR = 3/2.2 = 1.364.
The message BCCADE is encoded by concatenating the codewords of B, C, C, A, D, E read off the tree.
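The final encoding step can be sketched as follows. The codeword table is one Huffman code consistent with the exercise (assumed here, since the tree is not unique): C, the most probable symbol, gets a 2-bit codeword, and A and E get 3-bit codewords, for an average length of 2.2 bits as computed above:

```python
# Assumed Huffman code for probabilities A:0.1 B:0.2 C:0.4 D:0.2 E:0.1
code = {"C": "00", "B": "10", "D": "11", "A": "010", "E": "011"}

bits = "".join(code[ch] for ch in "BCCADE")
print(bits)       # 10000001011011
print(len(bits))  # 14 bits, vs. 18 bits for a fixed 3-bit code
```

Any other valid Huffman tree for this source yields a different bit string of the same expected length, and decoding it with the matching tree recovers BCCADE.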
Properties of Huffman Codes. The Huffman coding technique is optimal because:
- It satisfies the prefix condition (so decoding is unambiguous).
- Its average code length is small, approaching the minimum (the entropy).
In Huffman encoding, symbols that occur more frequently have shorter Huffman codes (but we must know the probability of each symbol for this to be true). The Huffman encoding process (the Huffman tree) is not unique; among the alternatives, the code with the lowest code-length variance is the better one.
More informationEELE 6333: Wireless Commuications
EELE 6333: Wireless Commuications Chapter # 4 : Capacity of Wireless Channels Spring, 2012/2013 EELE 6333: Wireless Commuications - Ch.4 Dr. Musbah Shaat 1 / 18 Outline 1 Capacity in AWGN 2 Capacity of
More informationLanguage of Instruction Course Level Short Cycle ( ) First Cycle (x) Second Cycle ( ) Third Cycle ( ) Term Local Credit ECTS Credit Fall 3 5
Course Details Course Name Telecommunications II Language of Instruction English Course Level Short Cycle ( ) First Cycle (x) Second Cycle ( ) Third Cycle ( ) Course Type Course Code Compulsory (x) Elective
More informationDVA325 Formal Languages, Automata and Models of Computation (FABER)
DVA325 Formal Languages, Automata and Models of Computation (FABER) Lecture 1 - Introduction School of Innovation, Design and Engineering Mälardalen University 11 November 2014 Abu Naser Masud FABER November
More informationTHE use of balanced codes is crucial for some information
A Construction for Balancing Non-Binary Sequences Based on Gray Code Prefixes Elie N. Mambou and Theo G. Swart, Senior Member, IEEE arxiv:70.008v [cs.it] Jun 07 Abstract We introduce a new construction
More informationPD-SETS FOR CODES RELATED TO FLAG-TRANSITIVE SYMMETRIC DESIGNS. Communicated by Behruz Tayfeh Rezaie. 1. Introduction
Transactions on Combinatorics ISSN (print): 2251-8657, ISSN (on-line): 2251-8665 Vol. 7 No. 1 (2018), pp. 37-50. c 2018 University of Isfahan www.combinatorics.ir www.ui.ac.ir PD-SETS FOR CODES RELATED
More informationEECS 473 Advanced Embedded Systems. Lecture 13 Start on Wireless
EECS 473 Advanced Embedded Systems Lecture 13 Start on Wireless Team status updates Losing track of who went last. Cyberspeaker VisibleLight Elevate Checkout SmartHaus Upcoming Last lecture this Thursday
More informationAlgorithms and Data Structures: Network Flows. 24th & 28th Oct, 2014
Algorithms and Data Structures: Network Flows 24th & 28th Oct, 2014 ADS: lects & 11 slide 1 24th & 28th Oct, 2014 Definition 1 A flow network consists of A directed graph G = (V, E). Flow Networks A capacity
More informationPROBABILITY AND STATISTICS Vol. II - Information Theory and Communication - Tibor Nemetz INFORMATION THEORY AND COMMUNICATION
INFORMATION THEORY AND COMMUNICATION Tibor Nemetz Rényi Mathematical Institute, Hungarian Academy of Sciences, Budapest, Hungary Keywords: Shannon theory, alphabet, capacity, (transmission) channel, channel
More information