Digital Communication Systems ECS 452
1 Digital Communication Systems ECS 452 Asst. Prof. Dr. Prapun Suksompong 2. Source Coding 1 Office Hours: BKD, 6th floor of Sirindhralai building Monday 10:00-10:40 Tuesday 12:00-12:40 Thursday 14:20-15:30
2 Elements of digital commu. sys. Message Transmitter Information Source Recovered Message Destination Source Encoder Remove redundancy Source Decoder Channel Encoder Add systematic redundancy Receiver Channel Decoder Digital Modulator Digital Demodulator Channel Transmitted Signal Received Signal Noise & Interference 2
3 System Under Consideration Message Transmitter Information Source Recovered Message Destination Source Encoder Remove redundancy Source Decoder Channel Encoder Add systematic redundancy Receiver Channel Decoder Digital Modulator Digital Demodulator Channel Transmitted Signal Received Signal Noise & Interference 3
4 Main Reference Elements of Information Theory, 2nd Edition (2006), Chapters 2, 4 and 5. Its author, Thomas M. Cover, has been called "the jewel in Stanford's crown" and "one of the greatest information theorists since Claude Shannon (and the one most like Shannon in approach, clarity, and taste)." 4
5 5 English Alphabet (Non-Technical Use)
6 US UK The ASCII Coded Character Set (American Standard Code for Information Interchange) [The ARRL Handbook for Radio Communications 2013]
7 Example: ASCII Encoder Character x Codeword c(x) E L O V MATLAB: >> M = 'LOVE'; >> X = dec2bin(M,7); >> X = reshape(X',1,numel(X)) X = Remark: numel(A) = prod(size(A)) (the number of elements in matrix A) 7 Information Source LOVE Source Encoder
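The same encoding step can be sketched in Python using only built-ins (no toolbox needed); the 7-bit codewords produced are the standard ASCII codes:

```python
# Encode each character as its 7-bit ASCII codeword and concatenate,
# mirroring dec2bin(M,7) followed by reshape in the MATLAB snippet.
message = 'LOVE'
encoded = ''.join(format(ord(ch), '07b') for ch in message)
print(encoded)  # c('L') c('O') c('V') c('E') = 1001100 1001111 1010110 1000101
```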
8 English Redundancy: Ex. 1 J-st tr- t- r--d th-s s-nt-nc-. 8
9 English Redundancy: Ex. 2 yxx cxn xndxrstxnd whxt x xm wrxtxng xvxn xf x rxplxcx xll thx vxwxls wxth xn 'x' (t gts lttl hrdr f y dn't vn kn whr th vwls r). 9
10 English Redundancy: Ex. 3 To be, or xxx xx xx, xxxx xx xxx xxxxxxxx 10
11 11 Entropy Rate of Thai Text
12 Introduction to Data Compression 12
13 Introduction to Data Compression 13
14 ASCII: Source Alphabet of Size = 128 (American Standard Code for Information Interchange) [The ARRL Handbook for Radio Communications 2013]
15 15 Ex. Source alphabet of size = 4
16 Ex. DMS (1) X ∈ {a, b, c, d, e}; p_X(x) = 1/5 for x ∈ {a, b, c, d, e}, and 0 otherwise. Information Source a c a c e c d b c e d a e e d a b b b d b b a a b e b e d c c e d b c e c a a c a a e a c c a a d c d e e a a c a a a b b c a e b b e d b c d e b c a e e d d c d a b c a b c d d e d c e a b a a c a d 16 Approximately 20% are the letter a [GenRV_Discrete_datasample_Ex1.m]
17 Ex. DMS (1) clear all; close all; S_X = 'abcde'; p_X = [1/5 1/5 1/5 1/5 1/5]; n = 100; MessageSequence = datasample(S_X,n,'weights',p_X) MessageSequence = reshape(MessageSequence,10,10) >> GenRV_Discrete_datasample_Ex1 MessageSequence = eebbedddeceacdbcbedeecacaecedcaedabecccabbcccebdbbbeccbadeaaaecceccdaccedadabceddaceadacdaededcdcade MessageSequence = eeeabbacde eacebeeead bcadcccdce bdcacccaed ebabcbedac dceeeacadd dbccbdcbac deecdedcca eddcbaaedd cecabacdae 17 [GenRV_Discrete_datasample_Ex1.m]
18 Ex. DMS (2) X ∈ {1, 2, 3, 4}. Information Source p_X(x) = 1/2 for x = 1; 1/4 for x = 2; 1/8 for x ∈ {3, 4}; and 0 otherwise. 18 Approximately 50% are the number 1 [GenRV_Discrete_datasample_Ex2.m]
19 Ex. DMS (2) clear all; close all; S_X = [1 2 3 4]; p_X = [1/2 1/4 1/8 1/8]; n = 20; MessageSequence = randsrc(1,n,[S_X;p_X]); %MessageSequence = datasample(S_X,n,'weights',p_X); rf = hist(MessageSequence,S_X)/n; % Rel. freq. calc. stem(S_X,rf,'rx','linewidth',2) % Plot rel. freq. hold on stem(S_X,p_X,'bo','linewidth',2) % Plot pmf xlim([min(S_X)-1,max(S_X)+1]) legend('Rel. freq. from sim.','pmf p_X(x)') xlabel('x') grid on 19 [GenRV_Discrete_datasample_Ex2.m]
20 DMS in MATLAB clear all; close all; S_X = [1 2 3 4]; p_X = [1/2 1/4 1/8 1/8]; n = 1e6; SourceString = randsrc(1,n,[S_X;p_X]); Alternatively, we can also use SourceString = datasample(S_X,n,'weights',p_X); 20 rf = hist(SourceString,S_X)/n; % Rel. freq. calc. stem(S_X,rf,'rx','linewidth',2) % Plot rel. freq. hold on stem(S_X,p_X,'bo','linewidth',2) % Plot pmf xlim([min(S_X)-1,max(S_X)+1]) legend('Rel. freq. from sim.','pmf p_X(x)') xlabel('x') grid on [GenRV_Discrete_datasample_Ex.m]
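A Python sketch of the same DMS simulation, using the pmf from this example (P(X=1)=1/2, P(X=2)=1/4, P(X=3)=P(X=4)=1/8); the seed value is arbitrary:

```python
import random
from collections import Counter

# Generate n i.i.d. symbols from the given pmf and compare the
# relative frequencies against the pmf itself.
random.seed(452)
S_X = [1, 2, 3, 4]
p_X = [1/2, 1/4, 1/8, 1/8]
n = 10**5

source_string = random.choices(S_X, weights=p_X, k=n)
counts = Counter(source_string)
rel_freq = {x: counts[x] / n for x in S_X}

# By the law of large numbers, relative frequencies approach the pmf.
for x, p in zip(S_X, p_X):
    print(x, round(rel_freq[x], 3), p)
```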
21 A more realistic example of pmf: Relative freq. of letters in the English language 21
22 A more realistic example of pmf: Relative freq. of letters in the English language ordered by frequency 22
23 Example: ASCII Encoder Codebook Character x Codeword c(x) E L O V MATLAB: >> M = 'LOVE'; >> X = dec2bin(M,7); >> X = reshape(X',1,numel(X)) X = Remark: numel(A) = prod(size(A)) (the number of elements in matrix A) 23 Information Source LOVE Source Encoder c( L ) c( O ) c( V ) c( E )
24 The ASCII Coded Character Set [The ARRL Handbook for Radio Communications 2013]
25 A Byte (8 bits) vs. 7 bits >> dec2bin('i Love ECS452',7) ans = >> dec2bin('i Love ECS452',8) ans =
26 Geeky ways to express your love >> dec2bin('i Love You',8) >> dec2bin('i love you',8) ans = ans = 26
27 27 Summary: Source Encoder Information Source source string LOVE Discrete Memoryless Source (DMS) The source alphabet is the collection of all possible source symbols. Each symbol that the source generates is assumed to be randomly selected from the source alphabet. Source Encoder An encoder is a function that maps each symbol in the source alphabet to a corresponding (binary) codeword. The list of such mappings is called the codebook. Source Symbol x Codeword c(x) E L O V w/o extension encoded string c( L ) c( O ) c( V ) c( E ) The codeword corresponding to a source symbol x is denoted by c(x), and its length by ℓ(x). Each codeword is constructed from a code alphabet. For a binary codeword, the code alphabet is {0, 1}.
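The codebook/encoder terminology above can be made concrete with a small sketch; the 7-bit codewords below are the standard ASCII codes for these four letters, and decoding works by chopping because this code is fixed-length:

```python
# Codebook: each source symbol maps to a binary codeword.
codebook = {'L': '1001100', 'O': '1001111', 'V': '1010110', 'E': '1000101'}

def encode(source_string):
    """Map each source symbol to its codeword and concatenate."""
    return ''.join(codebook[x] for x in source_string)

def decode(encoded_string, codeword_length=7):
    """For a fixed-length code, decoding is just chopping into blocks."""
    inverse = {c: x for x, c in codebook.items()}
    blocks = [encoded_string[i:i + codeword_length]
              for i in range(0, len(encoded_string), codeword_length)]
    return ''.join(inverse[b] for b in blocks)

print(decode(encode('LOVE')))  # recovers 'LOVE'
```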
28 Morse code (wired and wireless) Telegraph network Samuel Morse, 1838 A sequence of on-off tones (or, lights, or clicks) 28
29 Example 29
30 30 Example
31 Morse code: Key Idea Frequently-used characters are mapped to short codewords. 31 Relative frequencies of letters in the English language
32 Morse code: Key Idea Frequently-used characters (e,t) are mapped to short codewords. 32 Relative frequencies of letters in the English language
33 Morse code: Key Idea Frequently-used characters (e,t) are mapped to short codewords. Basic form of compression. 33
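The point above can be checked with a few standard International Morse codewords; the frequency ranking (e, t among the most common; q, j, z among the rarest) is the usual one for English text:

```python
# Standard International Morse codewords for a few letters.
morse = {'E': '.', 'T': '-', 'A': '.-', 'N': '-.',
         'Q': '--.-', 'J': '.---', 'Z': '--..'}

common = ['E', 'T', 'A', 'N']   # among the most frequent English letters
rare = ['Q', 'J', 'Z']          # among the least frequent

avg_common = sum(len(morse[c]) for c in common) / len(common)
avg_rare = sum(len(morse[c]) for c in rare) / len(rare)
print(avg_common, avg_rare)  # frequent letters get shorter codewords
```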
34 34 Thai Morse Code (รหัสมอร์สภาษาไทย)
35 Example: ASCII Encoder Character Codeword E L O V MATLAB: >> M = 'LOVE'; >> X = dec2bin(M,7); >> X = reshape(X',1,numel(X)) X = Information Source LOVE Source Encoder
36 Another Example of non-UD code Suppose we want to convey the sequence of outcomes from rolling a die. Each face x is encoded by its binary representation c(x): c(1) = 1, c(2) = 10, c(3) = 11, c(4) = 100, c(5) = 101, c(6) = 110.
37 Another Example of non-UD code With the die faces encoded by their binary representations, the encoded string 11 could be interpreted as "1 1" (c(1) twice) or as "3" (c(3) = 11). The encoded string 110 could be interpreted as "1 2" (c(1) then c(2)) or as "6" (c(6) = 110).
38 Another Example of non-UD code x c(x) A 1 B 011 C 01110 D 1110 E 10011
39 Another Example of non-UD code x c(x) A 1 B 011 C 01110 D 1110 E 10011 Consider the encoded string 011101110011. It can be interpreted as CDB (01110 + 1110 + 011) or as BABE (011 + 1 + 011 + 10011).
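This ambiguity can be verified directly. A sketch using c(A) = 1, c(B) = 011, c(C) = 01110, c(D) = 1110, c(E) = 10011; the C and E codewords are inferred from the claimed CDB/BABE collision, so treat them as an assumed completion:

```python
# Two different source strings that encode to the same bit string:
# a direct demonstration that this code is not uniquely decodable.
code = {'A': '1', 'B': '011', 'C': '01110', 'D': '1110', 'E': '10011'}

def encode(s):
    return ''.join(code[x] for x in s)

print(encode('CDB'))   # 01110 + 1110 + 011
print(encode('BABE'))  # 011 + 1 + 011 + 10011
assert encode('CDB') == encode('BABE')  # same encoded string: not UD
```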
40 Game: 20 Questions 20 Questions is a classic game that has been played since the 19th century. One person thinks of something (an object, a person, an animal, etc.) The others playing can ask 20 questions in an effort to guess what it is. 40
41 41 20 Questions: Example
42 Shannon-Fano coding Prof. Robert Fano Shannon Award (1976) Proposed in Shannon's "A Mathematical Theory of Communication" (1948). The method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949). The Transmission of Information. Technical Report No. 65. Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT. Should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon-Fano-Elias coding (also known as Elias coding), the precursor to arithmetic coding. 42
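The top-down Shannon-Fano recipe can be sketched as follows. This is a generic implementation, not the exact procedure from the 1949 report; tie-breaking between split points varies across presentations:

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, prob) sorted by descending probability.
    Split where the two halves' total probabilities are as close as
    possible, prepend 0/1, and recurse on each half (top-down)."""
    if len(symbols) == 1:
        return {symbols[0][0]: ''}
    total = sum(p for _, p in symbols)
    best, acc, best_diff = 1, 0.0, float('inf')
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(acc - (total - acc))
        if diff < best_diff:
            best_diff, best = diff, i
    code = {s: '0' + c for s, c in shannon_fano(symbols[:best]).items()}
    code.update({s: '1' + c for s, c in shannon_fano(symbols[best:]).items()})
    return code

code = shannon_fano([('a', 0.5), ('b', 0.25), ('c', 0.125), ('d', 0.125)])
print(code)  # dyadic pmf: codeword lengths match -log2(p)
```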
43 Huffman Code David Huffman Hamming Medal (1999) MIT, 1951: information theory class taught by Professor Fano. Huffman and his classmates were given the choice of a term paper, on the problem of finding the most efficient binary code, or a final exam. Huffman, unable to prove any codes were the most efficient, was about to give up and start studying for the final when he hit upon the idea of using a frequency-sorted binary tree and quickly proved this method the most efficient. Huffman avoided the major flaw of the suboptimal Shannon-Fano coding by building the tree from the bottom up instead of from the top down. 43
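The bottom-up recipe described above can be sketched in a few lines; this is a generic implementation of the technique (repeatedly merge the two least-likely entries), not Huffman's original presentation:

```python
import heapq
import itertools

def huffman_code(pmf):
    """Bottom-up Huffman construction: repeatedly merge the two
    least-likely (possibly already combined) symbols.
    pmf: dict symbol -> probability. Returns dict symbol -> codeword."""
    counter = itertools.count()  # tie-breaker so dicts are never compared
    heap = [(p, next(counter), {s: ''}) for s, p in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # least likely
        p1, _, c1 = heapq.heappop(heap)  # second least likely
        merged = {s: '0' + w for s, w in c0.items()}
        merged.update({s: '1' + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

code = huffman_code({'1': 1/2, '2': 1/4, '3': 1/8, '4': 1/8})
lengths = {s: len(w) for s, w in code.items()}
print(code, lengths)  # dyadic pmf: lengths 1, 2, 3, 3
```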
44 Huffman's paper (1952) 44 [D. A. Huffman, "A Method for the Construction of Minimum-Redundancy Codes," Proceedings of the IRE, vol. 40, no. 9, Sept. 1952]
45 Summary: All codes ⊃ Nonsingular codes ⊃ UD codes ⊃ Prefix-free codes ⊃ Huffman codes 45 A good code must be uniquely decodable (UD); UD is difficult to check directly. [Defn 2.18] [2.24] Consider a special family of codes: prefix(-free) codes. No codeword is a prefix of any other codeword. Always UD; same as being instantaneous: each source symbol can be decoded as soon as we come to the end of the codeword corresponding to it. [Defn 2.30] Huffman's recipe: repeatedly combine the two least-likely (combined) symbols. It automatically gives a prefix-free code. [Defn 2.36] [2.37] For a given source's pmf, Huffman codes are optimal among all UD codes for that source.
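The prefix-free condition is easy to check mechanically; a minimal sketch, tested on a Huffman code and on the earlier non-UD example:

```python
# No codeword may be a prefix of any other codeword.
def is_prefix_free(codewords):
    for c1 in codewords:
        for c2 in codewords:
            if c1 != c2 and c2.startswith(c1):
                return False
    return True

print(is_prefix_free(['0', '10', '110', '111']))                # a Huffman code
print(is_prefix_free(['1', '011', '01110', '1110', '10011']))   # not prefix-free
```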
46 Huffman coding 46 [ ]
47 Ex. Huffman Coding in MATLAB [Ex. 2.31] Observe that MATLAB automatically gives the expected length of the codewords. pX = [1/2 1/4 1/8 1/8]; % pmf of X SX = [1:length(pX)]; % Source Alphabet [dict,EL] = huffmandict(SX,pX); % Create codebook %% Pretty print the codebook. codebook = dict; for i = 1:length(codebook) codebook{i,2} = num2str(codebook{i,2}); end codebook %% Try to encode some random source string n = 10; % Number of source symbols to be generated sourceString = randsrc(1,n,[SX; pX]) % Create data using pX encodedString = huffmanenco(sourceString,dict) % Encode the data 47 [Huffman_Demo_Ex1]
48 Ex. Huffman Coding in MATLAB codebook = [1] '0' [2] '1 0' [3] '1 1 1' [4] '1 1 0' sourceString = encodedString = [Huffman_Demo_Ex1]
49 Ex. Huffman Coding in MATLAB [Ex. 2.32] pX = [ ]; % pmf of X SX = [1:length(pX)]; % Source Alphabet [dict,EL] = huffmandict(SX,pX); % Create codebook %% Pretty print the codebook. codebook = dict; for i = 1:length(codebook) codebook{i,2} = num2str(codebook{i,2}); end codebook EL The codewords can be different from our answers found earlier; the expected length is the same. 49 [Huffman_Demo_Ex2] >> Huffman_Demo_Ex2 codebook = [1] '1' [2] '0 1' [3] ' ' [4] '0 0 1' [5] ' ' [6] ' ' EL =
50 Ex. Huffman Coding in MATLAB pX = [1/8, 5/24, 7/24, 3/8]; % pmf of X SX = [1:length(pX)]; % Source Alphabet [dict,EL] = huffmandict(SX,pX); % Create codebook %% Pretty print the codebook. codebook = dict; for i = 1:length(codebook) codebook{i,2} = num2str(codebook{i,2}); end codebook EL [Exercise] 50 >> -pX*(log2(pX)).' ans = codebook = [1] '0 0 1' [2] '0 0 0' [3] '0 1' [4] '1' EL =
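The numbers in this example can be checked by hand. The codeword lengths (3, 3, 2, 1 for symbols 1 through 4) are read off the codebook in the output:

```python
from math import log2

# Expected codeword length from the codebook lengths vs. the entropy.
p = [1/8, 5/24, 7/24, 3/8]
lengths = [3, 3, 2, 1]

EL = sum(pi * li for pi, li in zip(p, lengths))  # expected length (bits)
H = -sum(pi * log2(pi) for pi in p)              # entropy (bits)
print(round(EL, 4), round(H, 4), round(100 * H / EL, 1))
```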
52 Entropy and Description of RV 57
53 Entropy and Description of RV 58
54 Summary: Optimality of Huffman Codes Consider a given DMS with known pmf. [Defn 2.36] A code is optimal if it is UD and its corresponding expected length is the shortest among all possible UD codes for that source. [2.37] Huffman codes are optimal. All codes ⊃ Nonsingular codes ⊃ UD codes ⊃ Prefix-free codes ⊃ Huffman codes. Bounds on expected lengths: H(X) ≤ expected length (per source symbol) of an optimal code = expected length (per source symbol) of a Huffman code < H(X) + 1. 59
55 Summary: Entropy 60 Entropy measures the amount of uncertainty (randomness) in a RV. Three formulas for calculating entropy: [Defn 2.41] Given the pmf p_X of a RV X: H(X) = -∑_x p_X(x) log2 p_X(x). [2.44] Given a probability vector p = (p_1, ..., p_n): H(p) = -∑_i p_i log2 p_i. [Defn 2.47] Given a number p ∈ [0, 1], the binary entropy function: h(p) = -p log2 p - (1-p) log2 (1-p). Convention: 0 log2 0 = 0. [2.56] Operational meaning: Entropy of a random variable is the average length of its shortest description.
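The three formulas above collapse into one sketch, with the convention 0 · log2(0) = 0:

```python
from math import log2

def entropy(p):
    """Entropy (in bits) of a probability vector p; 0 log2 0 = 0."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def binary_entropy(p):
    """h(p) = -p log2 p - (1-p) log2 (1-p)."""
    return entropy([p, 1 - p])

print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75
print(binary_entropy(0.5))            # 1.0
```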
56 Examples Example 2.31: H(X) = 1.75 bits; Huffman code expected length = 1.75 bits per symbol; Efficiency = 100%. Example 2.32: Efficiency ≈ 97%. 61
57 Examples Example 2.33: Efficiency ≈ 99%. Example 2.34: source alphabet {A, B, C, D}; Efficiency ≈ 93%. 62
58 Summary: Entropy Important Bounds 0 ≤ H(X) ≤ log2 |S_X|. The lower bound is achieved by a deterministic RV; the upper bound is achieved by a uniform RV. The entropy of a uniform (discrete) random variable on n outcomes: H(X) = log2 n. The entropy of a Bernoulli(p) random variable: H(X) = h(p), the binary entropy function. 63
[Ex. 2.40] Huffman Coding: Source Extension. Source: X_k i.i.d. Bernoulli(p). Plot: L(n), the expected codeword length per source symbol, versus n, the order of source extension. As n increases, L(n) decreases toward the entropy H(X), with H(X) ≤ L(n) < H(X) + 1/n. 67
Summary: Source Extension The encoder operates on blocks rather than on individual symbols. [Defn 2.39] n-th extension coding: 1 block = n successive source symbols. L(n) = expected (average) codeword length per source symbol when Huffman coding is used with the n-th extension. Plot: L(n) versus the order of source extension n; H(X) ≤ L(n) < H(X) + 1/n. 68
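Extension coding can be sketched end to end: build a Huffman code for blocks of n symbols of an i.i.d. Bernoulli(p) source and report the expected length per source symbol. The value p = 0.6 is an illustrative assumption, not a value fixed by the slides:

```python
import heapq
import itertools
from math import log2

p = 0.6  # illustrative Bernoulli parameter

def huffman_expected_length(probs):
    """Expected codeword length of a Huffman code for the given pmf.
    Uses the fact that E[length] equals the sum of the probabilities
    of all internal (merged) nodes of the Huffman tree."""
    counter = itertools.count()  # tie-breaker for equal probabilities
    heap = [(q, next(counter)) for q in probs]
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        q0, _ = heapq.heappop(heap)  # least likely
        q1, _ = heapq.heappop(heap)  # second least likely
        total += q0 + q1             # merged node's probability
        heapq.heappush(heap, (q0 + q1, next(counter)))
    return total

H = -(p * log2(p) + (1 - p) * log2(1 - p))  # entropy per source symbol

results = {}
for n in (1, 2, 4):
    # pmf over all 2^n blocks of n source symbols
    pmf = []
    for block in itertools.product([0, 1], repeat=n):
        q = 1.0
        for b in block:
            q *= p if b == 1 else 1 - p
        pmf.append(q)
    results[n] = huffman_expected_length(pmf) / n  # bits per source symbol
    print(n, round(results[n], 4), round(H, 4))
```

As n grows, the per-symbol expected length L(n) approaches the entropy H(X), matching the bound H(X) ≤ L(n) < H(X) + 1/n.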
QUESTION BANK (VI SEM ECE) (DIGITAL COMMUNICATION) UNIT-I: PCM & Delta modulation system Q.1 Explain the difference between cross talk & intersymbol interference. Q.2 What is Quantization error? How does
More informationSolutions to Assignment-2 MOOC-Information Theory
Solutions to Assignment-2 MOOC-Information Theory 1. Which of the following is a prefix-free code? a) 01, 10, 101, 00, 11 b) 0, 11, 01 c) 01, 10, 11, 00 Solution:- The codewords of (a) are not prefix-free
More informationS Coding Methods (5 cr) P. Prerequisites. Literature (1) Contents
S-72.3410 Introduction 1 S-72.3410 Introduction 3 S-72.3410 Coding Methods (5 cr) P Lectures: Mondays 9 12, room E110, and Wednesdays 9 12, hall S4 (on January 30th this lecture will be held in E111!)
More informationChannel Coding and Cryptography
Hochschule Wismar Channel Coding and Cryptography Baltic Summer School Technical Informatics & Information Technology (BaSoTi) Tartu (Estonia) July/August 2012 Prof. Dr.-Ing. habil. Andreas Ahrens Communications
More informationExercises to Chapter 2 solutions
Exercises to Chapter 2 solutions 1 Exercises to Chapter 2 solutions E2.1 The Manchester code was first used in Manchester Mark 1 computer at the University of Manchester in 1949 and is still used in low-speed
More informationCOPYRIGHTED MATERIAL. Introduction. 1.1 Communication Systems
1 Introduction The reliable transmission of information over noisy channels is one of the basic requirements of digital information and communication systems. Here, transmission is understood both as transmission
More informationMATHEMATICS IN COMMUNICATIONS: INTRODUCTION TO CODING. A Public Lecture to the Uganda Mathematics Society
Abstract MATHEMATICS IN COMMUNICATIONS: INTRODUCTION TO CODING A Public Lecture to the Uganda Mathematics Society F F Tusubira, PhD, MUIPE, MIEE, REng, CEng Mathematical theory and techniques play a vital
More informationECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013
ECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013 Lecture 18 Today: (1) da Silva Discussion, (2) Error Correction Coding, (3) Error Detection (CRC) HW 8 due Tue. HW 9 (on Lectures
More informationLecture 4: Wireless Physical Layer: Channel Coding. Mythili Vutukuru CS 653 Spring 2014 Jan 16, Thursday
Lecture 4: Wireless Physical Layer: Channel Coding Mythili Vutukuru CS 653 Spring 2014 Jan 16, Thursday Channel Coding Modulated waveforms disrupted by signal propagation through wireless channel leads
More informationCoding Techniques and the Two-Access Channel
Coding Techniques and the Two-Access Channel A.J. Han VINCK Institute for Experimental Mathematics, University of Duisburg-Essen, Germany email: Vinck@exp-math.uni-essen.de Abstract. We consider some examples
More informationThe idea of similarity is through the Hamming
Hamming distance A good channel code is designed so that, if a few bit errors occur in transmission, the output can still be identified as the correct input. This is possible because although incorrect,
More informationECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013
ECE 5325/6325: Wireless Communication Systems Lecture Notes, Spring 2013 Lecture 18 Today: (1) da Silva Discussion, (2) Error Correction Coding, (3) Error Detection (CRC) HW 8 due Tue. HW 9 (on Lectures
More informationELEC3028 (EL334) Digital Transmission
ELEC3028 (EL334) Digital Transmission Half of the unit: Information Theory MODEM (modulator and demodulator) Professor Sheng Chen: Building 53, Room 4005 E-mail: sqc@ecs.soton.ac.uk Lecture notes from:
More informationDCSP-1: Introduction. Jianfeng Feng. Department of Computer Science Warwick Univ., UK
DCSP-1: Introduction Jianfeng Feng Department of Computer Science Warwick Univ., UK Jianfeng.feng@warwick.ac.uk http://www.dcs.warwick.ac.uk/~feng/dcsp.html Time Monday (L) 14.00-15.00 CS 1.01 Tuesday
More informationEECS 473 Advanced Embedded Systems. Lecture 13 Start on Wireless
EECS 473 Advanced Embedded Systems Lecture 13 Start on Wireless Team status updates Losing track of who went last. Cyberspeaker VisibleLight Elevate Checkout SmartHaus Upcoming Last lecture this Thursday
More informationIntroduction to Coding Theory
Coding Theory Massoud Malek Introduction to Coding Theory Introduction. Coding theory originated with the advent of computers. Early computers were huge mechanical monsters whose reliability was low compared
More informationECEn 665: Antennas and Propagation for Wireless Communications 131. s(t) = A c [1 + αm(t)] cos (ω c t) (9.27)
ECEn 665: Antennas and Propagation for Wireless Communications 131 9. Modulation Modulation is a way to vary the amplitude and phase of a sinusoidal carrier waveform in order to transmit information. When
More informationENGR 4323/5323 Digital and Analog Communication
ENGR 4323/5323 Digital and Analog Communication Chapter 1 Introduction Engineering and Physics University of Central Oklahoma Dr. Mohamed Bingabr Course Materials Textbook: Modern Digital and Analog Communication,
More informationWednesday, February 1, 2017
Wednesday, February 1, 2017 Topics for today Encoding game positions Constructing variable-length codes Huffman codes Encoding Game positions Some programs that play two-player games (e.g., tic-tac-toe,
More informationMulticasting over Multiple-Access Networks
ing oding apacity onclusions ing Department of Electrical Engineering and omputer Sciences University of alifornia, Berkeley May 9, 2006 EE 228A Outline ing oding apacity onclusions 1 2 3 4 oding 5 apacity
More informationAnalysis of Multi-rate filters in Communication system by using interpolation and decimation, filters
Analysis of Multi-rate filters in Communication system by using interpolation and decimation, filters Vibhooti Sharma M.Tech, E.C.E. Lovely Professional University PHAGWARA Amanjot Singh (Assistant Professor)
More informationFundamentals of Digital Communication
Fundamentals of Digital Communication Network Infrastructures A.A. 2017/18 Digital communication system Analog Digital Input Signal Analog/ Digital Low Pass Filter Sampler Quantizer Source Encoder Channel
More informationPROJECT 5: DESIGNING A VOICE MODEM. Instructor: Amir Asif
PROJECT 5: DESIGNING A VOICE MODEM Instructor: Amir Asif CSE4214: Digital Communications (Fall 2012) Computer Science and Engineering, York University 1. PURPOSE In this laboratory project, you will design
More informationLossy Compression of Permutations
204 IEEE International Symposium on Information Theory Lossy Compression of Permutations Da Wang EECS Dept., MIT Cambridge, MA, USA Email: dawang@mit.edu Arya Mazumdar ECE Dept., Univ. of Minnesota Twin
More informationECE 6640 Digital Communications
ECE 6640 Digital Communications Dr. Bradley J. Bazuin Assistant Professor Department of Electrical and Computer Engineering College of Engineering and Applied Sciences Chapter 8 8. Channel Coding: Part
More informationThe ternary alphabet is used by alternate mark inversion modulation; successive ones in data are represented by alternating ±1.
Alphabets EE 387, Notes 2, Handout #3 Definition: An alphabet is a discrete (usually finite) set of symbols. Examples: B = {0,1} is the binary alphabet T = { 1,0,+1} is the ternary alphabet X = {00,01,...,FF}
More informationICE1495 Independent Study for Undergraduate Project (IUP) A. Lie Detector. Prof. : Hyunchul Park Student : Jonghun Park Due date : 06/04/04
ICE1495 Independent Study for Undergraduate Project (IUP) A Lie Detector Prof. : Hyunchul Park Student : 20020703 Jonghun Park Due date : 06/04/04 Contents ABSTRACT... 2 1. INTRODUCTION... 2 1.1 BASIC
More information