ECE 499/599 Data Compression/Information Theory Spring 06. Dr. Thinh Nguyen. Homework 2 Due 04/27/06 at the beginning of the class


Problem 2: Suppose you are given the task of compressing a Klingon text consisting of 10,000 letters. To your surprise, the Klingon alphabet consists of only five different letters: k1, k2, k3, k4, and k5. Furthermore, you notice that there are 4000 k1's, 3000 k2's, 2000 k3's, 500 k4's, and 500 k5's. (6 pts)

(a) Develop the Huffman codes for these Klingon letters.

The letter probabilities are 0.4, 0.3, 0.2, 0.05, and 0.05. Building the Huffman tree (merge 0.05 + 0.05 = 0.1, then 0.1 + 0.2 = 0.3, then 0.3 + 0.3 = 0.6, then 0.6 + 0.4 = 1) gives:

k1: 0
k2: 10
k3: 110
k4: 1110
k5: 1111

Note: there are also other valid Huffman codes!

(b) What is the average code rate?

(0.4)(1) + (0.3)(2) + (0.2)(3) + (0.05)(4) + (0.05)(4) = 2 bits/symbol.

(c) Interestingly, you notice that the letters k4 and k5 always appear together as k4k5. Can you use this information to improve the compression ratio? If so, what are the new code and the corresponding average coding rate?

Yes. Treat k4k5 as a single symbol; the text then consists of 9500 symbols with probabilities 40/95 (k1), 30/95 (k2), 20/95 (k3), and 5/95 (k4k5). The resulting Huffman code is:

k1: 0
k2: 10
k3: 110
k4k5: 111

Code rate = (40/95)(1) + (30/95)(2) + (20/95)(3) + (5/95)(3) = 1.8421 bits/symbol, i.e., 17,500 bits for the whole text (1.75 bits per original letter) instead of 20,000.
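As a sanity check (an added sketch, not part of the original solution), the following Python snippet builds the Huffman codes for parts (a) and (c) with a heap and reports the average code rates. The symbol counts come straight from the problem statement; depending on tie-breaking the codewords may differ from those above, but the average rates are the same.

```python
import heapq

def huffman_code(counts):
    """Build a Huffman code for a dict {symbol: count}; returns {symbol: codeword}."""
    total = sum(counts.values())
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(c / total, i, {s: ""}) for i, (s, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)      # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}   # prepend the branch bit
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

def average_rate(counts, code):
    """Average codeword length in bits per symbol."""
    total = sum(counts.values())
    return sum(counts[s] * len(code[s]) for s in counts) / total

# Parts (a)/(b): the five Klingon letters.
klingon = {"k1": 4000, "k2": 3000, "k3": 2000, "k4": 500, "k5": 500}
code = huffman_code(klingon)
print(code, average_rate(klingon, code))      # 2.0 bits/symbol

# Part (c): k4 and k5 always occur as the pair k4k5 -> 9500 symbols.
merged = {"k1": 4000, "k2": 3000, "k3": 2000, "k4k5": 500}
code2 = huffman_code(merged)
print(code2, average_rate(merged, code2))     # 175/95 = 1.8421 bits/symbol
```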

Problem 3: By now you are an expert at compressing alien languages, so the U.S. government assigns you the task of compressing the Krypton language. Kryptonians are very smart (or stupid) beings; as such, their entire language can be represented by sequences consisting of only three letters: k1, k2, and k3. Being the compression expert that you are, you examine the Krypton text and notice that there are 7000 k1's, 2000 k2's, and 1000 k3's in the given Krypton text of 10,000 letters. (6 pts)

(a) What is the entropy rate of this Krypton text?

H(0.7, 0.2, 0.1) = -0.7 log2(0.7) - 0.2 log2(0.2) - 0.1 log2(0.1) ≈ 1.157 bits/symbol.

(b) If a Huffman code is used to code the letters, what is the average code rate?

Using the Huffman code k1: 0, k2: 10, k3: 11, the average rate is (0.7)(1) + (0.2)(2) + (0.1)(2) = 1.3 bits/symbol.

(c) Due to the bandwidth limitation of inter-star communication, the U.S. government wants you to improve your basic Huffman compression scheme. Can you modify Huffman coding to compress the Krypton text further? If so, provide an example of the new codes.

Yes: use an extended Huffman code, i.e., combine two symbols together and code pairs. An example of the new code looks like the following:

k1k1: 1
k1k2: 000
k2k1: 001
k1k3: 0100
k3k1: 0101
k2k2: 0111
k2k3: 01101
k3k2: 011000
k3k3: 011001

Average code rate = [(0.49)(1) + (0.14)(3) + (0.14)(3) + (0.07)(4) + (0.07)(4) + (0.04)(4) + (0.02)(5) + (0.02)(6) + (0.01)(6)] / 2 = 2.33 / 2 = 1.165 bits per symbol. Note that there are also other valid Huffman codes.
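The entropy and the two average rates can be verified with a few lines of Python (again an added sketch, not part of the original solution). The pair probabilities assume successive letters are independent, which is the implicit assumption behind the extended Huffman argument, and the pair codeword lengths are taken from the table above.

```python
from math import log2

p = {"k1": 0.7, "k2": 0.2, "k3": 0.1}

# (a) Entropy rate of the Krypton source.
H = -sum(pi * log2(pi) for pi in p.values())
print(round(H, 3))                     # 1.157 bits/symbol

# (b) Single-letter Huffman code k1 -> 0, k2 -> 10, k3 -> 11.
rate1 = 0.7 * 1 + 0.2 * 2 + 0.1 * 2
print(rate1)                           # 1.3 bits/symbol

# (c) Extended (pair) Huffman code from the table above.
pair_code = {"k1k1": "1", "k1k2": "000", "k2k1": "001",
             "k1k3": "0100", "k3k1": "0101", "k2k2": "0111",
             "k2k3": "01101", "k3k2": "011000", "k3k3": "011001"}
rate2 = sum(p[ab[:2]] * p[ab[2:]] * len(cw) for ab, cw in pair_code.items()) / 2
print(round(rate2, 3))                 # 1.165 bits/symbol
```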

Problem 4 (bonus): Suppose you are in Las Vegas and you notice one peculiar game, described as follows. A guy throws a peculiar die with three faces: 1, 2, and 3, but he will not let you know the outcome; your job is to guess it. After you guess an outcome, he tells you a number which is not the outcome and gives you a second chance to guess (he won't throw the die again). If you guess wrong the second time, you have to give him $5; on the other hand, if you guess right, he has to give you $4. (2 pts)

(a) What is the amount of uncertainty regarding your guesses?

Scenario A: Many of you assumed that the guy always reveals a wrong number that is not the one you picked. Under this assumption, the chances that your first guess is right or wrong are 1/3 and 2/3, respectively. If your first guess is right and you switch on the second guess, you will be wrong with probability 1 (right with probability 0). If your first guess is wrong and you switch, you will be right with probability 1, since the revealed number eliminates the only other wrong face. So if you always switch, your probability of getting the second guess right is

(1/3)(0) + (2/3)(1) = 2/3.

If you stay with your first guess instead, your chance of getting it right is 1 - 2/3 = 1/3. Hence, the uncertainty regarding your guesses is H(1/3, 2/3) ≈ 0.918 bits.

Scenario B: Many of you instead assumed that the guy reveals either of the two non-outcome numbers, regardless of whether it is the one you picked. Suppose your first guess is wrong: with probability 1/2 he reveals the other wrong number (so switching to the remaining number wins with probability 1), and with probability 1/2 he reveals your own number (so you must switch and pick between the two remaining numbers, winning with probability 1/2). The chance of getting it right when you switch is then

(1/3)(0) + (2/3)[(1/2)(1) + (1/2)(1/2)] = 1/2,

and hence H(1/2, 1/2) = 1 bit.

On the other hand, if he is naïve enough to always tell you that your picked number is wrong (unless you picked the right number), then your chance of getting it right when you blindly switch is (1/3)(0) + (2/3)(1/2) = 1/3, and he expects to gain. But he is wrong: if you know he is using this strategy, you should stay whenever he names a number other than yours (your pick must then be correct), and your chance of getting it right becomes (1/3)(1) + (2/3)(1/2) = 2/3.

(b) Would you play the game (I assume you like to win some money)?

Under Scenario A: yes. The average amount of money you win is (2/3)($4) - (1/3)($5) = $1 per game.

Under Scenario B: no. On average you lose (1/2)($5) - (1/2)($4) = $0.50 per game.
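A small Monte Carlo simulation (an added sketch, not part of the original solution) makes the two scenarios concrete: the player always switches, and only the dealer's rule for choosing which non-outcome number to reveal changes between Scenario A and Scenario B.

```python
import random

def play(scenario, trials=200_000):
    """Simulate the game with an always-switch player; return (win rate, avg payoff in $)."""
    wins = payoff = 0
    for _ in range(trials):
        outcome = random.choice((1, 2, 3))
        guess = random.choice((1, 2, 3))
        wrong = [x for x in (1, 2, 3) if x != outcome]
        if scenario == "A":
            # Scenario A: dealer reveals a wrong number that is not the player's pick.
            reveal = random.choice([x for x in wrong if x != guess])
        else:
            # Scenario B: dealer reveals either wrong number, ignoring the pick.
            reveal = random.choice(wrong)
        # Always switch: pick at random among numbers that are neither the
        # revealed one nor the original guess (two choices if reveal == guess).
        second = random.choice([x for x in (1, 2, 3) if x not in (reveal, guess)])
        if second == outcome:
            wins += 1
            payoff += 4
        else:
            payoff -= 5
    return wins / trials, payoff / trials

print(play("A"))   # ~ (0.667, +1.0): win 2/3 of the time, gain about $1 per game
print(play("B"))   # ~ (0.5, -0.5): win half the time, lose about $0.50 per game
```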