COMP 2804 solutions Assignment 4

Question 1: On the first page of your assignment, write your name and student number.

Solution: Name: Lionel Messi. Student number: 10.

Question 2: Let n ≥ 2 be an integer and consider two fixed integers a and b with 1 ≤ a < b ≤ n.

Use the Product Rule to determine the number of permutations of {1, 2, ..., n} in which a is to the left of b.

Consider a uniformly random permutation of the set {1, 2, ..., n}, and define the event A = "in this permutation, a is to the left of b". Use your answer to the first part of this question to determine Pr(A).

Solution:
- Choose 2 positions out of the n positions. There are (n choose 2) ways to do this.
- Write a in the leftmost chosen position and write b in the rightmost chosen position. There is 1 way to do this.
- Write the n - 2 elements of {1, 2, ..., n} \ {a, b} in the remaining n - 2 positions. There are (n - 2)! ways to do this.

By the Product Rule, the number of permutations of {1, 2, ..., n} in which a is to the left of b is equal to

(n choose 2) · 1 · (n - 2)! = (n(n - 1)/2) · (n - 2)! = n!/2.

Out of the n! permutations, n!/2 have a to the left of b. Therefore, Pr(A) = (n!/2)/n! = 1/2.
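
As a quick sanity check (not part of the original solution), the claim Pr(A) = 1/2 can be verified by brute force in Python; the values n = 6, a = 2 and b = 5 below are arbitrary choices.

# Brute-force check of Question 2 (illustrative sketch only): count the
# permutations of {1, ..., n} in which a appears to the left of b.
from itertools import permutations
from math import factorial

n, a, b = 6, 2, 5  # arbitrary small example with 1 <= a < b <= n

count = sum(1 for p in permutations(range(1, n + 1)) if p.index(a) < p.index(b))

print(count, factorial(n) // 2)   # both print 360
print(count / factorial(n))       # 0.5, i.e. Pr(A) = 1/2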

Question 3: I am sure you remember that Jennifer loves to drink India Pale Ale (IPA). Lindsay Bangs (President of the Carleton Computer Science Society, 2014-2015) prefers wheat beer. Jennifer and Lindsay decide to go to their favorite pub Chez Connor et Simon. The beer menu shows that this pub has ten beers on tap:
- Five of these beers are of the IPA style.
- Three of these beers are of the wheat beer style.
- Two of these beers are of the pilsner style.

Jennifer and Lindsay order a uniformly random subset of seven beers (thus, there are no duplicates). Define the following random variables:

J = the number of IPAs in this order,
L = the number of wheat beers in this order.

Determine the expected value E(L) of the random variable L. Show your work.

Are J and L independent random variables? Justify your answer.

Solution: There are (10 choose 7) = 120 ways to order 7 beers. The random variable L can take the values 0, 1, 2, 3.
- There are (3 choose 1)·(7 choose 6) = 21 ways to have exactly 1 wheat beer in the 7-beer order.
- There are (3 choose 2)·(7 choose 5) = 63 ways to have exactly 2 wheat beers in the 7-beer order.
- There are (3 choose 3)·(7 choose 4) = 35 ways to have exactly 3 wheat beers in the 7-beer order.

It follows that

E(L) = 0·Pr(L = 0) + 1·Pr(L = 1) + 2·Pr(L = 2) + 3·Pr(L = 3)
     = 0 + 1·(21/120) + 2·(63/120) + 3·(35/120)
     = 252/120
     = 21/10.

Here is a second way to determine E(L): Number the wheat beers as 1, 2, 3. For i = 1, 2, 3, define the indicator random variable

L_i = 1 if wheat beer i is in the 7-beer order, and L_i = 0 otherwise.

Then

E(L_i) = Pr(L_i = 1) = (9 choose 6)/(10 choose 7) = 84/120 = 7/10.

Since L = L_1 + L_2 + L_3, we get

E(L) = E(L_1 + L_2 + L_3) = E(L_1) + E(L_2) + E(L_3) = 7/10 + 7/10 + 7/10 = 21/10.

Jennifer and Lindsay order 7 beers. Therefore, we have J + L ≤ 7. Thus, Pr(J = 5 and L = 3) = 0. We have seen above that Pr(L = 3) ≠ 0. We also know that Pr(J = 5) ≠ 0, because there is at least one way that the 7-beer order contains 5 IPAs. Alternatively,

Pr(J = 5) = (5 choose 5)·(5 choose 2)/(10 choose 7) = 10/120 ≠ 0.

Thus, Pr(J = 5)·Pr(L = 3) ≠ 0 = Pr(J = 5 and L = 3), and we conclude that the random variables J and L are not independent.
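
Here is a small, optional Python sketch that enumerates all 120 possible 7-beer orders and confirms E(L) = 21/10 and the non-independence of J and L; the labels 'I', 'W' and 'P' are just illustrative names for the three beer styles.

# Sketch (not part of the original solution): enumerate all 7-beer orders from
# 5 IPAs, 3 wheat beers and 2 pilsners, and check E(L) and the independence claim.
from itertools import combinations
from fractions import Fraction

beers = ['I'] * 5 + ['W'] * 3 + ['P'] * 2    # the ten beers on tap
orders = list(combinations(range(10), 7))    # all 120 possible orders

def style_count(order, style):
    return sum(1 for i in order if beers[i] == style)

E_L = Fraction(sum(style_count(o, 'W') for o in orders), len(orders))
print(E_L)                                   # 21/10

p_joint = Fraction(sum(1 for o in orders
                       if style_count(o, 'I') == 5 and style_count(o, 'W') == 3),
                   len(orders))
p_J5 = Fraction(sum(1 for o in orders if style_count(o, 'I') == 5), len(orders))
p_L3 = Fraction(sum(1 for o in orders if style_count(o, 'W') == 3), len(orders))
print(p_joint, p_J5 * p_L3)                  # 0 versus a nonzero product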

Question 4: One of Jennifer and Thomas is chosen uniformly at random. The person who is chosen wins $100. Define the random variables J and T as follows:

J = the amount that Jennifer wins and T = the amount that Thomas wins.

Prove that

E(max(J, T)) ≥ max(E(J), E(T)).

Solution: The random variable J can take two values: 0 and 100. Therefore,

E(J) = 0·Pr(J = 0) + 100·Pr(J = 100) = 100·Pr(J = 100) = 100·Pr(Jennifer is chosen) = 100·(1/2) = 50.

In the same way, we get E(T) = 50. Thus,

max(E(J), E(T)) = max(50, 50) = 50.

Since exactly one of J and T is equal to 100, we know that max(J, T) is equal to 100, no matter who is chosen. Therefore,

E(max(J, T)) = E(100) = 100.

In particular, E(max(J, T)) ≥ max(E(J), E(T)).
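
For completeness, a tiny optional sketch that lists the two equally likely outcomes and compares max(E(J), E(T)) with E(max(J, T)):

# Sketch (illustrative, not from the assignment): the two equally likely outcomes
# of Question 4, showing E(max(J, T)) = 100 while max(E(J), E(T)) = 50.
outcomes = [(100, 0), (0, 100)]   # (J, T) when Jennifer resp. Thomas is chosen

E_J = sum(j for j, t in outcomes) / len(outcomes)
E_T = sum(t for j, t in outcomes) / len(outcomes)
E_max = sum(max(j, t) for j, t in outcomes) / len(outcomes)

print(max(E_J, E_T), E_max)   # 50.0 100.0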

Question 5: Let n ≥ 1 be an integer and consider a permutation a_1, a_2, ..., a_n of the set {1, 2, ..., n}. We partition this permutation into increasing subsequences. For example, for n = 10, the permutation 3, 5, 8, 1, 2, 4, 10, 7, 6, 9 is partitioned into four increasing subsequences: (i) 3, 5, 8, (ii) 1, 2, 4, 10, (iii) 7, and (iv) 6, 9.

Let a_1, a_2, ..., a_n be a uniformly random permutation of the set {1, 2, ..., n}. Define the random variable X to be the number of increasing subsequences in the partition of this permutation. For the example above, we have X = 4. In this question, you will determine the expected value E(X) of X in two different ways.

For each i with 1 ≤ i ≤ n, let

X_i = 1 if an increasing subsequence starts at position i, and X_i = 0 otherwise.

For the example above, we have X_1 = 1, X_2 = 0, X_3 = 0, and X_8 = 1.

Determine E(X_1).

Solution: The random variable X_1 is equal to 1, because for any permutation, an increasing subsequence starts at position 1. Therefore, E(X_1) = 1.

Let i be an integer with 2 ≤ i ≤ n. Use the Product Rule to determine the number of permutations of {1, 2, ..., n} for which X_i = 1.

Solution: We observe that X_i = 1 if and only if a_{i-1} > a_i. Thus, we have to count the permutations a_1, a_2, ..., a_n for which a_{i-1} > a_i:
- Choose 2 values out of 1, 2, ..., n. There are (n choose 2) ways to do this.
- Place the smaller of these values at position i, and place the larger of these values at position i - 1. There is 1 way to do this.
- Place the remaining n - 2 values at the n - 2 remaining positions. There are (n - 2)! ways to do this.

By the Product Rule, the number of permutations we get is equal to

(n choose 2) · 1 · (n - 2)! = (n(n - 1)/2) · (n - 2)! = n!/2.

Use these indicator random variables to determine E(X).

Solution: Since X = X_1 + X_2 + ... + X_n, we have

E(X) = E(X_1 + X_2 + ... + X_n) = E(X_1) + E(X_2) + ... + E(X_n).

We have seen above that E(X_1) = 1. For 2 ≤ i ≤ n, we have

E(X_i) = Pr(X_i = 1) = (n!/2)/n! = 1/2.

It follows that

E(X) = 1 + (n - 1)·(1/2) = (n + 1)/2.

For each i with 1 ≤ i ≤ n, let

Y_i = 1 if the value i is the leftmost element of an increasing subsequence, and Y_i = 0 otherwise.

For the example above, we have Y_1 = 1, Y_3 = 1, Y_5 = 0, and Y_7 = 1.

Determine E(Y_1).

Solution: The random variable Y_1 is equal to 1, because for any permutation, an increasing subsequence starts at the value 1. Therefore, E(Y_1) = 1.

Let i be an integer with 2 ≤ i ≤ n. Use the Product Rule to determine the number of permutations of {1, 2, ..., n} for which Y_i = 1.

Solution: If Y_i = 1, then there are 2 possibilities:

The value i is at the first position of the permutation. There are (n - 1)! such permutations.

The value i is at one of the positions 2, 3, ..., n of the permutation. In this case, the value immediately to the left of i is larger than i. How many such permutations are there?
- Choose one of the positions 2, 3, ..., n, and place the value i at that position. There are n - 1 ways to do this.
- Choose a value in {i + 1, i + 2, ..., n} and place it immediately to the left of i. There are n - i ways to do this.
- Place the remaining n - 2 values at the n - 2 remaining positions. There are (n - 2)! ways to do this.

By the Product Rule, the number of permutations we get is equal to

(n - 1) · (n - i) · (n - 2)! = (n - i) · (n - 1)!.

Putting everything together, the total number of permutations for which Y_i = 1 is equal to

(n - 1)! + (n - i)·(n - 1)! = (n - i + 1)·(n - 1)!.

Use these indicator random variables to determine E(X).

Solution: Since X = Y_1 + Y_2 + ... + Y_n, we have

E(X) = E(Y_1 + Y_2 + ... + Y_n) = E(Y_1) + E(Y_2) + ... + E(Y_n).

We have seen above that E(Y_1) = 1. For 2 ≤ i ≤ n, we have

E(Y_i) = Pr(Y_i = 1) = (n - i + 1)·(n - 1)!/n! = (n - i + 1)/n.

It follows that

E(X) = 1 + Σ_{i=2}^{n} (n - i + 1)/n
     = 1 + (1/n)·Σ_{i=2}^{n} (n - i + 1)
     = 1 + (1/n)·((n - 1) + (n - 2) + ... + 1)
     = 1 + (1/n)·((n - 1)n/2)
     = 1 + (n - 1)/2
     = (n + 1)/2.
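
The value E(X) = (n + 1)/2 can also be checked by brute force for a small n. The following optional Python sketch (with the arbitrary choice n = 6) counts the increasing runs of every permutation and compares the average with the formula.

# Sketch (illustrative only): exact check of E(X) = (n + 1)/2 for a small n,
# where X counts the maximal increasing runs of a permutation.
from itertools import permutations
from fractions import Fraction

def runs(perm):
    # a new increasing subsequence starts at position 1 and at every position i
    # with perm[i - 1] > perm[i]
    return 1 + sum(1 for i in range(1, len(perm)) if perm[i - 1] > perm[i])

n = 6
perms = list(permutations(range(1, n + 1)))
E_X = Fraction(sum(runs(p) for p in perms), len(perms))
print(E_X, Fraction(n + 1, 2))   # both are 7/2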

Question 6: Let n ≥ 1 be an integer, let p be a real number with 0 < p < 1, and let X be a random variable that has a binomial distribution with parameters n and p. In class, we have seen that the expected value E(X) of X satisfies

E(X) = Σ_{k=1}^{n} k·(n choose k)·p^k·(1 - p)^(n-k).   (1)

In class, we have also seen Newton's Binomial Theorem:

(x + y)^n = Σ_{k=0}^{n} (n choose k)·x^(n-k)·y^k.

Use (1) to prove that E(X) = pn, by taking the derivative, with respect to y, in Newton's Binomial Theorem.

Solution: If we differentiate Newton's Binomial Theorem with respect to y, we get

n·(x + y)^(n-1) = Σ_{k=0}^{n} (n choose k)·x^(n-k)·k·y^(k-1)
                = Σ_{k=1}^{n} (n choose k)·x^(n-k)·k·y^(k-1)
                = (1/y)·Σ_{k=1}^{n} k·(n choose k)·x^(n-k)·y^k.

It follows that

Σ_{k=1}^{n} k·(n choose k)·x^(n-k)·y^k = y·n·(x + y)^(n-1).

By taking x = 1 - p and y = p, we get

E(X) = Σ_{k=1}^{n} k·(n choose k)·p^k·(1 - p)^(n-k) = p·n·((1 - p) + p)^(n-1) = pn.
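
As an optional numeric check (not part of the solution), the sum in (1) can be evaluated directly and compared with pn; the values n = 12 and p = 0.3 below are arbitrary.

# Sketch: evaluate the sum in (1) numerically and compare it with pn.
from math import comb

n, p = 12, 0.3
E_X = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(1, n + 1))
print(E_X, n * p)   # both are 3.6 (up to floating-point rounding)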

Question 7: Consider the following recursive algorithm TwoTails, which takes as input a positive integer k:

Algorithm TwoTails(k):
    // all coin flips made are mutually independent
    flip a fair coin twice;
    if the coin came up tails exactly twice
    then return 2^k
    else TwoTails(k + 1)
    endif

You run algorithm TwoTails(1), i.e., with k = 1. Define the random variable X to be the value of the output of this algorithm. Let k ≥ 1 be an integer. Determine Pr(X = 2^k). Is the expected value E(X) of the random variable X finite or infinite? Justify your answer.

Solution: We flip a fair coin twice and say that we have a success (S) if both coin flips result in tails. Otherwise, we have a failure (F). We have Pr(S) = 1/4 and Pr(F) = 3/4. We run TwoTails(1); let us see what can happen:
- If we have a success, then the algorithm returns 2. If we have a failure, then we run TwoTails(2).
- If we have a success, then the algorithm returns 4. If we have a failure, then we run TwoTails(3).
- If we have a success, then the algorithm returns 8. If we have a failure, then we run TwoTails(4).
- If we have a success, then the algorithm returns 16. If we have a failure, then we run TwoTails(5).

You will see the pattern: X = 2^k if and only if there are k - 1 failures followed by 1 success. Therefore,

Pr(X = 2^k) = Pr(F^(k-1) S) = (Pr(F))^(k-1)·Pr(S) = (3/4)^(k-1)·(1/4) = 3^(k-1)/4^k.

To determine the expected value of X, we notice that X can take any value in the infinite set {2^1, 2^2, 2^3, 2^4, ...}. Therefore,

E(X) = Σ_{k≥1} 2^k·Pr(X = 2^k)
     = Σ_{k≥1} 2^k·3^(k-1)/4^k
     = Σ_{k≥1} 3^(k-1)/2^k
     = (1/2)·Σ_{k≥0} (3/2)^k
     = (1/2)·lim_{N→∞} Σ_{k=0}^{N} (3/2)^k
     = (1/2)·lim_{N→∞} ((3/2)^(N+1) - 1)/(3/2 - 1)
     = ∞.
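
The following optional simulation sketch of TwoTails(1) illustrates the result: the empirical values of Pr(X = 2^k) are close to 3^(k-1)/4^k, while the sample mean does not settle near any fixed value, which is consistent with E(X) being infinite. The threshold 0.25 models the probability that both coin flips are tails.

# Simulation sketch (illustrative only) of TwoTails(1).
import random
from collections import Counter

def two_tails():
    k = 1
    while True:
        # both coin flips are tails with probability 1/4
        if random.random() < 0.25:
            return 2 ** k
        k += 1

random.seed(1)
trials = 100_000
samples = [two_tails() for _ in range(trials)]

counts = Counter(samples)
for k in range(1, 5):
    print(2 ** k, counts[2 ** k] / trials, 3 ** (k - 1) / 4 ** k)  # empirical vs exact

print(sum(samples) / trials)  # sample mean; unstable across runs because E(X) is infinite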

Question 8: Let n be a power of two and consider a full binary tree with n leaves. Let a_1, a_2, ..., a_n be a random permutation of the numbers 1, 2, ..., n. Store this permutation at the leaves of the tree, in the order a_1, a_2, ..., a_n from left to right. For example, if n = 8 and the permutation is 2, 8, 1, 4, 6, 3, 5, 7, then we obtain the tree whose leaves store, from left to right,

2, 8, 1, 4, 6, 3, 5, 7.

Perform the following process on the tree: Visit the levels of the tree from bottom to top. At each level, take all pairs of consecutive nodes that have the same parent. For each such pair, compare the numbers stored at the two nodes, and store the smaller of these two numbers at the common parent. For our example tree, we obtain the tree whose levels store, from bottom to top,

leaves: 2, 8, 1, 4, 6, 3, 5, 7
next level: 2, 1, 3, 5
children of the root: 1, 3
root: 1.

It is clear that at the end of this process, the root stores the number 1. Define the random variable X to be the number that is not equal to 1 and that is stored at a child of the root. For our example tree, X = 3. In the following questions, you will determine the expected value E(X) of the random variable X.

Prove that X ≤ 1 + n/2.

Solution: Since X ≠ 1, it is clear that X ≥ 2. The root has two subtrees; take the subtree that does not store the number 1. Then the value of X is the smallest number that is stored in this subtree. This subtree stores n/2 numbers. If X ≥ 2 + n/2, then every number in this subtree would be at least 2 + n/2, and there are only n/2 - 1 such numbers, so this subtree could store at most n/2 - 1 numbers. Therefore, X ≤ 1 + n/2.

Prove that the following is true for each k with 1 ≤ k ≤ n/2: X ≥ k + 1 if and only if all numbers 1, 2, ..., k are stored in the left subtree of the root or all numbers 1, 2, ..., k are stored in the right subtree of the root.

Solution: First assume that X ≥ k + 1. Consider again the subtree of the root that does not store the number 1. Since X is the smallest number in this subtree, all numbers in this subtree are at least k + 1. It follows that all numbers 1, 2, ..., k are together in one subtree of the root.

Conversely, assume that, say, all numbers 1, 2, ..., k are stored in the left subtree of the root. Then the smallest number in the right subtree must be at least k + 1. Thus, X ≥ k + 1.

Prove that for each k with 1 ≤ k ≤ n/2,

Pr(X ≥ k + 1) = 2·(n/2 choose k)·k!·(n - k)!/n! = 2·(n/2 choose k)/(n choose k).

Solution: Let N be the number of permutations of 1, 2, ..., n such that all numbers 1, 2, ..., k are stored in the left subtree of the root. By symmetry, N is also the number of permutations in which all of 1, 2, ..., k are stored in the right subtree, and these two events cannot both occur. Then, using the previous part of this question,

Pr(X ≥ k + 1) = 2N/n!.

To determine N, we use the Product Rule:
- Choose k leaves out of the n/2 leaves in the left subtree of the root. There are (n/2 choose k) ways to do this.
- Write the numbers 1, 2, ..., k at the chosen leaves. There are k! ways to do this.
- Write the numbers k + 1, k + 2, ..., n at the remaining n - k leaves. There are (n - k)! ways to do this.

We conclude that

N = (n/2 choose k)·k!·(n - k)!

and

Pr(X ≥ k + 1) = 2·(n/2 choose k)·k!·(n - k)!/n!.

Since

(n choose k) = n!/(k!·(n - k)!),

it follows that

Pr(X ≥ k + 1) = 2·(n/2 choose k)/(n choose k).

According to Exercise 6.10 in the textbook, we have

E(X) = Σ_{k≥1} Pr(X ≥ k).

Prove that

E(X) = Pr(X ≥ 1) + Σ_{k=1}^{n/2} Pr(X ≥ k + 1).

Solution: We have

E(X) = Σ_{k≥1} Pr(X ≥ k)
     = Pr(X ≥ 1) + Σ_{k≥1} Pr(X ≥ k + 1)
     = Pr(X ≥ 1) + Σ_{k=1}^{n/2} Pr(X ≥ k + 1) + Σ_{k=1+n/2}^{∞} Pr(X ≥ k + 1).

From the first part of the question: if k ≥ 1 + n/2, then X ≤ 1 + n/2 ≤ k, and therefore Pr(X ≥ k + 1) = 0. Therefore,

E(X) = Pr(X ≥ 1) + Σ_{k=1}^{n/2} Pr(X ≥ k + 1).

Use Question 8 in Assignment 1 to prove that

E(X) = 3 - 4/(n + 2).

Solution: According to Question 8 in Assignment 1, we have

Σ_{k=0}^{m} (m choose k)/(n choose k) = (n + 1)/(n + 1 - m).

For m = n/2, we get

Σ_{k=0}^{n/2} (n/2 choose k)/(n choose k) = (n + 1)/(n + 1 - n/2) = (n + 1)/(n/2 + 1) = (2n + 2)/(n + 2).

Since

Σ_{k=0}^{n/2} (n/2 choose k)/(n choose k) = (n/2 choose 0)/(n choose 0) + Σ_{k=1}^{n/2} (n/2 choose k)/(n choose k) = 1 + Σ_{k=1}^{n/2} (n/2 choose k)/(n choose k),

it follows that

Σ_{k=1}^{n/2} (n/2 choose k)/(n choose k) = (2n + 2)/(n + 2) - 1 = n/(n + 2).

We have seen above that

E(X) = Pr(X ≥ 1) + Σ_{k=1}^{n/2} Pr(X ≥ k + 1).

Since X is always at least 2, we have Pr(X ≥ 1) = 1. Thus, we get

E(X) = 1 + Σ_{k=1}^{n/2} Pr(X ≥ k + 1)
     = 1 + Σ_{k=1}^{n/2} 2·(n/2 choose k)/(n choose k)
     = 1 + 2·n/(n + 2)
     = (3n + 2)/(n + 2)
     = 3 - 4/(n + 2).
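
Finally, an optional brute-force check (not part of the assignment) for n = 8: the sketch below runs the bottom-up process on every permutation of 1, ..., 8 and compares the resulting tail probabilities and expected value with the formulas derived above.

# Sketch (illustrative only): for n = 8, run the bottom-up "store the smaller
# number at the parent" process on every permutation, record X (the child of
# the root that is not 1), and compare with the formulas from Question 8.
from itertools import permutations
from fractions import Fraction
from math import comb

n = 8
xs = []
for perm in permutations(range(1, n + 1)):
    level = list(perm)
    while len(level) > 2:
        level = [min(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    xs.append(max(level))          # the children of the root are 1 and X

total = len(xs)
E_X = Fraction(sum(xs), total)
print(E_X, Fraction(3) - Fraction(4, n + 2))                   # both are 13/5

for k in range(1, n // 2 + 1):
    pr = Fraction(sum(1 for x in xs if x >= k + 1), total)
    print(k, pr, Fraction(2 * comb(n // 2, k), comb(n, k)))    # empirical vs formula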