Probability Theory. POLI 270 Mathematical and Statistical Foundations. Sebastian M. Saiegh

POLI 270 - Mathematical and Statistical Foundations, Department of Political Science, University of California, San Diego. November 11, 2010

Introduction. Outline: 1. Probability: Some Background; 2. Probability; 3. Conditional and Compound Probabilities, Bayes' Theorem, Independent Events.

Lady Luck (Probability: Some Background). The radical concept of replacing randomness with systematic probability, and its implicit suggestion that the future might be predictable and even controllable to some degree, is what separates modern times from the past. Yet probability theory was not developed until the seventeenth century. And it grew out of efforts to analyze games of chance!

Lady Luck (cont.) The game of craps provides a useful illustration of the relationship between gambling and the laws of probability. Throwing a pair of six-sided dice produces not eleven outcomes (the sums from two to twelve) but thirty-six possible combinations, all the way from snake eyes (double one) to boxcars (double six). There is only one way for each of double-one and double-six to appear, while seven, the key number in craps, is the easiest to throw: there are six different ways to arrive at seven. The probability of a seven-throw is therefore 6/36, or about 16.7%.
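The counts quoted above can be checked with a short enumeration of all thirty-six combinations (a minimal sketch; the code simply tallies the standard two-dice sums):

```python
from itertools import product
from collections import Counter

# Enumerate all 36 equally likely (die1, die2) combinations
# and count how many produce each sum from 2 to 12.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

assert sum(counts.values()) == 36  # thirty-six combinations in all
assert counts[2] == 1              # snake eyes: one way
assert counts[12] == 1             # boxcars: one way
assert counts[7] == 6              # seven: six ways, the easiest throw

print(counts[7] / 36)  # probability of a seven-throw: 6/36 ≈ 0.1667
```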

Lady Luck (cont.) The probability of an outcome is the ratio of favorable outcomes to the total opportunity set. The odds on an outcome are the ratio of favorable outcomes to unfavorable outcomes.

Lady Luck (cont.) The odds obviously depend on the probability, but the odds are what matter when you are placing a bet. If the probability of throwing a 7 in craps is one in six, the odds on throwing a number other than 7 are 5 to 1. This means that you should bet no more than $1 that 7 will come up on the next throw when the other player bets $5 that it will not. Games of chance must be distinguished from games in which skill makes a difference: in a game of chance, the odds are all you know for betting.
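The conversion between probability and odds, and the fairness of the $1-against-$5 bet, can be written out directly (a minimal sketch; the helper name `odds_against` is mine, not standard terminology from the lecture):

```python
from fractions import Fraction

def odds_against(p):
    """Odds against an event of probability p: unfavorable-to-favorable ratio."""
    p = Fraction(p)
    return (1 - p) / p

p_seven = Fraction(6, 36)        # probability of throwing a 7
print(odds_against(p_seven))     # 5 -> the odds against a 7 are 5 to 1

# Fair-bet check: staking $1 on the 7 against $5 on "not 7"
# gives the bettor on 7 zero expected gain.
stake_for, stake_against = 1, 5
expected_gain = p_seven * stake_against - (1 - p_seven) * stake_for
print(expected_gain)             # 0
```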

Gambling in the Dark (Some Background). Gambling has been a popular pastime, and often an addiction, since the beginning of recorded history. Yet until the seventeenth century, people wagered and played games of chance without using any system of odds to determine winnings and losses. The Greeks understood that more things might happen in the future than actually will happen. In fact, their word εἰκός, meaning plausible or probable, carried much the same sense as the modern concept of probability: to be expected with some degree of certainty. But they never made any advances in measuring that degree of certainty.

Gambling in the Dark (cont.) When the Greeks wanted a prediction about what tomorrow might bring, they turned to the oracles rather than to their wisest philosophers. Up to the time of the Renaissance, people perceived the future as little more than a matter of luck or the result of random variation, and most of their decisions were driven by instinct. The inadequacy of their numbering system was an important reason why Europeans were not drawn to explore the mastery of risk: without numbers, there are no odds and no probabilities; and without odds and probabilities, the only way to deal with risk is to appeal to the gods and the fates.

Gambling in the Dark (cont.) It is hard for us to imagine a time without numbers. Yet if we could bring a well-educated man from the year 1000 to the present, he probably would not recognize the number zero and would surely flunk third-grade arithmetic; few people from the year 1500 would fare much better. The story of numbers in the West begins in 1202, when a book titled Liber Abaci, or Book of the Abacus, appeared in Italy. Its author, Leonardo Pisano, was known for most of his life as Fibonacci (yes, of the Fibonacci series). It took almost three hundred years, though, for the Hindu-Arabic numbering system to be widely adopted in the Western world.

Gambling in the Dark (cont.) In 1494, a Franciscan monk named Luca Pacioli published his Summa de arithmetica, geometria, proportioni et proportionalità, in which he posed the following problem: A and B are playing a fair game of balla. They agree to continue until one has won six rounds. The game actually stops when A has won five rounds and B three. How should the stakes be divided? The puzzle, which came to be known as the problem of the points, was more significant than it appears. The resolution of how to divide the stakes in an uncompleted game marked the beginning of the systematic analysis of probability.
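The solution eventually given by Pascal and Fermat divides the stakes according to each player's probability of winning the unfinished game. With A needing one more round and B needing three, the continuations can be enumerated recursively (a sketch of the standard argument, not Pacioli's own proposal; each round is treated as a fair 50/50 trial):

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def p_a_wins(a_needs, b_needs):
    """Probability that A wins when each round is won by A or B with probability 1/2."""
    if a_needs == 0:
        return Fraction(1)   # A has already won
    if b_needs == 0:
        return Fraction(0)   # B has already won
    return Fraction(1, 2) * (p_a_wins(a_needs - 1, b_needs)
                             + p_a_wins(a_needs, b_needs - 1))

# A has 5 wins, B has 3; the first to 6 takes the stakes.
share_a = p_a_wins(6 - 5, 6 - 3)
print(share_a)  # 7/8 -> the stakes should be split 7 : 1 in A's favor
```

The answer agrees with the direct argument: B wins only by taking three rounds in a row, with probability (1/2)^3 = 1/8, so A's share is 7/8.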

Non-Degenerate Gamblers. In 1654, the Chevalier de Méré, a French nobleman with a taste for both gambling and mathematics, challenged the famed French mathematician Blaise Pascal to solve the puzzle. Pascal turned for help to Pierre de Fermat, a lawyer who was also a brilliant mathematician. Their collaboration led to the discovery of the theory of probability.

Non-Degenerate Gamblers (cont.) Given that more things can happen than will happen, Pascal and Fermat established a procedure for determining the likelihood of each possible result, assuming always that the outcomes can be measured mathematically. As the years passed, mathematicians transformed probability theory from a gamblers' toy into a powerful instrument for organizing, interpreting, and applying information. In particular, they showed how to infer previously unknown probabilities from the empirical facts of reality.

Non-Degenerate Gamblers (cont.) In 1703, Gottfried von Leibniz commented to the Swiss scientist and mathematician Jacob Bernoulli that "[n]ature has established patterns originating in the return of events, but only for the most part," thereby prompting Bernoulli to formulate the Law of Large Numbers. In 1730, Abraham de Moivre suggested the structure of the normal distribution, also known as the bell curve, and discovered the concept of standard deviation.
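Bernoulli's Law of Large Numbers can be illustrated with a quick simulation: the observed frequency of heads settles toward the true probability of 1/2 as the number of tosses grows (a sketch; the sample sizes and seed are arbitrary choices of mine):

```python
import random

random.seed(2010)  # fixed seed so the run is reproducible

def heads_frequency(n_tosses):
    """Fraction of heads observed in n_tosses fair-coin flips."""
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

for n in (10, 1_000, 100_000):
    print(n, heads_frequency(n))
# The frequencies drift toward 0.5 as n grows, as the Law of Large
# Numbers predicts; an individual small sample can still stray widely.
```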

Non-Degenerate Gamblers (cont.) Around 1760, a dissenting English minister named Thomas Bayes made a striking advance in statistics by demonstrating how to make better-informed decisions by mathematically blending new information with old information. With some exceptions, virtually all the tools we use today in the analysis of decisions and choice, from the strict rationality of game theory to the challenges of chaos theory, stem from the developments that took place between 1654 and 1760.

Today's Class. In today's class we will focus on the ideas developed by these remarkable thinkers. Then we will apply the calculus of probabilities to a wide variety of situations requiring the use of sophisticated counting techniques. We shall often refer to games of chance; yet these games have applications that extend far beyond the spin of the roulette wheel.

Introduction. In everyday life, we often talk loosely about chance: What are the chances of getting a job? Of meeting someone? Of rain tomorrow? But for the purposes of this class, we need to give the word chance a definite, clear interpretation. This turns out to be hard, and mathematicians have struggled with the job for centuries. We will focus on the frequency theory, which works best for processes that can be repeated over and over again, independently and under the same conditions. Games of chance fall into this category, and, as we already know, much of the frequency theory was developed to solve gambling problems.

Tossing Coins. One of the simplest games of chance involves betting on the toss of a coin. If a coin is tossed in the air, it is certain that the coin will come down, but it is not certain that, say, it will come up heads. When trying to figure chances, it is usually very helpful to list all the possible ways that a chance process can turn out. In the case of a coin toss, we ordinarily agree to regard heads and tails as the only possible outcomes. If we denote these outcomes by H and T respectively, then each outcome corresponds to exactly one of the elements of the set {H, T}.

Tossing Coins Probability One of the simplest games of chance involves betting on the toss of a coin. If a coin is tossed in the air, then it is certain that the coin will come down, but it is not certain that, say, it will come up head. When trying to figure chances, it is usually very helpful to list all the possible ways that a chance process can turn out. In the case of a coin toss, we ordinarily agree to regard head and tail as the only possible outcomes. If we denote these outcomes by H and T respectively, then each outcome would correspond to exactly one of the elements of the set {H, T }.

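The long-run regularity behind "figuring chances" can be checked empirically. The following is a minimal Python sketch (not from the lecture): it tosses a simulated fair coin 10,000 times and reports the relative frequency of heads, which should settle near 1/2.

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Toss a simulated fair coin many times and track the relative
# frequency of heads; it should settle near 1/2.
n = 10_000
heads = sum(random.choice("HT") == "H" for _ in range(n))
freq = heads / n
print(freq)
```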
Sample Space Suppose now that we repeat this experiment of tossing a coin. The process of tossing the coin can certainly be repeated over and over again, independently and under the same conditions: the outcomes of this experiment would still correspond to exactly one of the elements of the set {H, T}. This set, Ω = {H, T}, is called a sample space for the experiment. Suppose now that we toss a die and observe the number that appears on top. Then the sample space consists of the six possible numbers: Ω = {1, 2, 3, 4, 5, 6}.

Sample Space (cont.) Definition The set Ω of all possible outcomes associated with a real or conceptual experiment is called the sample space. A particular outcome ω, i.e. an element ω ∈ Ω, is called a sample point or sample.

Tossing Coins, Again Suppose now that we perform the following experiment: we toss a coin three successive times. Let Ω = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT} be the associated sample space. We may be interested in the event "the number of heads exceeds the number of tails." For any outcome of the experiment we can determine whether this event does or does not occur. We find that HHH, HHT, HTH and THH are the only elements of Ω corresponding to outcomes for which this event does occur.

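This enumeration is easy to reproduce mechanically. As a hedged Python sketch (not part of the lecture), we can list the eight outcomes of three tosses and filter out the ones where heads outnumber tails:

```python
from itertools import product

# Enumerate the sample space for three successive tosses of a coin.
omega = {"".join(t) for t in product("HT", repeat=3)}

# The event "the number of heads exceeds the number of tails" is the
# subset of outcomes with more H's than T's.
A = {w for w in omega if w.count("H") > w.count("T")}
print(sorted(A))  # ['HHH', 'HHT', 'HTH', 'THH']
```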
Events To say that the event "the number of heads exceeds the number of tails" occurs is the same as saying the experiment results in an outcome corresponding to an element of the set A = {HHH, HHT, HTH, THH}. Notice that A is a subset of the sample space Ω. Definition An event A is a set of outcomes or, in other words, a subset of some underlying sample space Ω.

Events (cont.) The event {ω} consisting of a single sample point ω ∈ Ω is called an elementary event. The certain (or sure) event, which always occurs regardless of the outcome of the experiment, is formally identical with the whole space Ω. The impossible event is the empty set ∅, containing none of the elementary events ω.

Events (cont.) We can combine events to form new events using the various set operations: (i) A₁ ∪ A₂ is the event that occurs iff A₁ occurs or A₂ occurs (or both); (ii) A₁ ∩ A₂ is the event that occurs iff A₁ occurs and A₂ occurs; (iii) A′ = Ω \ A, the complement of A, is the event that occurs iff A does not occur.

Events (cont.) Given two events A₁ and A₂, suppose A₁ occurs if and only if A₂ occurs. Then A₁ and A₂ are said to be identical (or equivalent), and we write A₁ = A₂. Two events A₁ and A₂ are called mutually exclusive if they are disjoint, i.e. if A₁ ∩ A₂ = ∅. In other words, A₁ and A₂ are mutually exclusive if they cannot occur simultaneously.

Events: Example Suppose that we toss a die and observe the number that appears on top. The sample space consists of the six possible numbers: Ω = {1, 2, 3, 4, 5, 6}. Let A₁ be the event that an even number occurs, A₂ that an odd number occurs, and A₃ that a number higher than 3 occurs. Then A₁ = {2, 4, 6}, A₂ = {1, 3, 5}, and A₃ = {4, 5, 6}. And: A₁ ∪ A₃ = {2, 4, 5, 6} is the event that an even number or a number higher than 3 occurs; A₁ ∩ A₃ = {4, 6} is the event that an even number higher than 3 occurs; A₃′ = {1, 2, 3} is the event that a number higher than 3 does not occur.

Events: Example (cont.) Note that A₁ and A₂ are mutually exclusive: A₁ ∩ A₂ = ∅. In other words, an even number and an odd number cannot occur simultaneously.

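Since events are just sets, the operations above map directly onto set operations in code. The following Python sketch (an illustration, not from the lecture) reproduces the die example, including the mutual-exclusivity check:

```python
omega = {1, 2, 3, 4, 5, 6}   # sample space for one toss of a die
A1 = {2, 4, 6}               # an even number occurs
A2 = {1, 3, 5}               # an odd number occurs
A3 = {4, 5, 6}               # a number higher than 3 occurs

print(A1 | A3)      # union: even or higher than 3 -> {2, 4, 5, 6}
print(A1 & A3)      # intersection: even and higher than 3 -> {4, 6}
print(omega - A3)   # complement of A3 -> {1, 2, 3}
print(A1 & A2)      # A1 and A2 are mutually exclusive -> set()
```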
Outline: 1. Probability: Some Background; 2. Finite Probability Spaces; 3. Conditional and Compound Probabilities; Bayes' Theorem; Independent Events.

Finite Probability Spaces: Definition Let Ω be a finite sample space; say, Ω = {ω₁, ω₂, ..., ωₙ}. A finite probability space is obtained by assigning to each sample point ωᵢ ∈ Ω a real number pᵢ, called the probability of ωᵢ, satisfying the following properties: (i) each pᵢ is non-negative, pᵢ ≥ 0; (ii) the sum of the pᵢ is one, p₁ + p₂ + ⋯ + pₙ = 1. The probability P(A) of an event A is then defined to be the sum of the probabilities of the sample points in A. For notational convenience we write P(ωᵢ) for P({ωᵢ}).

Finite Probability Spaces: Example Let three coins be tossed and the number of heads observed; then Ω = {0, 1, 2, 3} is a sample space associated with this experiment. We obtain a probability space by the following assignment: P(0) = 1/8, P(1) = 3/8, P(2) = 3/8, and P(3) = 1/8, since each probability is non-negative and the sum of the probabilities is 1. Now let A₁ be the event that at least one head appears and let A₂ be the event that all heads or all tails appear: A₁ = {1, 2, 3} and A₂ = {0, 3}. Then, by definition, P(A₁) = P(1) + P(2) + P(3) = 3/8 + 3/8 + 1/8 = 7/8 and P(A₂) = P(0) + P(3) = 1/8 + 1/8 = 1/4.

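The definition "P(A) is the sum of the probabilities of the sample points in A" translates directly into code. A minimal Python sketch of the three-coin example, using exact fractions so the arithmetic matches the slide (illustrative only; the `prob` helper is our own name):

```python
from fractions import Fraction

# Probability assignment for the number of heads in three coin tosses.
p = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
assert all(q >= 0 for q in p.values()) and sum(p.values()) == 1  # valid assignment

def prob(event):
    """P(A): the sum of the probabilities of the sample points in A."""
    return sum(p[w] for w in event)

A1 = {1, 2, 3}   # at least one head
A2 = {0, 3}      # all heads or all tails
print(prob(A1))  # 7/8
print(prob(A2))  # 1/4
```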
Assignment of Probabilities Note that the probability of an event depends on the prior assignment of probabilities to the sample points. The following question then arises: which assignment of probabilities to sample points should be made? The answer to this question is not a mathematical one. Rather, it depends upon our assessment of the real-world situation to which the theory is to be applied.

Axioms of Probability Let Ω be a sample space, let C be the class of all events, and let P be a real-valued function defined on C. Then P is called a probability function, and P(A) is called the probability of the event A, if the following axioms hold: 1. The probability of the certain event is 1: P(Ω) = 1. 2. If A is any event, then 0 ≤ P(A) ≤ 1. 3. If A₁ and A₂ are mutually exclusive events, then P(A₁ ∪ A₂) = P(A₁) + P(A₂).

Axioms of Probability (cont.) More generally, given n mutually exclusive events A₁, A₂, ..., Aₙ, we have the formula P(A₁ ∪ A₂ ∪ ⋯ ∪ Aₙ) = P(A₁) + P(A₂) + ⋯ + P(Aₙ). This equation is called the addition rule for probabilities. We can now prove a few theorems which follow directly from our axioms.

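The addition rule is easy to verify numerically for any collection of pairwise disjoint events. A Python sketch for a fair die (illustrative; the uniform assignment and `prob` helper are our own):

```python
from fractions import Fraction

# Uniform probability assignment for a fair die.
p = {w: Fraction(1, 6) for w in range(1, 7)}

def prob(event):
    return sum(p[w] for w in event)

# Pairwise disjoint events: the addition rule says the probability
# of their union equals the sum of their probabilities.
events = [{1}, {2, 3}, {5}]
union = set().union(*events)
print(prob(union))                   # 2/3
print(sum(prob(A) for A in events))  # 2/3
```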
Axioms of Probability (cont.) Theorem If ∅ is the empty set, then P(∅) = 0. Proof. Let A be any set; then A and ∅ are disjoint and A ∪ ∅ = A. By axiom 3, P(A) = P(A ∪ ∅) = P(A) + P(∅). Subtracting P(A) from both sides gives our result.

Axioms of Probability (cont.) Theorem If A′ is the complement of an event A, then P(A′) = 1 − P(A). In words, the probability that A does not occur is obtained by subtracting from 1 the probability that A does occur. Proof. The sample space Ω can be decomposed into the mutually exclusive events A and A′; that is, Ω = A ∪ A′. By axioms 1 and 3 we obtain 1 = P(Ω) = P(A ∪ A′) = P(A) + P(A′), from which our result follows.

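The complement rule can be checked on the die example from earlier. A short Python sketch (illustrative; the uniform assignment and `prob` helper are our own):

```python
from fractions import Fraction

omega = set(range(1, 7))
p = {w: Fraction(1, 6) for w in omega}

def prob(event):
    return sum(p[w] for w in event)

A = {4, 5, 6}         # a number higher than 3
A_comp = omega - A    # its complement
print(prob(A_comp))   # 1/2
print(1 - prob(A))    # 1/2, confirming P(A') = 1 - P(A)
```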
Axioms of Probability (cont.) Theorem If A₁ ⊆ A₂, then P(A₁) ≤ P(A₂). In words, if A₁ implies A₂, then the probability of A₁ cannot exceed the probability of A₂. Proof. If A₁ ⊆ A₂, then A₂ can be decomposed into the mutually exclusive events A₁ and A₂ \ A₁. Therefore, P(A₂) = P(A₁) + P(A₂ \ A₁). The result now follows from the fact that P(A₂ \ A₁) ≥ 0.

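The decomposition step in this proof can be traced concretely. A Python sketch using the die events from the earlier example (illustrative; the uniform assignment and `prob` helper are our own):

```python
from fractions import Fraction

omega = set(range(1, 7))
p = {w: Fraction(1, 6) for w in omega}

def prob(event):
    return sum(p[w] for w in event)

A1 = {4, 6}       # even and higher than 3
A2 = {4, 5, 6}    # higher than 3; note A1 is a subset of A2
assert A1 <= A2

# Decompose A2 into the disjoint pieces A1 and A2 \ A1, as in the proof.
assert prob(A2) == prob(A1) + prob(A2 - A1)
print(prob(A1), prob(A2))  # 1/3 1/2
```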