PHYSICS 140A : STATISTICAL PHYSICS HW ASSIGNMENT #1 SOLUTIONS


(1) The information entropy of a distribution $\{p_n\}$ is defined as $S = -\sum_n p_n \log_2 p_n$, where $n$ ranges over all possible configurations of a given physical system and $p_n$ is the probability of the state $n$. If there are $\Omega$ possible states and each state is equally likely, then $S = \log_2 \Omega$, which is the usual dimensionless entropy in units of $\ln 2$.

Consider a normal deck of 52 distinct playing cards. A new deck is always prepared in the same order (A 2 $\cdots$ K).

(a) What is the information entropy of the distribution of new decks?

(b) What is the information entropy of a distribution of completely randomized decks?

Now consider what it means to shuffle the cards. In an ideal riffle shuffle, the deck is split into two equal halves of 26 cards each. One then chooses at random whether to take a card from either half, until one runs through all the cards and a new order is established (see Fig. 1).

Figure 1: The riffle shuffle.

(c) What is the increase in information entropy for a distribution of new decks that each have been shuffled once?

(d) Assuming each subsequent shuffle results in the same entropy increase (i.e. neglecting redundancies), how many shuffles are necessary in order to completely randomize a deck?

(e) If in parts (b), (c), and (d) you were to use Stirling's approximation,

$K! \approx K^K \, e^{-K} \, \sqrt{2\pi K}$,

how would your answers have differed?

Solution:

(a) Since each new deck arrives in the same order, we have $p_1 = 1$ while $p_2 = \cdots = p_{52!} = 0$. Therefore $S = 0$.

(b) For completely randomized decks, $p_n = 1/\Omega$ with $n \in \{1, \ldots, \Omega\}$ and $\Omega = 52!$, the total number of possible configurations. Thus $S_{\rm random} = \log_2 52! = 225.581$.

(c) After one riffle shuffle, there are $\Omega = \binom{52}{26}$ possible configurations. If all such configurations were equally likely, we would have $(\Delta S)_{\rm riffle} = \log_2 \binom{52}{26} = 48.817$. However, they are not all equally likely. For example, the probability that we drop the entire left half-deck and then the entire right half-deck is $2^{-26}$: after the last card from the left half-deck is dropped, we have no more choices to make. On the other hand, the probability for the sequence LRLR$\cdots$ is $2^{-51}$, because it is only after the 51st card is dropped that we have no more choices.

We can derive an exact expression for the entropy of the riffle shuffle in the following manner. Consider a deck of $N = 2K$ cards. The probability that we run out of choices after $K$ cards is the probability of the first $K$ cards dropped being all from one particular half-deck, which is $2 \cdot 2^{-K}$. Now let us ask what is the probability that we run out of choices after $(K+1)$ cards are dropped. If all the remaining $(K-1)$ cards are from the right half-deck, this means that we must have one of the R cards among the first $K$ dropped. Note that this R card cannot be the $(K+1)$th card dropped, since then all of the first $K$ cards would be L, which we have already considered. Thus, there are $\binom{K}{1} = K$ such configurations, each with probability $2^{-(K+1)}$. Next, suppose we run out of choices after $(K+2)$ cards are dropped. If the remaining $(K-2)$ cards are R, this means we must have 2 of the R cards among the first $(K+1)$ dropped, which means $\binom{K+1}{2}$ possibilities. Note that the $(K+2)$th card must be L, since if it were R this would mean that the last $(K-1)$ cards are R, which we have already considered. Continuing in this manner, we conclude

$\Omega_K = 2 \sum_{n=0}^{K-1} \binom{K-1+n}{n} = \binom{2K}{K}$

and

$S_K = -\sum_{a=1}^{\Omega_K} p_a \log_2 p_a = 2 \sum_{n=0}^{K-1} \binom{K-1+n}{n} \, 2^{-(K+n)} \, (K+n)$.

The results are tabulated below in Table 1. For a deck of 52 cards, the actual entropy per riffle shuffle is $S_{26} = 46.274$.

    K      Ω_K             S_K        log_2 Ω_K
    2      6               2.500      2.585
    12     2704156         20.132     21.367
    26     4.96 × 10^14    46.274     48.817
    100    9.05 × 10^58    188.730    195.851

Table 1: Riffle shuffle results.

(d) Ignoring redundancies, we require $k = S_{\rm random}/(\Delta S)_{\rm riffle} = 4.62$ shuffles if we assume all riffle outcomes are equally likely, and $k = 4.88$ if we use the exact result for the riffle entropy. Since there are no fractional shuffles, we round up to $k = 5$ in both cases. In fact, computer experiments show that the answer is $k = 9$. The reason we are so far off is that we have ignored redundancies, i.e. we have assumed that all the states produced by two consecutive riffle shuffles are distinct. They are not! For decks with asymptotically large numbers of cards $N$, the number of riffle shuffles required is $k \simeq \frac{3}{2} \log_2 N$. See D. Bayer and P. Diaconis, Annals of Applied Probability 2, 294 (1992).

(e) Using the first four terms of Stirling's approximation of $\ln K!$, i.e. out to $O(K^0)$, we find $\log_2 52! \approx 225.579$ and $\log_2 \binom{52}{26} \approx 48.824$.
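As a numerical cross-check (our addition, not part of the original solution set), the short Python sketch below recomputes $S_{\rm random} = \log_2 52!$ and evaluates the exact riffle entropy $S_K$ from the sum derived above, together with $\log_2 \binom{2K}{K}$, for the values of $K$ in Table 1. The function name riffle_entropy is ours.

    from math import comb, factorial, log2

    def riffle_entropy(K):
        # S_K = 2 * sum_{n=0}^{K-1} C(K-1+n, n) * 2^{-(K+n)} * (K+n)
        return 2 * sum(comb(K - 1 + n, n) * 2.0 ** (-(K + n)) * (K + n)
                       for n in range(K))

    print("log2 52! =", log2(factorial(52)))   # S_random, ~225.581
    for K in (2, 12, 26, 100):
        omega = comb(2 * K, K)                 # total riffle outcomes
        print(K, omega, riffle_entropy(K), log2(omega))

The printed rows should reproduce Table 1.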

(2) In problem #1, we ran across Stirling's approximation,

$\ln K! \simeq K \ln K - K + \tfrac{1}{2} \ln(2\pi K) + O(K^{-1})$,

for large $K$. In this exercise, you will derive this expansion.

(a) Start by writing

$K! = \int_0^\infty \! dx \, x^K e^{-x}$,

and define $x \equiv K(t+1)$ so that $K! = K^{K+1} e^{-K} F(K)$, where

$F(K) = \int_{-1}^\infty \! dt \, e^{K f(t)}$.

Find the function $f(t)$.

(b) Expand $f(t) = \sum_{n=0}^\infty f_n t^n$ in a Taylor series and find a general formula for the expansion coefficients $f_n$. In particular, show that $f_0 = f_1 = 0$ and that $f_2 = -\tfrac{1}{2}$.

(c) If one ignores all the terms but the lowest order (quadratic) in the expansion of $f(t)$, show that

$\int_{-1}^\infty \! dt \, e^{-Kt^2/2} = \sqrt{\frac{2\pi}{K}} - R(K)$,

and show that the remainder $R(K) > 0$ is bounded from above by a function which decreases faster than any polynomial in $1/K$.

(d) For the brave only! Find the $O(K^{-1})$ term in the expansion for $\ln K!$.

Solution:

(a) Setting $x \equiv K(t+1)$, we have

$K! = K^{K+1} e^{-K} \int_{-1}^\infty \! dt \, (t+1)^K e^{-Kt}$,

hence $f(t) = \ln(t+1) - t$.

(b) The Taylor expansion of $f(t)$ is

$f(t) = -\tfrac{1}{2} t^2 + \tfrac{1}{3} t^3 - \tfrac{1}{4} t^4 + \ldots$,

i.e. $f_0 = f_1 = 0$ and $f_n = (-1)^{n-1}/n$ for $n \geq 2$; in particular, $f_2 = -\tfrac{1}{2}$.

(c) Retaining only the leading (quadratic) term in the Taylor expansion of $f(t)$, we have

$F(K) \approx \int_{-1}^\infty \! dt \, e^{-Kt^2/2} = \sqrt{\frac{2\pi}{K}} - \int_{-\infty}^{-1} \! dt \, e^{-Kt^2/2}$.

Writing $t \equiv -(1+s)$, the remainder is found to be

$R(K) = e^{-K/2} \int_0^\infty \! ds \, e^{-Ks^2/2} \, e^{-Ks} < \sqrt{\frac{\pi}{2K}} \, e^{-K/2}$,

which decreases exponentially with $K$, faster than any power of $1/K$.

(d) We have

$F(K) = \int_{-1}^\infty \! dt \, e^{-\frac{1}{2}Kt^2} \, e^{\frac{1}{3}Kt^3 - \frac{1}{4}Kt^4 + \ldots}$

$\quad\;\; \approx \int_{-\infty}^\infty \! dt \, e^{-\frac{1}{2}Kt^2} \left\{ 1 + \tfrac{1}{3}Kt^3 - \tfrac{1}{4}Kt^4 + \tfrac{1}{18}K^2 t^6 + \ldots \right\}$

$\quad\;\; = \sqrt{\frac{2\pi}{K}} \left\{ 1 - \tfrac{3}{4}K^{-1} + \tfrac{5}{6}K^{-1} + O(K^{-2}) \right\} = \sqrt{\frac{2\pi}{K}} \left\{ 1 + \tfrac{1}{12}K^{-1} + O(K^{-2}) \right\}$,

where the odd ($t^3$) term vanishes by symmetry and we have used the Gaussian moments $\int \! dt \, t^4 \, e^{-Kt^2/2} = 3K^{-2}\sqrt{2\pi/K}$ and $\int \! dt \, t^6 \, e^{-Kt^2/2} = 15K^{-3}\sqrt{2\pi/K}$. Thus,

$\ln K! = K \ln K - K + \tfrac{1}{2} \ln K + \tfrac{1}{2} \ln(2\pi) + \tfrac{1}{12} K^{-1} + O(K^{-2})$.
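As a quick sanity check on part (d) (our addition, not part of the original solution set), one can compare $\ln K!$, computed via the log-gamma function, against the truncated series with and without the $\tfrac{1}{12}K^{-1}$ correction:

    from math import lgamma, log, pi

    def stirling(K, correction=True):
        # K ln K - K + (1/2) ln(2 pi K), optionally plus the O(1/K) term
        s = K * log(K) - K + 0.5 * log(2 * pi * K)
        if correction:
            s += 1.0 / (12 * K)   # the term found in part (d)
        return s

    for K in (5, 52, 500):
        exact = lgamma(K + 1)     # ln K!
        print(K, exact, stirling(K, False), stirling(K, True))

Including the $\tfrac{1}{12K}$ term improves the agreement with $\ln K!$ dramatically, consistent with the next correction being higher order in $1/K$.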

(3) A six-sided die is loaded so that the probability to throw a three is twice that of throwing a two, and the probability of throwing a four is twice that of throwing a five.

(a) Find the distribution $\{p_n\}$ consistent with maximum entropy, given these constraints.

(b) Assuming the maximum entropy distribution, given two such identical dice, what is the probability to roll a total of seven if both are thrown simultaneously?

Solution:

(a) We have the following constraints:

$X^{(0)}(p) = p_1 + p_2 + p_3 + p_4 + p_5 + p_6 - 1 = 0$
$X^{(1)}(p) = p_3 - 2p_2 = 0$
$X^{(2)}(p) = p_4 - 2p_5 = 0$.

We define

$S^*(p, \lambda) \equiv -\sum_n p_n \ln p_n - \sum_{a=0}^{2} \lambda_a X^{(a)}(p)$,

and freely extremize over the probabilities $\{p_1, \ldots, p_6\}$ and the undetermined Lagrange multipliers $\{\lambda_0, \lambda_1, \lambda_2\}$. We obtain

$0 = \partial S^*/\partial p_1 = -\ln p_1 - 1 - \lambda_0$
$0 = \partial S^*/\partial p_2 = -\ln p_2 - 1 - \lambda_0 + 2\lambda_1$
$0 = \partial S^*/\partial p_3 = -\ln p_3 - 1 - \lambda_0 - \lambda_1$
$0 = \partial S^*/\partial p_4 = -\ln p_4 - 1 - \lambda_0 - \lambda_2$
$0 = \partial S^*/\partial p_5 = -\ln p_5 - 1 - \lambda_0 + 2\lambda_2$
$0 = \partial S^*/\partial p_6 = -\ln p_6 - 1 - \lambda_0$.

Extremizing with respect to the undetermined multipliers generates the three constraint equations. We therefore have

$p_1 = e^{-(1+\lambda_0)}$, $\quad p_2 = e^{-(1+\lambda_0)} e^{2\lambda_1}$, $\quad p_3 = e^{-(1+\lambda_0)} e^{-\lambda_1}$,
$p_4 = e^{-(1+\lambda_0)} e^{-\lambda_2}$, $\quad p_5 = e^{-(1+\lambda_0)} e^{2\lambda_2}$, $\quad p_6 = e^{-(1+\lambda_0)}$.

We solve for $\{\lambda_0, \lambda_1, \lambda_2\}$ by imposing the three constraints. Let $x \equiv p_1 = p_6 = e^{-(1+\lambda_0)}$. Then $p_2 = x \, e^{2\lambda_1}$, $p_3 = x \, e^{-\lambda_1}$, $p_4 = x \, e^{-\lambda_2}$, and $p_5 = x \, e^{2\lambda_2}$. We then have

$p_3 = 2p_2 \;\Rightarrow\; e^{-3\lambda_1} = 2$
$p_4 = 2p_5 \;\Rightarrow\; e^{-3\lambda_2} = 2$.

We may now solve for $x$:

$\sum_{n=1}^{6} p_n = \left( 2 + 2^{1/3} + 2^{4/3} \right) x = 1 \;\Rightarrow\; x = \frac{1}{2 + 3 \cdot 2^{1/3}}$.

We now have all the probabilities:

$p_1 = x \approx 0.1730$, $\quad p_2 = 2^{-2/3} x \approx 0.1090$, $\quad p_3 = 2^{1/3} x \approx 0.2180$,
$p_4 = 2^{1/3} x \approx 0.2180$, $\quad p_5 = 2^{-2/3} x \approx 0.1090$, $\quad p_6 = x \approx 0.1730$.

(b) The probability to roll a seven with two of these dice is

$P(7) = 2 p_1 p_6 + 2 p_2 p_5 + 2 p_3 p_4 = 2 \left( 1 + 2^{-4/3} + 2^{2/3} \right) x^2 \approx 0.1787$.
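The closed-form answer is easy to verify numerically. Here is a minimal Python sketch (our addition; the variable names are arbitrary) that reproduces the maximum entropy probabilities and $P(7)$:

    # maximum-entropy probabilities for the loaded die
    x = 1.0 / (2 + 3 * 2 ** (1 / 3))
    p = {1: x, 2: 2 ** (-2 / 3) * x, 3: 2 ** (1 / 3) * x,
         4: 2 ** (1 / 3) * x, 5: 2 ** (-2 / 3) * x, 6: x}

    print(sum(p.values()))                            # normalization check: 1.0
    print({n: round(pn, 4) for n, pn in p.items()})   # p_1 ... p_6

    # probability that two such dice total seven
    print(sum(p[a] * p[7 - a] for a in range(1, 7)))  # ~0.1787

The constraints $p_3 = 2p_2$ and $p_4 = 2p_5$ can be read off directly from the printed values.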

(4) The probability density for a random variable $x$ is given by the Lorentzian,

$P(x) = \frac{\gamma}{\pi} \cdot \frac{1}{x^2 + \gamma^2}$.

Consider the sum $X_N = \sum_{i=1}^N x_i$, where each $x_i$ is independently distributed according to $P(x_i)$. Find the probability $\Pi_N(Y)$ that $|X_N| < Y$, where $Y > 0$ is arbitrary.

Solution: As discussed in §1.4.2 of the Lecture Notes, the distribution of a sum of identically distributed random variables, $X_N = \sum_{i=1}^N x_i$, is given by

$P_N(X) = \int_{-\infty}^\infty \frac{dk}{2\pi} \left[ \hat{P}(k) \right]^N e^{ikX}$,

where $\hat{P}(k)$ is the Fourier transform of the probability distribution $P(x_i)$ for each of the $x_i$. The Fourier transform of a Lorentzian is an exponential:

$\hat{P}(k) = \int_{-\infty}^\infty \! dx \, P(x) \, e^{-ikx} = e^{-\gamma |k|}$.

Thus,

$P_N(X) = \int_{-\infty}^\infty \frac{dk}{2\pi} \, e^{-N\gamma|k|} \, e^{ikX} = \frac{N\gamma}{\pi} \cdot \frac{1}{X^2 + N^2\gamma^2}$.

The probability for $X_N$ to lie in the interval $[-Y, Y]$, where $Y > 0$, is

$\Pi_N(Y) = \int_{-Y}^{Y} \! dX \, P_N(X) = \frac{2}{\pi} \tan^{-1}\!\left( \frac{Y}{N\gamma} \right)$.

The integral is easily performed with the substitution $X = N\gamma \tan\theta$. Note that $\Pi_N(0) = 0$ and $\Pi_N(\infty) = 1$.
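As a final consistency check (our addition, not part of the original solution set), one can estimate $\Pi_N(Y)$ by direct simulation, drawing Lorentzian variates via inverse-CDF sampling, $x = \gamma \tan\big(\pi (u - \tfrac{1}{2})\big)$ with $u$ uniform on $(0,1)$. The parameter values below are arbitrary:

    import random
    from math import atan, pi, tan

    N, gamma, Y, trials = 10, 1.0, 5.0, 200_000
    random.seed(0)

    def lorentzian(gamma):
        # inverse-CDF sample of P(x) = (gamma/pi) / (x^2 + gamma^2)
        return gamma * tan(pi * (random.random() - 0.5))

    hits = sum(abs(sum(lorentzian(gamma) for _ in range(N))) < Y
               for _ in range(trials))
    print("empirical:", hits / trials)
    print("exact:    ", (2 / pi) * atan(Y / (N * gamma)))

Both numbers should agree to within the Monte Carlo error, roughly $0.295$ for these parameters. Note that the width of $P_N(X)$ grows linearly in $N$, not as $\sqrt{N}$: the Lorentzian has no finite variance, so the central limit theorem does not apply.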