PROBLEM SET 1

1. (Geanakoplos, 1992) Imagine three girls sitting in a circle, each wearing either a red hat or a white hat. Each girl can see the color of the other two girls' hats, but not the color of her own. Suppose the teacher puts a red hat on all three girls. When the teacher asks whether any student can identify the color of her own hat, the answer is always negative, since nobody can see her own hat. The teacher then remarks that there is at least one red hat in the room, a fact already known to all of them (each girl sees two others wearing red hats), and again asks each student whether she can identify the color of her own hat. What are the answers given by the first, second, and third students, if they are asked sequentially and the answers are perfectly observable?

2. (Geanakoplos, 1992) An honest but mischievous father tells his two sons that he has placed 10^n dollars in one envelope and 10^(n+1) dollars in the other, where n is chosen with equal probability among the integers between 1 and 6. The sons completely believe their father. He randomly hands each son an envelope. The first son looks inside his envelope and finds $10,000. Disappointed at the meager amount, he calculates that the odds are fifty-fifty that he has the smaller amount. Since the other envelope contains either $1,000 or $100,000 with equal probability, the first son reckons that the expected amount in the other envelope is $50,500. The second son finds only $1,000 in his envelope. Based on his information, he expects to find either $100 or $10,000 in his brother's envelope, which at equal odds comes to an expectation of $5,050. The father privately asks each son whether he would be willing to pay $1 to switch envelopes, in effect betting that the other envelope has more money. Both sons say yes. The father then tells each son what his brother said and repeats the question. Again both say yes.
The father relays the brothers' answers and asks each a third time whether he is willing to pay $1 to switch envelopes. Again both say yes. But if the father relays their answers and asks each a fourth time, the son with $1,000 will say yes, while the son with $10,000 will say no. Why?

3. There once was a small island whose inhabitants had a strange tradition: anyone who discovered his or her own eye color was obligated to commit ritual suicide by jumping off a cliff at midnight on the day the discovery was made. Because of this tradition, the topic of eye color was taboo on the island. Everyone on the island knew everyone else's eye color. (In fact, because it was such a small island, everyone knew just about everything about everyone else.) But no one knew his or her own eye color, and so the island's seven blue-eyed inhabitants lived in peace and happiness with their fifteen green-eyed and three hundred brown-eyed compatriots. All went smoothly until a visiting stranger, unfamiliar with the taboos of the island, made a shocking declaration: "Some of you have blue eyes." The stranger spoke loudly, in the middle of the town square, where it was clear everyone could hear. At midnight six days later, all the blue-eyed people killed themselves. What the stranger said wasn't news: even before he spoke, everyone knew that some of the island's inhabitants had blue eyes. So how did his utterance enable the blue-eyed people to learn their own eye color? (You may assume that strangers are commonly known to be truthful.)
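The induction behind problem 3 can be sketched mechanically (a hedged illustration of the standard argument, not a full solution). Note the counting convention: if the midnight of the announcement day counts as night 1, then night 7 is the midnight six days after the declaration.

```python
def deduction_night(blue_count):
    # Each blue-eyed islander sees `blue_count - 1` blue-eyed others.
    # She reasons: "if my eyes are not blue, the blue-eyed people I see
    # will all jump on the night numbered by how many of them there are;
    # if that night passes quietly, my eyes must be blue, and I jump on
    # the following night."  Unwinding the induction, k blue-eyed
    # islanders all act on night k.
    night = 1
    others_seen = blue_count - 1
    while night <= others_seen:  # each quiet night rules out a smaller count
        night += 1
    return night

print(deduction_night(1))  # 1: a lone blue-eyed islander learns immediately
print(deduction_night(7))  # 7: the island's seven blue-eyed inhabitants
```

The brown- and green-eyed islanders, who each see seven blue-eyed people, are still waiting for an eighth quiet night when the blue-eyed seven act.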
4. Two distinct proposals, A and B, are being debated in Washington. Congress likes A and the President likes B. The proposals are not mutually exclusive; either, both, or neither may become law. Congress and the President receive the following payoffs from the four possible outcomes:

Outcome                      Congress   President
A becomes law                    4          1
B becomes law                    1          4
Both A and B become law          3          3
Neither becomes law              2          2

(a) Suppose the game has the following structure. First, Congress decides whether to pass a bill and whether it contains just A, just B, or both. Then the President decides whether to sign the bill into law or to veto it. Congress does not have enough votes to override a veto. Draw the game tree. Find the Nash equilibrium of this game.

(b) Now suppose the President is given the extra power of a line-item veto. Thus the President can sign an entire bill into law, veto an entire bill, or veto part of a bill and sign the rest into law. As before, Congress does not have enough votes to override a veto. Draw the game tree. Find the Nash equilibrium of this game.

5. Consider a game similar to the one we played in class: two players alternate choosing numbers, which are added to a running total. The game ends when the running total reaches x or more, and the winner is the player whose choice takes the total to x or more. The difference now is that each player can use each number only once, i.e., each player's strategy space shrinks as the game progresses. If the initial strategy space of each player i is S_i = {1, 2, 3, ..., y}, answer the following questions:

a) For any given y, what is the maximum x that can be chosen?

b) For any given y and feasible x, do your best to find the Nash equilibrium of this game.
Then specify under what conditions we should expect either a first-mover advantage or a second-mover advantage.

6. (Alternating-Offer Bargaining) In this game, Players 1 and 2 take turns. As in the Ultimatum Game, Player 1 receives a certain endowment and must make an offer to Player 2. The difference now is that if Player 2 rejects the offer, instead of both players getting zero, the pie being divided shrinks and Player 2 makes a counteroffer to Player 1. If Player 1 rejects the counteroffer, the game is over and neither player gets anything. If the pie sizes in the two rounds of this two-round game are $5.00 and $2.75, what is the equilibrium of this game, assuming that both players are selfish? What if there were three rounds (Player 1, Player 2, Player 1), with pie sizes $5.00, $2.50, and $1.25? And what if there were five rounds (Player 1, Player 2, Player 1, Player 2, Player 1), with pie sizes $5.00, $1.70, $0.58, $0.20, and $0.07? What if the initial endowment is 1, the discount factor in each period is δ, and the game is played for n rounds?
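The shrinking-pie game of problem 6 can be solved mechanically by backward induction. The sketch below (an illustration under the stated assumptions, with the tie broken in the proposer's favor) computes the first proposer's equilibrium share for any sequence of pie sizes:

```python
def first_proposer_share(pies):
    """Equilibrium share of pies[0] kept by the first proposer,
    assuming selfish players who accept whenever indifferent."""
    v = 0.0  # a responder who rejects the final offer gets nothing
    for pie in reversed(pies):
        # The proposer of this round offers the responder exactly her
        # continuation value v and keeps the rest; one round earlier
        # the roles are swapped, so today's proposer value becomes
        # tomorrow's responder value.
        v = pie - v
    return v

print(first_proposer_share([5.00, 2.75]))        # 2.25
print(first_proposer_share([5.00, 2.50, 1.25]))  # 3.75
print(first_proposer_share([5.00, 1.70, 0.58, 0.20, 0.07]))
```

The responder's share in each case is the first pie minus the returned value; the same loop, run on pies of the form δ^t, handles the n-round discounting variant.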
7. (Strategic Voting) Three legislators are voting to give themselves a pay raise. The raise is worth b, but each legislator who votes for it incurs a cost of voter resentment equal to c < b. The outcome is decided by majority rule.

a) Draw a game tree for the problem, assuming the legislators vote sequentially and publicly.

b) Find the equilibrium of this game using backward induction. Is it better to go first?

8. Three gangsters armed with pistols, Al, Bob, and Curly, are in a room with a suitcase containing $120,000. Al is the least accurate, with a 20% chance of killing his target. Bob kills his target with 40% probability. Curly is slow but sure: he kills his target with 70% probability. For each gangster, the value of his own life outweighs the value of any amount of money. Survivors split the money.

a) Suppose each gangster has one bullet and the order of shooting is Al first, then Bob, then Curly. Assume also that each gangster must try to kill another gangster when his turn comes. What is an equilibrium strategy profile, and what is the probability that each gangster dies in equilibrium?

b) Suppose now that each gangster has the additional option of shooting at the ceiling, which may kill somebody upstairs but has no direct effect on his own payoff. Does the strategy profile you found in part (a) remain an equilibrium? Why or why not?

c) Now generalize this game. Assume that Al kills his target with probability p, Bob with probability q, and Curly with probability r. The option of shooting at the ceiling is available to all players. Find all possible equilibria of this game.

d) In the generalized game, what is the probability that each player dies in equilibrium? If you had to choose between being the first mover, the second mover, or the third mover in this game, what would you choose?
e) What is the equilibrium of the generalized game if you add a fourth player, Dan, who moves after Curly (if Dan is still alive) and kills his target with probability s? What if you keep adding more players?

f) In the generalized game with three players (Al, Bob, and Curly), what is the equilibrium if a second round is allowed for those still alive? And what is the equilibrium if the game continues until only one man is left standing?

9. The five Dukes of Earl are scheduled to arrive at the royal palace on each of the first five days of May: Duke One on the first day, Duke Two on the second, and so on. Each Duke, upon arrival, can either kill the king or support the king. If he kills the king, he takes the king's place, becomes the new king, and awaits the next Duke's arrival. If he supports the king, all subsequent Dukes cancel their visits. A Duke's first priority is to remain alive, and his second priority is to become king. Who is king on May 6?

10. (Brams and Straffin, 1979) In the draft system used in football, basketball, and other professional sports in the United States, the teams with the worst win-loss records in the previous season get first pick of the new players in the draft. The worst team gets first choice, the next-worst team second choice, and so on. After each team has made a draft choice, the
procedure is repeated, round by round, until every team has exhausted its choices. Presumably this system, by giving the worst teams priority in the selection process, makes the teams more competitive the next season. Consider a game with only two teams (A and B) and four players in the draft (1, 2, 3, 4), with preference orderings A = {1,2,3,4} and B = {2,3,4,1}. Assume that both sets of preferences are common knowledge. Team A is the first mover. Given this information, answer the following questions:

a) If both teams choose sincerely, what is the outcome of the game? Is it Pareto optimal? Why or why not?

b) Choosing sincerely, however, is not an equilibrium of this game. If both teams act rationally, what is the equilibrium of the game? Is it Pareto optimal? Why or why not?

c) Can you find an algorithm for computing the equilibrium of the game when the number of teams is 2?

d) Now consider the same game with three teams (A, B, C) and six players (1, 2, 3, 4, 5, 6). The teams' preference orderings are A = {1,2,3,4,5,6}, B = {5,6,2,1,4,3}, and C = {3,6,5,4,1,2}. Team A moves first, team B second, and team C last. If the teams choose sincerely, what is the outcome of the game? Is it Pareto optimal? Why or why not?

e) If the teams act strategically, what is the equilibrium of the game? Is it Pareto optimal? Why or why not?

f) Now assume that B moves first, C second, and A last, and that they behave rationally. What is the equilibrium of the game? Compare this result with your findings from part (e). What do you conclude?

11. (A very simple model of team production) Analyze a version of the game discussed in class involving two workers who must choose sequentially whether to exert a high or a low level of effort. The benefit that each worker gets is equal to n·x, where n is the number of workers choosing a high level of effort and x > 1.
The cost of providing a high level of effort is c_h and the cost of a low level of effort is c_l, where c_h > c_l.

a) What is the variable x capturing? Why is it greater than 1?

b) Check for all possible Nash equilibria.

c) What happens to the probability of both workers choosing high effort when x increases? Explain the intuition.

d) Assume that the manager of the firm moves last (after the two workers have chosen their respective effort levels) and imposes a fine f on any worker who chose a low level of effort. What is the minimum fine that guarantees high effort from both workers? How does the size of the fine change with c_h, c_l, and x?

e) Now add a third worker (without a manager). In which situation are we more likely to see all workers exerting high effort: with 2 workers or with 3? Explain the intuition.

12. An incumbent in an industry faces the possibility of entry by a challenger. First the challenger chooses whether or not to enter. If it does not enter, neither firm takes any further action; the incumbent's payoff is TM (it obtains the profit M in each of the following T ≥ 1 periods) and the challenger's payoff is 0. If the challenger enters, it pays the entry cost f > 0, and in each of T periods the incumbent first commits to fight or cooperate with the challenger in that
period, then the challenger chooses whether to stay in the industry or to exit. If, in any period, the challenger stays in, each firm obtains in that period the profit F < 0 if the incumbent fights and C > max{F, f} if it cooperates. If, in any period, the challenger exits, both firms obtain zero profit in that period (regardless of the incumbent's action); in every subsequent period the incumbent obtains the profit M > 2C and the challenger the profit 0. Once the challenger exits, it cannot subsequently re-enter. Each firm cares about the sum of its profits.

a) Find the equilibria of the game that models this situation.

b) Consider a variant of the situation in which the challenger is constrained by its financial war chest, which allows it to survive at most T − 2 fights. Specifically, consider the game that differs from the one in part a) only in that the history in which the challenger enters, the incumbent fights and the challenger stays in during each of the following T − 2 periods, and the incumbent fights in period T − 1, is a terminal history (the challenger has to exit), in which the incumbent's payoff is M (it is the only firm in the industry in the last period) and the challenger's payoff is −f. Find the equilibria of this game.

13. (Divide and Choose) Two players use the following procedure to divide a cake. Player 1 divides the cake into two pieces, and then Player 2 chooses one of the pieces; Player 1 obtains the remaining piece. The cake is continuously divisible (no lumps!), each player likes all parts of it, and each player cares only about the size of the piece she obtains.

a) Find the equilibrium.

b) Come up with two different utility functions (one for Player 1 and one for Player 2) such that the equilibrium differs from the one in part a).

c) Come up with a single utility function (the same for both players) such that the equilibrium differs from the one in part a).

14.
Model the interaction between a burglar and a person who owns a house in an isolated area. The house is always empty, but it contains some valuable objects. The house owner first chooses whether or not to install an alarm system. The burglar can see whether the system has been installed, and he then chooses whether or not to break into the house. The alarm system is not free. Installing it, however, means that with some positive probability the burglar will be caught and sent to jail. These are the assumptions the model must satisfy. Feel free to add more if you think they make the model more interesting, and feel free to assign payoffs in whatever way makes most sense to you. Analyze the game and verbally explain the results.
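One possible parameterization of problem 14 can be checked by backward induction. Every number below (the value V of the objects, the alarm cost A, the jail disutility J, and the detection probability p) is an illustrative assumption, not part of the problem statement:

```python
# Illustrative parameters -- all assumed, chosen only for this sketch.
V = 10.0   # value of the objects to the burglar (and loss to the owner)
A = 3.0    # cost to the owner of installing the alarm
J = 20.0   # burglar's disutility of being caught and jailed
p = 0.5    # probability the alarm leads to the burglar's capture

def burglar_breaks_in(alarm):
    # Backward induction, last mover first: the burglar breaks in only
    # if his expected payoff beats the outside option of 0.
    expected = (1 - p) * V - p * J if alarm else V
    return expected > 0

def owner_payoff(alarm):
    cost = A if alarm else 0.0
    if not burglar_breaks_in(alarm):
        return -cost
    # Assumption: if the burglar is caught (probability p, alarm only),
    # the owner recovers the objects.
    expected_loss = (1 - p) * V if alarm else V
    return -cost - expected_loss

# The owner, moving first, anticipates the burglar's best response.
install = owner_payoff(True) > owner_payoff(False)
print(install)  # True: with these numbers, deterrence is worth the cost
```

With these particular numbers the alarm deters the burglar entirely, so the owner pays A and nothing is stolen; shrinking p or J below the deterrence threshold flips the burglar's choice, which is the kind of comparative-statics story the problem asks you to tell verbally.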