Games and decisions in management
Dr hab. inż. Adam Kasperski, prof. PWr.
Room 509, building B4, adam.kasperski@pwr.edu.pl
Slides will be available at www.ioz.pwr.wroc.pl/pracownicy
Form of the course completion: exam.
Literature
1 R.D. Luce, H. Raiffa. Games and Decisions: Introduction and Critical Survey. Dover Publications Inc., New York 1957.
2 R. Myerson. Game Theory: Analysis of Conflict. Harvard University Press 1997.
3 H. Peters. Game Theory: A Multi-Leveled Approach. Springer 2008.
4 M. Osborne, A. Rubinstein. A Course in Game Theory. MIT Press 1994.
5 H. Gintis. Game Theory Evolving: A Problem-Centered Introduction to Modeling Strategic Interaction (Second Edition). Princeton University Press 2009.
6 N. Nisan, T. Roughgarden, E. Tardos, V. Vazirani (eds.). Algorithmic Game Theory. Cambridge University Press 2007.
Decision situations (examples)
Classification of decision situations
In a decision situation we have a set A of alternatives (also called actions, strategies, solutions, moves, etc.) and we have to choose exactly one of them. For example:
1 A = {invest x dollars: 0 ≤ x ≤ 1000}
2 A = {go to the cinema, stay at home}
3 A = the set of routes from city A to city B
Choosing an alternative leads to an outcome (also called a consequence) from a set O. For example:
1 O = {-100$, 0$, 100$, 1000$}
2 O = {win, lose, draw}
3 O = {life, death}
Decision theory tries to tell us which alternative from A we should choose.
Classification of decision situations
1 Decisions under certainty. Each alternative is known to lead invariably to a specific outcome.
2 Decisions under risk. Each alternative leads to one of a set of possible outcomes, each outcome occurring with a known probability.
3 Decisions under uncertainty. Each alternative leads to one of a set of possible outcomes, but the probabilities of the outcomes are unknown.
Classification of decision situations
1 Individual decision making. The decision situation involves 1 person, who would like to maximize (minimize) his/her payoff or utility.
2 n-person game. The decision situation involves n > 1 persons (called players), who would like to maximize (minimize) their payoffs or utilities. The goals of the players are typically conflicting. We distinguish two important cases:
1 Noncooperative case. Each player makes his decision independently of the decisions made by the other players. No communication between players is allowed.
2 Cooperative case. The players can communicate and make binding agreements.
Individual decision making under certainty
A decision maker has a directed network G = (N, A) with a cost c_ij specified for every arc (i, j) ∈ A. He would like to find a cheapest path between two distinguished nodes s and t in G. Most of the classical operations research models belong to the class of individual decision making under certainty.
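A cheapest-path problem of this kind can be solved with Dijkstra's algorithm when all arc costs are nonnegative; a minimal sketch (the network below is made up for illustration):

```python
import heapq

def cheapest_path_cost(arcs, s, t):
    """Dijkstra's algorithm: cost of a cheapest s-t path in a
    directed network with nonnegative arc costs c_ij."""
    graph = {}
    for (i, j), c in arcs.items():
        graph.setdefault(i, []).append((j, c))
    dist = {s: 0}
    heap = [(0, s)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == t:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, c in graph.get(node, []):
            nd = d + c
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")  # t is unreachable from s

# Hypothetical network: arcs (i, j) with costs c_ij
arcs = {("s", "a"): 2, ("s", "b"): 5, ("a", "b"): 1,
        ("a", "t"): 7, ("b", "t"): 3}
print(cheapest_path_cost(arcs, "s", "t"))  # 6, via s -> a -> b -> t
```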
Individual decision making under risk
Suppose that you have to choose one of the three possible bets A1, A2, A3. These bets lead to the following outcomes:
1 A1: you lose 100$ with probability 0.4 and win 1000$ with probability 0.6
2 A2: you win 0$ with probability 0.2 and 100$ with probability 0.8
3 A3: you win 10$ with probability 1.
Which bet will you choose? The von Neumann-Morgenstern utility theory (presented later) will help us solve such problems.
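A first way to compare the bets is by expected value; a minimal sketch (the payoffs and probabilities come from the slide):

```python
# Each bet is a list of (payoff in $, probability) pairs, from the slide
bets = {
    "A1": [(-100, 0.4), (1000, 0.6)],
    "A2": [(0, 0.2), (100, 0.8)],
    "A3": [(10, 1.0)],
}

def expected_value(lottery):
    """Probability-weighted average payoff of a lottery."""
    return sum(payoff * prob for payoff, prob in lottery)

for name, lottery in bets.items():
    print(name, expected_value(lottery))   # A1: 560, A2: 80, A3: 10
```

By expected value alone A1 dominates, yet many people still prefer the sure 10$ of A3; capturing such risk attitudes is exactly what von Neumann-Morgenstern utilities are for.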
Individual decision making under uncertainty [Luce and Raiffa 1957]
Suppose that you have just broken 5 eggs into a bowl and you have to break a sixth one. The sixth egg may be good or rotten, and you have no idea which state is true. You can make one of the following three decisions: A1 - break the egg into the bowl, A2 - break the egg into a saucer, A3 - throw the egg away. The problem can be represented as the following table:

     Good egg                                  Rotten egg
A1   Six-egg omelet                            No omelet, five good eggs destroyed
A2   Six-egg omelet and a saucer to wash       Five-egg omelet and a saucer to wash
A3   Five-egg omelet, one good egg destroyed   Five-egg omelet

Which decision will you make?
2-person noncooperative zero-sum game [The Battle of the Bismarck Sea]
The Japanese general Imamura has to transport troops across the Bismarck Sea to New Guinea, and the American general Kenney wants to bomb the transport. Imamura has two possible choices: a shorter Northern route (2 days) or a longer Southern route (3 days), and Kenney must choose one of these two routes to send his planes to. If he chooses the wrong route he can call back the planes and send them to the other route, but the number of bombing days is then reduced by 1. The number of bombing days for each pair of decisions is shown in the following table:

               Imamura North   Imamura South
Kenney North         2               2
Kenney South         1               3

What would you do if you were Imamura or Kenney? This is a zero-sum game: the payoffs of the two players are exactly opposite.
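For a zero-sum game like this, each general can reason about his guaranteed worst case; a sketch of the maximin/minimax computation on the table above:

```python
# Bombing days: rows = Kenney's choice, columns = Imamura's choice
payoff = [[2, 2],   # Kenney North vs (Imamura North, Imamura South)
          [1, 3]]   # Kenney South

# Kenney (maximizer): secure the best worst case over Imamura's replies
row_mins = [min(row) for row in payoff]
maximin = max(row_mins)          # = 2, achieved by choosing North

# Imamura (minimizer): limit the worst case over Kenney's replies
col_maxs = [max(payoff[i][j] for i in range(2)) for j in range(2)]
minimax = min(col_maxs)          # = 2, achieved by choosing North

print(maximin, minimax)  # 2 2: the values coincide, so (North, North)
                         # is a saddle point with game value 2
```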
2-person noncooperative game [Prisoner's dilemma]
Two prisoners are on trial for a crime and each one faces a choice of confessing to the crime or remaining silent. If they both remain silent, then they will both serve a short prison term of 2 years. If only one of them confesses, then his term will be reduced to 1 year, while the term of the second prisoner will increase to 5 years. Finally, if both confess, they will both get a small break for cooperating and will have to serve a prison sentence of 4 years each.

           Confess   Silent
Confess     (4,4)    (1,5)
Silent      (5,1)    (2,2)

What should the prisoners do if they cannot communicate?
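The table can be checked mechanically for outcomes from which neither prisoner wants to deviate alone (pure Nash equilibria); a sketch, with prison terms to be minimized:

```python
from itertools import product

# Years in prison: years[(i, j)] = (player 1's term, player 2's term)
# index 0 = Confess, 1 = Silent
years = {(0, 0): (4, 4), (0, 1): (1, 5),
         (1, 0): (5, 1), (1, 1): (2, 2)}

def is_equilibrium(i, j):
    """No player can reduce his own term by deviating unilaterally."""
    best1 = min(years[(k, j)][0] for k in (0, 1))
    best2 = min(years[(i, k)][1] for k in (0, 1))
    return years[(i, j)][0] == best1 and years[(i, j)][1] == best2

equilibria = [cell for cell in product((0, 1), repeat=2)
              if is_equilibrium(*cell)]
print(equilibria)  # [(0, 0)]: both confess, even though mutual
                   # silence (2, 2) would be better for both
```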
2-person cooperative game [Battle of the sexes]
A man, player 1, and a woman, player 2, each have two choices for an evening's entertainment. Each can either go to a boxing fight or to a ballet. The man much prefers the fight and the woman the ballet. However, to both it is more important that they go out together than that each sees the preferred entertainment. The situation is shown in the following table, where the numbers denote the utilities for every combination of decisions:

                     Woman chooses fight   Woman chooses ballet
Man chooses fight          (2,1)                 (-1,-1)
Man chooses ballet        (-1,-1)                  (1,2)

What is a solution to this game? Consider the situations when the players can or cannot cooperate.
n-person cooperative game
Cities 1, 2 and 3 want to be connected with a nearby power source. The possible transmission links and their costs are shown in the picture. Each city can hire any of the transmission links. If the cities cooperate in hiring the links, then they save on the hiring costs (the links have unlimited capacity). What is a solution to this situation if the cities do not cooperate? Suppose that all three cities can cooperate. What is then a solution?
Example [Gintis 2009]
Big Monkey and Little Monkey see a large fruit on a tree. To get the fruit, at least one of the monkeys must climb the tree and shake the branch vigorously until the fruit falls to the ground. The fruit is worth 10Kc of energy. The cost of climbing the tree is 2Kc for Big Monkey but is negligible for Little Monkey. The payoff matrix (in Kc) is as follows:

            BM wait   BM climb
LM wait      (0,0)      (4,4)
LM climb     (1,9)      (3,5)

Consider the situations when:
1 Big Monkey decides first whether to climb or not.
2 Little Monkey decides first whether to climb or not.
3 Both monkeys decide simultaneously.
Example: Big Monkey decides first
Example: Little Monkey decides first
Example: Both monkeys decide simultaneously
It does not matter which monkey is in the root of the tree.
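The two sequential cases can be solved by backward induction on the payoff matrix; a sketch, assuming the payoff pairs list Little Monkey's payoff first (matching the matrix, whose rows are LM's moves):

```python
# Payoffs (Little Monkey, Big Monkey) in Kc, from the slide's matrix;
# keys are (LM's move, BM's move)
payoff = {("wait", "wait"): (0, 0), ("wait", "climb"): (4, 4),
          ("climb", "wait"): (1, 9), ("climb", "climb"): (3, 5)}

def backward_induction(first):
    """Solve the sequential game where `first` ('LM' or 'BM') moves
    first and the other monkey observes the move before replying."""
    best = None
    for move in ("wait", "climb"):
        if first == "BM":
            # LM best-responds, maximizing coordinate 0
            reply = max(("wait", "climb"), key=lambda r: payoff[(r, move)][0])
            outcome = payoff[(reply, move)]
            value = outcome[1]               # BM's own payoff
        else:
            # BM best-responds, maximizing coordinate 1
            reply = max(("wait", "climb"), key=lambda r: payoff[(move, r)][1])
            outcome = payoff[(move, reply)]
            value = outcome[0]               # LM's own payoff
        if best is None or value > best[0]:
            best = (value, move, reply, outcome)
    return best[1:]                          # (first mover's move, reply, payoffs)

print(backward_induction("BM"))  # ('wait', 'climb', (1, 9)): BM waits, LM climbs
print(backward_induction("LM"))  # ('wait', 'climb', (4, 4)): LM waits, BM climbs
```

In both orders the first mover waits and lets the other monkey climb, which is why the payoffs differ depending on who commits first.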
Example [Myerson 1997]
At the beginning of the game, players 1 and 2 each put a dollar in the pot. Next, player 1 draws a card from a shuffled deck in which half the cards are red and half are black.
Example [Myerson 1997]
Player 1 looks at his card privately and decides whether to raise or fold. If he folds, then he shows the card to player 2 and the game ends. In this case player 1 takes the money in the pot if the card is red, but player 2 takes the money in the pot if the card is black. If player 1 raises, then he adds 1 dollar to the pot and the next move belongs to player 2.
Example [Myerson 1997] Now player 2 must decide whether to meet or pass. If player 2 passes, then the game ends and player 1 takes the money in the pot. If player 2 meets, then he adds another dollar to the pot, and then player 1 shows his card. Again, player 1 takes the money in the pot if the card is red, and player 2 takes the money in the pot if the card is black.
Example [Myerson 1997]
The crucial fact in this game is that player 1 knows the color of the card but player 2 does not. So player 1 knows exactly whether he is at node 1.a or 1.b, but player 2 cannot distinguish between his two nodes. The dotted circles denote the information states of the players. Remark: information states containing a single node are typically not marked.
Example
This is a representation of a quite different game. What are the rules of this game?
Game in extensive form
A game in extensive form is a rooted tree, where:
1 A node with label 0 is a chance node, and a node with label i ∈ {1,..., n} represents a decision node controlled by player i.
2 There is a probability distribution over the branches of each chance node.
3 Every node that is controlled by a player has a second label that specifies the information state of the player at this node.
4 Each alternative at a node controlled by a player has a move label. All nodes belonging to the same information state have the same alternatives and move labels.
5 Each terminal node has a vector (u_1,..., u_n) representing the numerical payoffs of all players if the game ends in that node.
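This definition can be mirrored directly in a data structure; a minimal sketch encoding Myerson's card game (the class and field names are my own, and player -1 marks a terminal node):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    player: int                  # 0 = chance node, i >= 1 = player i's node
    info_state: str = ""         # second label: the player's information state
    moves: dict = field(default_factory=dict)   # move label -> child Node
    probs: dict = field(default_factory=dict)   # move label -> probability
    payoffs: tuple = ()          # (u_1, ..., u_n) at a terminal node

terminal = lambda *u: Node(player=-1, payoffs=u)

# Myerson's card game: chance deals red or black, then player 1 moves, etc.
root = Node(player=0, probs={"red": 0.5, "black": 0.5}, moves={
    "red": Node(player=1, info_state="1.a",
                moves={"fold": terminal(1, -1),
                       "raise": Node(player=2, info_state="2",
                                     moves={"pass": terminal(1, -1),
                                            "meet": terminal(2, -2)})}),
    "black": Node(player=1, info_state="1.b",
                  moves={"fold": terminal(-1, 1),
                         "raise": Node(player=2, info_state="2",
                                       moves={"pass": terminal(1, -1),
                                              "meet": terminal(-2, 2)})}),
})

# Both of player 2's nodes carry the same information state label "2",
# so he cannot tell whether the card was red or black
print(root.moves["red"].moves["raise"].info_state,
      root.moves["black"].moves["raise"].info_state)
```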
Games with perfect information
If no two nodes have the same information state, then we say that the game has perfect information. In such a game every player knows exactly at which node of the game tree he is. Examples of two-person games with perfect information: Chess, Tic-tac-toe, Reversi, Hex, Go.
Example [Chomp]
Two players have a bar of chocolate with m × n squares. The square in the top left corner is known to be poisonous. The players move in turn, and the rules are as follows: a player chooses one of the remaining squares of chocolate and eats it together with all the pieces that lie below and to the right of the chosen one. The player who has to eat the poisonous piece loses.
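The eating rule is easy to state in code; a small sketch, modeling the bar as a set of (row, column) squares with the poison at (0, 0):

```python
# One Chomp move: eating square (r, c) removes every remaining square
# below and to the right of it, i.e. with row >= r AND column >= c.
def eat(bar, r, c):
    return {(i, j) for (i, j) in bar if not (i >= r and j >= c)}

bar = {(i, j) for i in range(2) for j in range(3)}   # a 2 x 3 bar
bar = eat(bar, 1, 1)       # eat the middle square of the bottom row
print(sorted(bar))         # [(0, 0), (0, 1), (0, 2), (1, 0)]
```

The player forced to call `eat(bar, 0, 0)`, i.e. to take the poisonous corner, loses.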
Strategy
A strategy for the ith player is any rule for determining a move at every information state of this player. A strategy describes a complete plan of playing the game for a player in every possible situation. A sample strategy (FR) for player 1: if the card is red, then fold; otherwise raise. Player 1 has exactly four strategies: FR, FF, RF, RR.
Strategy
A sample strategy (M) for player 2: he meets. Player 2 has exactly two strategies: meet (M) or pass (P).
Strategy
If there are no chance moves, then after all players adopt their strategies the game tree reduces to a single path from the root to some terminal node.
Strategy
If there are chance moves in the game, then after all players adopt their strategies there may be more than one possible path of the game leading from the root to terminal nodes. Every such path has a probability determined by the chance moves. For the strategy pair (FR, M) there are two possible paths of the game, each occurring with probability 0.5. Hence player 1's expected payoff is 0.5·1 - 0.5·2 = -0.5 and player 2's expected payoff is -0.5·1 + 0.5·2 = 0.5.
Games in normal form
We will consider a game with 2 players (the reasoning generalizes easily to more players). Given a game in extensive form, we can list all strategies S_1 = {α_1, α_2,..., α_k} of player 1 and all strategies S_2 = {β_1, β_2,..., β_l} of player 2. Let a_ij be the expected payoff of player 1 and b_ij the expected payoff of player 2 if player 1 adopts strategy α_i and player 2 adopts strategy β_j. This game can be represented as the following table:

        β_1            β_2           ...   β_l
α_1   (a_11, b_11)   (a_12, b_12)   ...   (a_1l, b_1l)
α_2   (a_21, b_21)   (a_22, b_22)   ...   (a_2l, b_2l)
...
α_k   (a_k1, b_k1)   (a_k2, b_k2)   ...   (a_kl, b_kl)
Games in normal form

        M             P
RF   (0.5, -0.5)   (0, 0)
RR   (0, 0)        (1, -1)
FR   (-0.5, 0.5)   (1, -1)
FF   (0, 0)        (0, 0)
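The entries of this table can be recomputed directly from the rules of the card game; a sketch (the two-letter encoding of player 1's strategies, first letter for red and second for black, is the one used above):

```python
# Player 1's strategy: 'R' = raise, 'F' = fold, one letter per card color;
# player 2 either meets ('M') or passes ('P').
def payoff1(s1, s2):
    """Player 1's expected payoff; player 2's is the opposite (zero-sum)."""
    total = 0.0
    for color, act in (("red", s1[0]), ("black", s1[1])):
        win = 1 if color == "red" else -1   # card owner of the pot
        if act == "F":
            total += 0.5 * win              # $2 pot settled by the card
        elif s2 == "P":
            total += 0.5 * 1                # player 2 passes: player 1 nets $1
        else:
            total += 0.5 * 2 * win          # $4 pot settled by the card
    return total

for s1 in ("RF", "RR", "FR", "FF"):
    print(s1, [(payoff1(s1, s2), -payoff1(s1, s2)) for s2 in ("M", "P")])
```

This reproduces the table above, including the value -0.5 computed earlier for the pair (FR, M).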