Thinking Strategically by Avinash K. Dixit and Barry J. Nalebuff
People can sometimes find themselves in situations where an opponent is trying to defeat them and where winning is vital. Businesses must compete in the marketplace; politicians must win elections; football coaches must win games; parents sometimes need to outwit their own children; nations with nuclear weapons have to strategize carefully.
Thinking Strategically presents a solution called game theory, “The science of strategic thinking” (ix). Barely 50 years old in 1991, game theory began as the province of scientists and mathematicians. The book presents game theory for everyone, with case studies and examples from a wide variety of fields, including sports, movies, and books, without all the jargon and math.
People constantly make decisions—“how to manage a business, whom to marry, how to bring up children” (1)—and these decisions interact with decisions made by others. If those people resist your decisions or compete to overcome you, you must anticipate the competition and either defeat it alone or find allies to help you. Working these things out is called strategy.
Game theory looks at a given situation, evaluates the strengths and weaknesses of the opposing sides, and applies basic principles to develop a plan that can win the contest.
Each chapter gives examples from various walks of life, then concludes with a “case study” that invites the reader to think up solutions. Chapter 13 includes 23 case studies, from easy to difficult, for extra practice.
“1. The Hot Hand” (7). Many fans believe players go on scoring streaks. Researchers, though, find that sports streaks are random, like streaks of heads during coin flips. In fact, a study of NBA players found that a successful shot tended to be followed by a missed shot, and a missed shot was more likely to be followed by a successful shot. A player who makes his shot gets added attention from the defense, which makes his next shot harder; meanwhile, another shooter will be more open and more likely to make a shot. Thus, one made shot still can lead to better scoring overall for the team.
This applies to other sports as well. A great running back forces the defense to focus on him, which opens up opportunities for his team’s passing game; a great striker in soccer, crowded by defenders, can pass to an open player for the score. Assists can be as important as goals.
“2. To Lead or Not to Lead” (10). In the 1983 America’s Cup finals, the American sailing team led the series 3-1 and got off to a 37-second lead in the next race. They then ignored the Australian boat when it sailed off on a different course; the Australians won that race and went on to take the trophy. The American leaders should have copied the follower’s course: In two-boat races, copying keeps the leader in front.
This applies to other fields. Famous stock pickers who lead the pack often play it safe and follow the crowd, while newcomers must try something risky to get noticed. IBM tends to copy innovators, mass-producing their successful ideas for its huge customer base.
“3. Go Directly to Jail” (11). The “prisoner’s dilemma” is the most famous game of strategy. If, for example, two suspects in a crime both deny all charges, they’ll each get three years in prison on conviction; if one suspect instead confesses and blames the other, he’ll get only one year while the other suspect gets 25 years; if both confess, they each get 10 years. Whatever the other suspect does, confessing earns the shorter sentence, so each is driven to confess. Even if they agree ahead of time not to confess, each will suspect that the other is tempted to rat him out, and the agreement collapses.
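A minimal Python check of that logic, using the sentence lengths from the summary (the dictionary layout is just one way of writing the payoffs down, not anything from the book):

```python
# Years in prison from the summary's example, indexed by
# (my choice, the other suspect's choice); lower is better.
years = {
    ("confess", "confess"): 10,
    ("confess", "deny"): 1,
    ("deny", "confess"): 25,
    ("deny", "deny"): 3,
}

for others_choice in ("confess", "deny"):
    best = min(("confess", "deny"), key=lambda mine: years[(mine, others_choice)])
    print(f"If the other suspect will {others_choice}, my best move is to {best}.")
# Confessing is best in both cases, which is what makes it a dominant strategy.
```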
When nuclear nations negotiate disarmament treaties, they promise to remove their weapons but often secretly retain some of them because they suspect the other side is doing the same. Union-management negotiations—with the threat of strikes set against getting a bad bargain—have similar problems. In the prisoner’s dilemma, both sides can win, but the temptation to cheat often ruins those chances.
“4. Here I Stand” (14). Martin Luther, when he broke from the Catholic church, refused to bend in the face of threats. This led to the Protestant Reformation and big changes in religious practice in Europe. French leader Charles de Gaulle refused to budge on many of his positions and often won major points during World War II and the postwar years. Luther and de Gaulle used “the power of intransigence” (15). This take-it-or-leave-it attitude can force the other side to give in or lose everything, but it can make future negotiations harder for the stubborn party, ruining friendships and alliances. Stubbornness can overcome a difficult problem, but it wastes time and resources when battling an impossible one.
“5. Belling the Cat” (16). As with the mice who want to put a noisemaker on their enemy the cat but don’t know how to avoid getting eaten while doing so, large groups sometimes are dominated by small ones because it’s hard to organize a coordinated attack against the oppressor. This might be called the “hostage’s dilemma.” A taxi dispatcher can extort payment from drivers, and whoever refuses always gets a clunker to drive. The cars don’t improve but the dispatcher gets rich.
“6. The Thin End of the Wedge” (19). If shoemakers alone were at risk from foreign imports, banning that competition would raise consumer costs about $4 per person, but if every business threatened by offshore competitors wanted protection, it would cost consumers $200 per person. The 1986 US tax reform legislation foundered when every special-interest group got an exemption, but it passed when every senator agreed to vote against all amendments to the bill. Such problems get better solutions when the players look farther into the future.
“7. Look Before You Leap” (21). Commitments—a new house, a new computer system, a marriage—can be costly to undo. Once a commitment is made, “your bargaining position is weakened” (21). An important workaround is to bargain for more benefits before committing.
“8. Mix Your Plays” (22). In sports, players don’t do the same things over and over; they vary their plays—now running the football, now passing; now hitting the tennis ball to the opponent’s backhand, now across court—and do so in unpredictable patterns. This prevents the opponent from predicting and counteracting each move. If the US tax service audited people’s tax returns in a predictable pattern, taxpayers would learn when to cheat and when to be honest. Companies that offer predictable discounts can be countered beforehand by competitors. For each such situation, though, there’s a way to figure out just how much randomness to apply.
“9. Never Give a Sucker an Even Bet” (24). If someone offers to bet you on some proposition—whether a baker sells more strudel than cheesecake—they know something you may not, and you’ll probably lose. The secret is to offer odds on the likelihood of one of the outcomes. This improves your chances; it’s done in sports betting. In financial markets, the spread between buying and selling prices, when known, shifts until each side has a reasonable chance on the outcome.
“10. Game Theory Can Be Dangerous to Your Health” (25). After a conference, two American economists took a taxi to their hotel in Jerusalem. The driver switched off the meter and told them he’d charge them less than the usual price. At the hotel, he asked for 2,500 shekels. The Americans, following the Israeli practice, countered with 2,200 shekels. The driver, outraged, drove them back to the conference. There, they found another taxi whose driver kept the meter on and charged them 2,200 shekels. The economists should have waited till they got to the hotel, exited the taxi, and then begun to bargain.
“Case Study #1: Red I Win, Black You Lose” (28) concerns author Nalebuff’s college graduation party at Cambridge, England, which featured a casino with a $20 buy-in and a free ticket to the next big party for whoever won the most money at the casino. Barry was ahead with $700; in second place was a woman with $300. The final game was roulette, where the woman couldn’t beat Barry on an even-money bet, so she put her entire winnings on the ball landing on a number divisible by three, a one-in-three bet that paid three to one.
In this situation, whoever bets second has the advantage, and the woman bet first. Barry’s best response was to match her bet with $300 of his own, holding back $400: If her one-in-three bet won, so would his, keeping him ahead, and if both bets lost, he would be the only one left with any winnings. Sadly, he was too drunk to work this out; instead, he put $200 on an even-money bet, gambling against the one-in-five chance that she’d win her bet while he lost his. As luck would have it, she won.
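A quick Python check of the matching logic; the payout ratio is left as a parameter because the exact roulette odds don’t change the conclusion (the function and starting amounts are a sketch of the reasoning, not the book’s presentation):

```python
def final_totals(payout_ratio):
    """Totals if Barry matches the $300 bet and holds back $400.

    payout_ratio is the winnings per dollar staked; the stake itself
    is returned on a win."""
    her_stake, his_stake, his_reserve = 300, 300, 400
    if_bet_wins = (her_stake * (1 + payout_ratio),
                   his_reserve + his_stake * (1 + payout_ratio))
    if_bet_loses = (0, his_reserve)
    return if_bet_wins, if_bet_loses

for ratio in (1, 2, 3):                       # try a few plausible payout ratios
    for her_total, his_total in final_totals(ratio):
        assert his_total > her_total          # matching keeps Barry ahead either way
print("Matching her bet guarantees Barry finishes with the most money.")
```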
Every year, Lucy offers to hold a football while Charlie Brown tries to kick it. At the last moment, Lucy pulls the ball away and Charlie Brown lands on his back. He should be able to predict, based on Lucy’s personality and previous behavior, that she might try this; accepting her offer anyway is, like a remarriage, “a triumph of hope over experience” (31).
In strategic games, interactions between players either are simultaneous, as in the prisoner’s dilemma when both players must act at the same time, or sequential, as when Lucy holds the ball and then Charlie Brown tries to kick it. Some games contain both.
The first rule of sequential competitions is to anticipate what the opponent will do and think up a counter-move, then guess their response and your next counter-move, and so on into the future. This can get complex; a “tree diagram”—with lines branching off from each decision like branches on a tree—helps keep strategic planning orderly. For example, to travel from Princeton to New York, travelers can set up a “decision tree,” and on it they list the available modes of travel, such as a train, bus, or car, then the available routes for each mode of travel, then the modes and routes to the specific destination once they’re in the city. Each mode of travel has a line, or branch, on the tree diagram, and extending off each line is a route option, and so on.
If the tree diagram describes play options in a game with two or more players, it becomes a “game tree” in which each player must anticipate the other players’ likely moves and plan accordingly. For example, if Lucy offers to hold the football, Charlie Brown has two options: accept or reject. If he rejects, the branching process halts; if he accepts, two new branches appear at that location on the diagram, one for Lucy letting Charlie Brown kick the ball and one for her pulling it away.
The best way to read a tree diagram or game tree is backward: Start from the outcomes at the ends of the branches and reason back toward the decision at hand.
A vacuum maker, Newcleaners, wants to enter the market in Havana, which is dominated by the Fastcleaners company. Newcleaners strategists set up a tree diagram that gives Fastcleaners two options: accept the new competition, which results in Newcleaners earning $100,000, or wage a price war, which costs Newcleaners $200,000. Strategists then assign probabilities to each outcome. If, at a glance, the two reactions look equally likely, they weight each outcome at 50%: Half of a $100,000 gain plus half of a $200,000 loss works out to an expected loss of $50,000. The tree diagram thus tells them to stay out of Havana.
Newcleaners strategists can, however, improve their calculation of the odds by estimating how much Fastcleaners will earn or lose in each scenario. These numbers become new branches on the decision tree, with the better result for Fastcleaners getting higher odds. If Fastcleaners makes money by accommodating the new entrant and loses money in a price war, Newcleaners should slice off the price-war branch as highly unlikely and enter the Havana market.
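A minimal Python sketch of this expected-value arithmetic, using the summary’s figures; the 95%/5% odds in the second calculation are illustrative stand-ins for “the price-war branch is highly unlikely”:

```python
# Newcleaners' entry decision, using the summary's numbers. Each outcome is
# (probability that Fastcleaners reacts that way, Newcleaners' payoff in dollars).
equal_odds = [(0.5, 100_000),    # Fastcleaners accommodates the new entrant
              (0.5, -200_000)]   # Fastcleaners starts a price war

def expected_payoff(outcomes):
    """Weight each branch of the decision tree by its probability and add them up."""
    return sum(prob * payoff for prob, payoff in outcomes)

print(expected_payoff(equal_odds))    # -50,000: the tree says stay out

# If a price war would also hurt Fastcleaners, that branch gets pruned (its
# probability pushed toward zero), and entering starts to look attractive.
pruned_odds = [(0.95, 100_000), (0.05, -200_000)]   # illustrative odds, not from the text
print(expected_payoff(pruned_odds))   # 85,000: enter the Havana market
```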
Real-life situations usually aren’t this simple, so the decision tree can grow more complex. Chess can be tree-diagrammed, with each move leading to a series of branches that represent the opponent’s possible replies. The diagram grows quickly: The first player has 20 possible first moves and so does the second player, for a total of 400 possible branches after just one move apiece. A complete game is therefore impossible for a human to analyze. Most series of moves lead to bad positions, though, so most decision-tree branches can be devalued; also, as pieces get removed, the game tree simplifies, and many chess endings are completely understood. Tic-tac-toe allows 15,120 possible move sequences, but most are “strategically identical,” and smart players can always obtain a tie.
Reasoning backward from the end of the game can also help in bargaining. If negotiations drag on, the value of the deal diminishes, as when children argue over an ice-cream pie: With each round of haggling the pie melts a little more, and if one kid holds out too long, the other may balk, the pie melts away, and neither gets anything. Working back through the branching options quickly points toward a 50-50 split as the best first offer. If the number of negotiating rounds is even, this will be the likely outcome; strangely, if the number of rounds is odd, the side that opens the bargaining will get somewhat more of the pie.
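A small Python sketch of that backward reasoning under simple assumptions (one scoop of the pie melts each round, offers alternate, and a player accepts anything at least as good as what waiting would bring; this is a stylized stand-in for the book’s example, not its exact numbers):

```python
def split_melting_pie(rounds):
    """Backward induction for a melting-pie bargaining game: the pie starts
    with `rounds` scoops and loses one scoop per round; in the final round
    the proposer keeps the last scoop."""
    proposer_value = 0                    # value of proposing after the pie is gone
    for pie in range(1, rounds + 1):      # walk backward: the pie grows toward round 1
        responder_value = proposer_value  # the responder can get this much by waiting a round
        proposer_value = pie - responder_value
    return proposer_value, rounds - proposer_value   # (first mover, second mover)

print(split_melting_pie(4))  # even number of rounds -> (2, 2): an even split
print(split_melting_pie(5))  # odd number of rounds  -> (3, 2): the first mover gets a bit more
```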
Thinking ahead can affect war and peace. If one country attacks a weaker country, it’s likely to win unless a third neighbor, sensing weakness along its border with the first country, decides to attack the first country. A fourth country could become involved for similar reasons. With an odd number of players, the first one who might be attacked usually remains safe.
Sometimes, the order of play changes the advantage. In a political campaign, the first candidate to announce her strategy gives the second candidate a chance to counter it. If neither reveals their strategy, though, or if either changes strategy during the campaign, then the order of play becomes meaningless: Players must shift from sequential to simultaneous strategizing.
In “Case Study #2: The Tale of Tom Osborne and the 1984 Orange Bowl” (53), Coach Osborne’s #1-ranked Nebraska Cornhuskers faced the Miami Hurricanes and needed only a tie to clinch the national championship. Coming from behind, the Cornhuskers scored a touchdown to trail 31-23, then kicked the safe 1-point conversion to make it 31-24. They scored another touchdown, and Osborne could have tied the game with another safe 1-point kick and taken the national title. Instead, he opted for the risky 2-point running conversion to win outright and didn’t get it; Miami won the game and the national title. Critics say Osborne should have taken the safe 1-point kick, but a title earned on a tie just isn’t the same as one earned on a win. What did Osborne do wrong?
Down by 14 points, Osborne knew he needed two touchdowns and three conversion points, one 2-point try plus one 1-point kick, to win. He should have attempted the dangerous 2-point conversion first, when the score was 31-23: If it worked, the next touchdown would pull Nebraska even at 31-31 and a safe 1-point kick would win the game; if it failed, he would still have time to try for two again after the second touchdown and salvage the tie. By saving the risky play for last, he left himself no way to recover from a failure. The moral: “if you have to take some risks, it is often better to do this as quickly as possible” (55).
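A short Python comparison of the two orderings, under simplifying assumptions that are not in the text: both remaining touchdowns get scored either way, 1-point kicks never miss, and each 2-point try succeeds with the same probability.

```python
def outcome_probs(two_point_success=0.4):
    """Compare risking the 2-point conversion early versus saving it for the end."""
    p = two_point_success

    # Go for 2 early: success sets up a winning kick later; failure leaves a
    # second 2-point try after the next touchdown that can still salvage a tie.
    early = {"win": p, "tie": (1 - p) * p, "loss": (1 - p) ** 2}

    # Kick early, go for 2 at the end: success wins, failure loses outright.
    late = {"win": p, "tie": 0.0, "loss": 1 - p}

    return early, late

early, late = outcome_probs()
print("risk early:", early)   # same chance of winning, plus a chance to tie
print("risk late: ", late)
```

Whatever the true success probability, taking the risk early never lowers the chance of winning and adds a chance to salvage the tie, which is the authors’ point about timing.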
If two competitors make moves at the same time—for instance, the magazines Time and Newsweek releasing their covers simultaneously—each side must try to imagine what the other side is planning, how to counter those plans, and what the other side expects the counter-plans to be, and so on in an endless loop. This process can be simplified, though, with three rules that involve the principles of “dominant strategy” and “equilibrium.”
A dominant strategy is one that serves a player at least as well as any other strategy, no matter what the other side does. One week, two big stories are in the news: an effective AIDS drug and a US budget impasse. Seventy percent of newsstand magazine buyers are interested in the AIDS story first, while 30 percent prefer the budget story. If both Time and Newsweek put the AIDS story on the cover, they’ll split those readers and each get 35% of the market; if one magazine goes with the budget story instead, it will get all of the remaining 30% of the market. Even at its worst, the AIDS-story cover does better than the budget-story cover does at its best; thus, the AIDS cover is the dominant strategy.
A way to visualize these situations involves drawing a table, or matrix, with rows and columns, where the rows represent one team’s options, the columns show the other team’s options, and the numbers in the resulting boxes show the payoffs for each combination of options. If, for example, both Time and Newsweek opt to run the AIDS story, the square on the table where those choices intersect will contain the number 35, representing the 35% of readers that each magazine will likely get in that situation.
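To make the matrix concrete, here is a minimal Python sketch using the shares from the Time/Newsweek example; the cell where both magazines run the budget story is filled in as an even split of the 30%, an assumption consistent with the example but not stated in it, and the helper function is illustrative rather than anything from the book.

```python
# Rows are Time's cover choice, columns are Newsweek's; each cell holds
# (Time's share of newsstand sales, Newsweek's share), in percent.
payoffs = {
    ("AIDS", "AIDS"): (35, 35),
    ("AIDS", "budget"): (70, 30),
    ("budget", "AIDS"): (30, 70),
    ("budget", "budget"): (15, 15),   # assumed even split of the budget audience
}

def dominant_strategy_for_time():
    """Return a cover choice that is best for Time against every Newsweek choice, if one exists."""
    strategies = ("AIDS", "budget")
    for mine in strategies:
        if all(payoffs[(mine, theirs)][0] >= payoffs[(other, theirs)][0]
               for theirs in strategies
               for other in strategies):
            return mine
    return None

print(dominant_strategy_for_time())  # -> "AIDS"
```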
Things get more complex if one magazine is more popular. If Time is more popular, its dominant strategy remains the AIDS cover, but Newsweek, knowing this, often can do better for itself by putting the budget impasse on its cover. A dominant strategy, then, is one that dominates, not the other players, but all other strategies for a given player.
“If you have a dominant strategy, use it” (66). Also, when you know the other player’s dominant strategy, plan for it. Dominance reasoning is reliable when players move simultaneously; in sequential play only the second player can safely rely on it, because the first mover’s choice can change how the opponent responds, so the opponent’s action can’t be treated as fixed.
Dominant strategies aren’t very common, but another way to simplify the decision-making process is to avoid all “dominated” strategies, those that do worse than some other available strategy no matter what the opponent does: “Eliminate any dominated strategies from consideration, and go on doing so successively” as the field of remaining strategies shrinks (69).
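As a sketch of that successive-elimination rule, here is a small Python example on a made-up 3-by-3 game; the payoff numbers are invented for illustration and don’t come from the book.

```python
# row_pay[i][j] is the row player's payoff when the row player picks strategy i
# and the column player picks strategy j; col_pay[i][j] is the column player's.
row_pay = [[3, 1, 4],
           [2, 0, 3],
           [5, 2, 6]]
col_pay = [[2, 3, 1],
           [1, 2, 0],
           [4, 5, 3]]

def eliminate_dominated(row_pay, col_pay):
    """Repeatedly drop any strategy that does strictly worse than one of the same
    player's other strategies against every surviving opposing strategy."""
    rows = list(range(len(row_pay)))
    cols = list(range(len(col_pay[0])))
    changed = True
    while changed:
        changed = False
        for i in list(rows):
            if any(all(row_pay[k][j] > row_pay[i][j] for j in cols)
                   for k in rows if k != i):
                rows.remove(i)
                changed = True
        for j in list(cols):
            if any(all(col_pay[i][k] > col_pay[i][j] for i in rows)
                   for k in cols if k != j):
                cols.remove(j)
                changed = True
    return rows, cols

print(eliminate_dominated(row_pay, col_pay))  # -> ([2], [1]): only one cell survives
```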
In simultaneous competition, each side’s reasoning may converge to an “equilibrium”: “Having exhausted the simple avenues of looking for dominant strategies or ruling out dominated ones, the next thing to do is to look for an equilibrium of the game” (76). If, for example, Time starts out costlier and Newsweek raises its price, Time answers with a smaller price rise of its own, Newsweek responds with a still smaller one, and the adjustments shrink until the two prices settle at a point where neither magazine wants to change, an equilibrium.
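A tiny Python sketch of that convergence, using an assumed best-response rule (not from the book) in which each magazine answers the rival’s price with a damped adjustment:

```python
def best_response(rival_price):
    """Assumed-for-illustration pricing rule: a $1.00 base price plus half of
    whatever the rival charges."""
    return 1.0 + 0.5 * rival_price

# Let the two magazines keep reacting to each other until prices stop moving.
time_price, newsweek_price = 3.00, 2.00   # arbitrary starting prices
for _ in range(50):
    time_price = best_response(newsweek_price)
    newsweek_price = best_response(time_price)

print(round(time_price, 2), round(newsweek_price, 2))  # both settle at 2.0, the equilibrium
```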
Some games have more than one equilibrium point. Driving can be done either on the right or left side of a street, depending on the country. These situations sometimes need a common understanding beforehand. If, for example, you’re told to meet someone in New York City but you’re not told where, your best bet is Grand Central Station, since it’s a commonly known transit location.
“Case Study #3: Tough Guy, Tender Offer” (81) describes how a corporate raider offers to buy the shares of a company, currently valued at $100 each, in two tiers, one at $105 per share for the first 50% of the stock outstanding, then $90 per share for the rest. Another raider offers $102 per share if that raider gets a majority of all shares outstanding. To which raider should the stockholders sell?
Tendering to the tiered offer, though it pays less per share overall, is the dominant strategy for each stockholder. If you tender your shares and the raid fails to obtain 50% of the stock, you get $105 for every share you sold. If the raid succeeds, your tendered shares fetch a blended price somewhere above $97. If you don’t tender and the raider gets more than 50%, you’re left holding shares worth only $90 each. Because tendering beats holding out whether or not the raid succeeds, stockholders tender, the raid succeeds, and the raider gets the company below its pre-bid market price.
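A short Python sketch of the proration arithmetic implied by the case study; the two-tier prices come from the text, while the blending rule (the front tier is spread evenly across everyone who tenders) is an assumption about how such offers typically work.

```python
def tendered_share_value(fraction_tendered, front=105.0, back=90.0):
    """Blended price per tendered share in the two-tier offer: the first 50% of
    all shares outstanding is bought at the front-tier price, and anything
    tendered beyond that gets the back-tier price, prorated across tenderers."""
    if fraction_tendered <= 0.5:       # raid falls short: everything tendered gets the front price
        return front
    front_portion = 0.5 / fraction_tendered
    return front_portion * front + (1 - front_portion) * back

for fraction in (0.4, 0.5, 0.75, 1.0):
    print(fraction, round(tendered_share_value(fraction), 2))
# Holding out when the raid succeeds leaves you frozen out at $90, so tendering
# pays at least $97.50 per share and is never the worse choice.
```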
This is a case where the dominant strategy isn’t the winning strategy but the least-bad strategy and an equilibrium strategy as well. Even if the opposing raider makes the $102 offer unconditional, most stockholders will still choose the $105 offer, thinking that the two-tiered raid will fail and what stocks it does buy will pay off at $105. Either way, the two-tiered raid obtains a majority of the company.
Part 1’s Epilogue briefly recaps the main points of the first three chapters; the examples of game theory in action introduced there are developed further in later chapters.
The first three chapters explain the basics of game theory and present several common situations where game theory can be applied to improve a participant’s outcomes.
Much of game theory’s purpose is to simplify the complicated interactions that arise in the many situations in life where more than one person is competing for the same resource.
Game theory breaks down masses of possibilities into simpler categories: Some games are sequential, and some are simultaneous; sequential games can be charted using a tree diagram, and simultaneous games can be organized into tables; some strategies are dominant and should always be used, while others are dominated and should never be used; some situations settle into an equilibrium where no further strategic play is required. These theories can become rules of thumb, handy mental tools for parsing complex competitive situations.
Tree diagrams can get pretty complicated, with branches branching from branches until the chart looks as complex as an entire tree. Without such a diagram, though, a strategy involving dozens or even hundreds of alternative future paths quickly becomes impossible to visualize, much less compute. A tree diagram simplifies things so that a player can follow the lines that lead to various strategic alternatives without getting lost.
The authors define a tree diagram and thereafter usually refer to it as a “decision tree,” which also means a tree diagram used in single-player activities. This might cause some confusion, but the important point to remember is that there are two basic types of tree diagrams, one for single players who are simply trying to plan a trip or organize a day’s activities—a “decision tree”—and one for multi-player competitions—a “game tree.”
In Chapter 2, the Newcleaners vacuum company makes a “decision tree” and not a “game tree” because they don’t yet know whether their potential opponent, Fastcleaners, will engage them in battle. Once the battle is joined, the competitive options on the tree diagram convert into a “game tree,” but the process of designing the diagram remains essentially the same.
Chapter 2’s case study discusses how a winning branch of one game tree got ignored in 1984, when the Nebraska football team opted for a 2-point conversion in the game’s final minutes and lost. Late in that national-championship game, Nebraska coach Tom Osborne needed two touchdowns and three conversion points to win. The authors argue that he should have solved his conversion dilemma by taking the big 2-point risk after the first of those touchdowns, so that, on the last big score, his team would have had more options.
This strategy presupposes, of course, that there would have been a second touchdown had Osborne altered his plan. After all, an earlier, failed 2-point conversion might have changed the tenor of the game, and there might have been no second touchdown for Nebraska. The authors argue against this objection, but such what-ifs are ultimately imponderable: We’ll never know for sure what might have happened. Still, the authors’ point is well taken, that risks generally should be handled early if possible.
In short: There are two types of game, simultaneous and sequential. For sequential play, use a game tree to develop a strategy and read it backwards; for simultaneous play, use tables. In all cases, if you have a dominant strategy, use it; if you find a dominated strategy, lose it. Search for equilibria where you can rest and find chances to cooperate.