Image: Russian military honor guard/Wiki
The Ukrainian crisis is officially below-the-fold now, as what began with large-scale actions—the appearance of Russian troops in Ukraine, the annexation of Crimea—becomes day-to-day, piece-by-piece escalation toward actual war as Ukrainian forces spar with pro-Russian separatists: downed helicopters, gunfights over checkpoints and bridges, kidnappings, and the apparent torpedoing of a Russian/Ukrainian de-escalation agreement. All in all, it's a bad scene and it's difficult to come up with a "winning" situation for anyone involved, no matter the outcome.
What would be rational for the various players in the current crisis? What reduces violence, economic harm, and regional instability? These are the considerations that would seem to demarcate rationality in Ukraine, though the direct players are so far more interested in other, less rational things: nationalism, stand-making, ethnic division, and rhetoric. The conflict's news cycle might be becoming more atomized and routine—more of a map than a headline—but game theory (and psychology generally) tells us that we're at the absolute critical moment for a rational outcome, if not past it already.
Escalation, generally, is the realm of cognitive bias, i.e. a mode of thinking that skews away from good judgement and rationality. Judgement becomes fixated on past outcomes rather than future outcomes: I have already done this, therefore I should keep doing this, rather than, If I do this, the outcome will be bad. It's an ego thing, an overpowering resistance to being seen as a failure. If you had some number of dollars to spend on one of two outcomes, one better and one worse, you would be more likely to spend the money on whichever outcome you'd already spent money on, even if it's the worse one. This is called the "sunk-cost effect" and it's sort of the essence of escalation, whether it appears during the run-up to war or in a casino.
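The choice can be put in code. This is a minimal sketch with invented dollar figures and an invented bias rule, just for illustration: a rational chooser compares only future returns, while a sunk-cost chooser effectively adds what's already been spent to each option's appeal.

```python
# Hypothetical numbers: two options competing for the same remaining budget.
# Option A promises a $50 future return, option B an $80 return,
# and $40 has already been sunk into A.
sunk = {"A": 40, "B": 0}
future_return = {"A": 50, "B": 80}

def rational_choice(returns):
    """Ignore sunk costs entirely; compare only future outcomes."""
    return max(returns, key=returns.get)

def sunk_cost_choice(returns, spent):
    """Bias toward whichever option already has money in it."""
    biased = {k: v + spent[k] for k, v in returns.items()}
    return max(biased, key=biased.get)

print(rational_choice(future_return))         # B -- the better future outcome
print(sunk_cost_choice(future_return, sunk))  # A -- past spending tips the scale
```

The bias rule here is deliberately crude; the point is only that once past spending is allowed to count, the worse option can win.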
An interesting example of the effect is seen in the NBA. A study released in 1995 courtesy of some Cornell researchers looked at the relative draft cost of different basketball players (the first pick being the most expensive and so on) and how much time they spent on the court playing in games vs. time spent on the bench. Rational decision making would have the most productive (best) players getting the most on-court time, regardless of where they were picked in the draft, but the Cornell team found that draft pick rank had a disproportionate effect on how much time players spent playing.
So a player picked first in the draft but playing badly might still get more court time than a player picked last who had proven himself the better player. This is also called escalation of commitment, first described in a 1976 paper by Barry Staw. Staw summarized the effect found in his study this way: "individuals invested a substantially greater amount of resources when they were personally responsible for negative consequences." It's a powerful notion: if you make a bad call, one way to rationalize it is to keep making the same bad call in the future, each new bad call serving to rationalize the first.
That's a weird idea and it deserves another example. Say I bought a used car that turned out to be a piece of shit. I didn't have it looked at by a mechanic before buying it or even check the dipstick for dirty oil. I kicked the tires and slapped down a few grand cash for the keys. The car starts breaking right away, each repair costing more than the last, and before long I've put more money into the car than the car cost. Every time the car breaks, I'm presented with the option to sell it for scrap, but every time I fix it again. Every one of those decisions is haunted not by the future costs of fixing the car, but by the fact that I bought a lemon. Self-rationalization kicks in—it's only a lemon once I stop fixing it, because you don't fix lemons. Every decision to fix the car rationalizes the bad choice to buy it: repeating a bad decision makes it feel less like a bad decision.
You see something similar in gambling, but with an additional feature. It's called the "gambler's fallacy": the mistaken belief that odds are cumulative or there's some natural balance in the world that will turn in your favor if you lose enough. That is, if you're on a losing streak (like losing on red five times in a row in roulette) you might misinterpret that as increasing the odds of the wheel falling on red the next time, even though every spin of the wheel is governed by odds completely independent of previous spins. You can see a bit of that in most any escalation: if the situation's bad enough then it will probably turn good. Unfortunately, karma is bullshit.
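Independence is easy to check by simulation. The sketch below (plain Python, a fair European wheel assumed: 18 red, 18 black, 1 green) estimates the chance of red overall versus the chance of red immediately after a run of five non-red spins. The fallacy predicts the second number should be higher; it isn't.

```python
import random

random.seed(1)

COLORS = ["red"] * 18 + ["black"] * 18 + ["green"]  # European roulette wheel

trials = 1_000_000
reds = 0            # total red outcomes
after_streak = 0    # spins that immediately follow >= 5 non-red spins
streak_hits = 0     # ...and came up red anyway
non_red_run = 0     # current run of consecutive non-red spins

for _ in range(trials):
    result = random.choice(COLORS)
    if non_red_run >= 5:        # this spin follows a "losing streak"
        after_streak += 1
        if result == "red":
            streak_hits += 1
    if result == "red":
        reds += 1
        non_red_run = 0
    else:
        non_red_run += 1

print(f"P(red) overall:      {reds / trials:.3f}")
print(f"P(red | 5 non-reds): {streak_hits / after_streak:.3f}")
```

Both estimates hover around 18/37 ≈ 0.486: the wheel doesn't remember the streak.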
The Staw paper quotes a chilling exchange between Under Secretary of State George Ball, "Vietnam's devil's advocate," and President Lyndon Johnson that occurred in 1965 (revealed in 1971 by the Pentagon Papers):
Once large numbers of U.S. troops are committed to direct combat, they will begin to take heavy casualties in a war they are ill-equipped to fight in a non-cooperative if not downright hostile countryside. Once we suffer large casualties, we will have started a well-nigh irreversible process. Our involvement will be so great that we cannot--without national humiliation--stop short of achieving our complete objectives. Of the two possibilities, I think humiliation would be more likely than the achievement of our objectives--even after we have paid terrible costs.
So, every troop death in Vietnam would become an argument for another troop to be sent into combat. The rational alternative—de-escalation—would have meant acknowledging the initial bad decision to get involved, whereas sending more troops would validate that decision. That's escalation.
This reduces to a very simple and illustrative game called the dollar auction. A number of players bid on a dollar, just a plain old dollar bill, in increments of five cents. The rules have one twist: the second-place loser still has to pay the money they bid; they just don't get the dollar. This adds loss to the equation, and we can imagine what happens. Eventually, everyone loses except the auctioneer, as the players' bids turn from trying to make a profit (the difference between the dollar and their bid) to trying to lose less money. That's also escalation.
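The auction is simple enough to simulate. This sketch assumes two bidders who mechanically outbid each other until the standing bid passes a pain threshold; the 150-cent limit is an arbitrary assumption, and the point is only that both bidders finish underwater while the auctioneer collects both bids.

```python
PRIZE = 100   # the dollar, in cents
STEP = 5      # bid increment, in cents
LIMIT = 150   # hypothetical pain threshold where a bidder finally quits

def dollar_auction(prize=PRIZE, step=STEP, limit=LIMIT):
    bids = [0, 0]   # standing bid held by player 0 and player 1
    bidder = 0      # whose turn it is to raise
    while True:
        next_bid = max(bids) + step
        if next_bid > limit:    # the current bidder walks away
            winner = 1 - bidder
            break
        bids[bidder] = next_bid
        bidder = 1 - bidder
    # Winner pays their bid and gets the prize; loser pays and gets nothing.
    return bids, prize - bids[winner], -bids[1 - winner]

bids, winner_net, loser_net = dollar_auction()
print(bids, winner_net, loser_net)   # [145, 150] -50 -145
```

With these numbers the "winner" pays 150 cents for a 100-cent prize and the loser pays 145 cents for nothing; the auctioneer walks away with $2.95.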
The Ukrainian conflict is costing all sides, and all sides have a history of losing bets. Is it hopeless? No. All of this game theorizing has done some good and provides some ways out. One solution is to separate the decision makers in the conflict from those who evaluate the decisions. Putin can decide to send troops into eastern Ukraine, but he has to make his next decision based on someone else's evaluation of the first.
This idea was demonstrated in a study of loans made by a selection of California banks. If a bank manager made a bad loan to a customer and that customer defaulted, the same manager was likely to make a second loan to the customer because of all of the stuff I talked about above. But when bank manager turnover passed a certain point, the overall rate of bad loans decreased, because it was more likely that a second, different bank manager—one untainted by the first bad decision—was deciding on the follow-up loan.
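A toy simulation makes the mechanism visible. The re-lending probabilities below are invented, not taken from the study; the only claim is structural: the more likely it is that a fresh manager handles the follow-up decision, the lower the repeat-bad-loan rate.

```python
import random

random.seed(7)

# Toy model, not the actual study: every defaulted loan comes back for a
# follow-up decision. The manager who made the original loan re-lends with
# high probability (sunk-cost bias); a fresh manager uses a lower base rate.
# Both probabilities are invented for illustration.
P_RELEND_SAME = 0.8
P_RELEND_FRESH = 0.3

def repeat_bad_loan_rate(turnover, defaults=100_000):
    """Fraction of defaulted loans that get a follow-up loan, given the
    probability `turnover` that a different manager now holds the desk."""
    bad = 0
    for _ in range(defaults):
        fresh = random.random() < turnover
        p = P_RELEND_FRESH if fresh else P_RELEND_SAME
        if random.random() < p:
            bad += 1
    return bad / defaults

for turnover in (0.0, 0.5, 1.0):
    rate = repeat_bad_loan_rate(turnover)
    print(f"turnover {turnover:.0%}: repeat-bad-loan rate {rate:.2f}")
```

Turnover here plays exactly the role the banks' staffing churn played: it decouples the follow-up decision from the person who owns the original mistake.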
Probably more applicable to our current situation is the notion of reversibility. The conflict's various parties need to be given ways out that allow their previous decisions to remain valid in some way without doubling down on them. As it stands, increasing sanctions and military pressure on Russia and pro-Russian separatists just reinforces the wrongness of the Russian decisions made so far, which reinforces the escalation. The pro-separatist side is trapped further into escalation, and NATO (et al.) is itself forced into more escalation. That's how these things go, and at some point one side will have bet more than a dollar. That's when things get really bad.