
The two envelopes paradox, also known as the two envelopes problem, is a paradox in probability. Consider the following scenario:

Two envelopes containing money are put before you and you are invited to select one. You are told that one of the envelopes contains twice as much money as the other. You select an envelope. You are then given the option to switch your envelope for the other one. Should you switch?

We'll call the two envelopes A and B. Say that you choose envelope A. A will contain a certain amount of money (call it \$a). If you switch to envelope B, there are two possibilities:

• B has more money, in which case you gain an additional \$a.
• B has less money, in which case you only lose \$½a.

Since each possibility is equally likely, you will gain, on average, an additional ½(a − ½a) = \$¼a by switching. So, you should switch to B. However, if you chose B initially, by the same reasoning, you should switch to A. But how can envelope B be better than envelope A and, at the same time, envelope A be better than envelope B? This is the paradox.

There are two versions of the paradox. In the first version, you are asked whether you want to exchange envelopes before you open your envelope. In the second version, you are asked after you open your envelope.

The first version of the paradox can be resolved fairly easily. Instead of treating the amount of money in the chosen envelope as fixed, the correct way of looking at the problem is to look at both envelopes together. One envelope contains, say, x dollars and the other contains 2x dollars. The expectation if you keep your chosen envelope is 1.5x dollars, and the expectation if you switch your envelope is also 1.5x dollars. Looking at it this way, there's no advantage to switching, and the paradox is resolved.
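A quick Monte Carlo check illustrates this resolution. This is a sketch under an assumed setup (the smaller amount x drawn uniformly from \$1 to \$50, the other envelope holding 2x, and the initial pick random); the `play` helper is hypothetical, not part of the original problem statement:

```python
import random

# Simulate the unopened-envelope version: does always switching help?
def play(switch, trials=100_000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = rng.randint(1, 50)          # smaller amount (assumed range)
        envelopes = [x, 2 * x]
        pick = rng.randrange(2)         # random initial choice
        if switch:
            pick = 1 - pick             # always take the other envelope
        total += envelopes[pick]
    return total / trials

keep = play(switch=False)
swap = play(switch=True)
# Both averages land near 1.5 * E[x] = 1.5 * 25.5 = 38.25,
# so switching confers no advantage.
```

Both strategies average out to the same 1.5x expectation, matching the argument above.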

The second version, where you look inside your chosen envelope before deciding whether to switch or not, is a little more difficult to resolve, because, in this case, the amount of money in the chosen envelope is fixed.

To explain this paradox, it helps to remember that, in real life, the amounts of money in the envelope are going to be constrained in some manner. As an example, say that the envelopes may only contain whole numbers of dollars from \$1 to \$100, and each possibility is equally likely. So, the only possible amounts of money in each envelope are \$1, \$2, \$3, \$4, ..., \$97, \$98, \$99, \$100. There are now some amounts of money for which switching results in a negative expectation. For example, if you open an envelope and find \$52, the other envelope must contain \$26, because it can't contain \$104. Adding up the expectations for each possibility, there is no benefit to switching overall.
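The bounded model above can be checked exactly. The sketch below assumes the pairs are (x, 2x) with both amounts between \$1 and \$100 (so x runs from 1 to 50), each pair equally likely, and a random initial pick; the `gain_given` helper is an illustrative name, not standard terminology:

```python
from collections import defaultdict
from fractions import Fraction

# Joint probability of (amount you see, amount in the other envelope).
joint = defaultdict(Fraction)
pairs = [(x, 2 * x) for x in range(1, 51)]   # both amounts <= $100
p_pair = Fraction(1, len(pairs))
for small, big in pairs:
    joint[(small, big)] += p_pair / 2        # you picked the small one
    joint[(big, small)] += p_pair / 2        # you picked the big one

# Expected gain from switching, given the amount you see.
def gain_given(a):
    rows = {other: p for (seen, other), p in joint.items() if seen == a}
    total = sum(rows.values())
    return sum(p * (other - a) for other, p in rows.items()) / total

# Seeing $52 forces the other envelope to hold $26, so switching
# loses $26; small amounts like $4 still favour switching.
# Averaged over everything you might see, the gain is exactly zero.
overall = sum(p * (other - seen) for (seen, other), p in joint.items())
```

Conditioning on large amounts like \$52 gives a negative expectation that exactly cancels the positive expectation from small amounts, so the overall gain from always switching is zero.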

Even without any formal constraint on the monetary values, it's more likely in real life that you'll see smaller amounts than larger ones; in other words, it's more likely that the amounts of money are distributed logarithmically than uniformly. If the amounts of money are distributed logarithmically, and you open one envelope and find that it contains x dollars, it's twice as likely that the other envelope contains ½x dollars as it is that it contains 2x dollars. If this is the case, your expected gain by switching is ⅔(−½x) + ⅓(x) = \$0, and the paradox vanishes.
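The cancellation can be verified with exact arithmetic. The sketch below assumes one concrete "logarithmic" model, in which the probability that the pair is (x, 2x) is proportional to 1/x; the function name is illustrative:

```python
from fractions import Fraction

# Assumed model: P(pair is (x, 2x)) is proportional to 1/x, and you
# are equally likely to have picked either envelope of the pair.
def expected_gain(a):
    # Weight of the pair (a/2, a), where the seen amount a is the larger.
    w_smaller = Fraction(1, a // 2) * Fraction(1, 2)
    # Weight of the pair (a, 2a), where the seen amount a is the smaller.
    w_larger = Fraction(1, a) * Fraction(1, 2)
    total = w_smaller + w_larger
    p_half, p_double = w_smaller / total, w_larger / total
    # p_half : p_double comes out 2 : 1, so the gain cancels exactly.
    return p_half * (Fraction(a, 2) - a) + p_double * (2 * a - a)
```

For any (even) observed amount, the posterior odds are 2 : 1 in favour of the other envelope holding half as much, and the expected gain from switching is exactly zero.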

If you allow for a scenario in which the monetary amounts aren't constrained in any way whatsoever, you run into some of the problems encountered in the St. Petersburg paradox. If, for example, the amount of money in the envelopes were distributed uniformly from 1 to ∞, then, before you open an envelope, the expectation for selecting either envelope is infinite. If you open one envelope and find that it contains, say, \$100, then it really is to your benefit to switch to the other envelope. It is just as likely that it contains \$200 as it is that it contains \$50, so you'll gain ½(\$100) − ½(\$50) = \$25, on average, by switching. However, this scenario doesn't reflect the way things are in real life. First, the initial infinite expectation doesn't really reflect the amount you'll gain by playing the game once. Second, there can't really be, say, one quintillion dollars in an envelope; there just isn't that much money in the world. Even if there were one quintillion dollars in an envelope, it wouldn't matter, for any practical purpose, whether you got ½ quintillion or 1 quintillion or 2 quintillion dollars; they're all just incomprehensibly large sums of money. This last point is related to the economic concept of utility. See also Doomsday Argument for some pitfalls in assuming a uniform probability distribution.