The St. Petersburg paradox is a paradox first posed by Nicolaus Bernoulli in the field of probability theory. Consider a game in which
a coin is flipped until heads comes up. If it comes up on the
first toss, you win $1. If it comes up on the second toss, you
win $2. If it comes up on the third toss, you win $4. In general,
if it comes up on the n^{th} toss, you win $2^{n-1}.
The expectation (average amount you can expect to win) for this
game can be found by adding together the products of the probabilities
for each outcome with the amount that is won for each outcome.
This gives us

(1/2) × 1 + (1/4) × 2 + (1/8) × 4 + (1/16) × 8 + . . .

= 1/2 + 1/2 + 1/2 + 1/2 + . . .

Since this sum contains infinitely many terms, each equal to 1/2, it diverges. Therefore, the expectation for this game is
infinite. The paradox lies in the fact that an infinite expectation makes no intuitive sense: any single play can only ever
pay out a finite amount of money, yet the expectation suggests you should be willing to pay any price to play.

To help explain this conclusion, note that the infinite expectation does represent the mean amount that you would win per play if you were to play an infinite number of times, but it doesn't accurately represent any possible outcome from a single play. Furthermore, virtually the entire sum comes from terms representing very unlikely events, which you would likely have to play for a very long time to ever benefit from. As you keep playing, the average amount that you win will continue to increase without end, but it will do so very, very slowly.

**Something to Try:** You don't have to try this, but it's highly instructive.
Write a computer program to simulate this game. Extend it to play a large number of games (try 1,000, 10,000, 100,000, and 1,000,000) and output the average payout. Then, extend it to simulate 100 series each of 1, 10, 100, 1,000, 10,000, 100,000, and 1,000,000 games. For each 100-series set, find the *median* of the mean amount won. Plot a graph of the number of games per series versus the median mean amount won. If possible, plot the points on a logarithmic scale. Once you've finished doing so, come back and continue reading.
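If you'd like a starting point, the simulation might be sketched in Python as follows (the function names, the fixed seed, and the series sizes shown are illustrative choices, not prescribed by the exercise; extend the tuple of sizes to cover the full range):

```python
import random
import statistics

def play_once(rng):
    """Play one game: flip until heads comes up; the payout doubles with each tail."""
    payout = 1
    while rng.random() < 0.5:  # treat this branch as "tails"
        payout *= 2
    return payout

def mean_payout(n_games, rng):
    """Average winnings over a series of n_games plays."""
    return sum(play_once(rng) for _ in range(n_games)) / n_games

rng = random.Random(42)  # fixed seed so runs are repeatable
for n in (1_000, 10_000):  # add larger sizes for the full exercise
    median = statistics.median(mean_payout(n, rng) for _ in range(100))
    print(f"{n} games per series: median mean payout ${median:.2f}")
```

Because a handful of lucky long streaks dominate the *mean*, the exercise asks for the *median* of the series means; the median is far more stable from run to run.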

If you did the exercise above, you may have noticed that the average amount won increases logarithmically with the number of times played. This amount appears to be about 3 + log_{2}`p`, where `p` is the number of plays. A logarithmic function grows without bound, but it does so much more slowly than, say, a linear or polynomial function. So, to obtain an average win of, say, $25, based on the above, you'd have to play 2^{22} times (over four million!). That's a lot of times, and you're still not really making that much, on average, per play.

This "long wait" is one of the ways of resolving this paradox. If you wait "long enough," any losses incurred will eventually be recovered through a long streak of tails. However, for even a constant bet of $12.50, the expected time that it will take to recover the money is very long. To play over four million times at one play per minute, you would need almost eight years. The human lifespan just isn't long enough to ever be able to obtain long streaks of heads that are needed for the big payoffs. You could flip coins for the rest of your life and never get, for example, 30 heads in a row. Bonus question: If you flipped coins at the rate of 1 per second without stopping, how long would it take to get 30 heads in a row, on average?
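For the bonus question, a standard result helps: the expected number of fair-coin flips before first seeing a run of k consecutive heads is 2^{k+1} − 2. A quick Python check (a sketch; the year length of 365.25 days is an assumption):

```python
def expected_flips_for_run(k):
    """Expected fair-coin flips until the first run of k consecutive heads:
    2^(k+1) - 2, a standard result for a fair coin."""
    return 2 ** (k + 1) - 2

flips = expected_flips_for_run(30)
seconds_per_year = 60 * 60 * 24 * 365.25
print(flips, flips / seconds_per_year)  # ~2.1 billion flips, roughly 68 years
```

At one flip per second, 30 heads in a row takes, on average, most of a lifetime.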

This leads to another consideration. While, in theory, this game is very profitable, it fails to be when real-world limitations are applied to it. If you're playing at a stake of $12.50 per toss, you might have to spend $100,000,000 and (as already mentioned) wait eight years before you start to break even. Before that point, most people will have lost all of their capital and will have to quit before getting that lucky break. Furthermore, in real life, the person running the game won't have an infinite amount of money to pay out. Assuming that he's only able to pay a prize of, say, $1 million, the expectation for the game drops from infinity to about $10.
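The effect of a bankroll cap is easy to compute directly. A minimal sketch in Python, using the $1 million cap mentioned above:

```python
def capped_expectation(cap):
    """Expected payout when the house can pay at most `cap` dollars."""
    total = 0.0
    n = 1
    while 2 ** (n - 1) < cap:
        total += (0.5 ** n) * 2 ** (n - 1)  # each uncapped term contributes $0.50
        n += 1
    # 0.5 ** (n - 1) is the probability the game reaches toss n or beyond,
    # and every such streak pays only the cap.
    total += (0.5 ** (n - 1)) * cap
    return total

print(capped_expectation(1_000_000))  # about $10.95
```

Twenty terms of $0.50 each, plus a small tail for the capped streaks, which agrees with the "about $10" figure above.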

Even if he could pay tremendous sums of money, and you lucked out and got an enormously long string of tails, the astronomical sums of money would lose their meaning. While $1,000,000,000,000,000,000 (one quintillion dollars) is 1,000 times greater than $1,000,000,000,000,000 (one quadrillion dollars), to most people these are just incomprehensible sums of money. Besides, what could you buy for one quintillion dollars that you couldn't buy for one quadrillion? This is related to the economic concept of *utility*, which is how Daniel Bernoulli explained the paradox: the more money you have, the less each additional marginal amount is worth to you. For example, most people will find the difference between having $0 and $10,000 much greater than the difference between having $1,000,000,000 and $1,000,010,000, even though the difference is $10,000 in each case. When you take into account the fact that large wins aren't as useful as they might seem numerically, the game's expectation drops precipitously. This idea of utility also reflects the difference between money and wealth: wealth is the capacity to acquire "stuff" that you need or want, and if you have a quintillion dollars, you don't have much more wealth than if you have a quadrillion dollars, because there's nothing you couldn't buy for a quadrillion.
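Bernoulli's utility function was logarithmic, and under it the game's value becomes finite. A sketch of the calculation (the truncation at 200 terms is an illustrative choice, and the certainty equivalent here ignores the player's other wealth; the dropped tail is negligible):

```python
import math

def expected_log_utility(terms=200):
    """E[ln(payout)] for the game, truncated after `terms` tosses."""
    # P(heads first on toss n) = 2^-n, payout = 2^(n-1), so ln(payout) = (n-1) ln 2
    return sum((0.5 ** n) * (n - 1) * math.log(2) for n in range(1, terms + 1))

eu = expected_log_utility()
print(eu, math.exp(eu))  # expected utility ln 2 ≈ 0.693; certainty equivalent ≈ $2
```

In other words, a player who values money logarithmically would rate the game as worth about $2: a striking drop from infinity.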

See also: Two Envelopes paradox.

Sources used (see bibliography page for titles corresponding to numbers): 53.