The St. Petersburg paradox is a paradox in the field of game theory. Consider a game in which
a coin is flipped until heads comes up. If it comes up on the
first toss, you win $1. If it comes up on the second toss, you
win $2. If it comes up on the third toss, you win $4. In general,
if it comes up on the nth toss, you win $2^(n-1).
The expectation (the average amount you can expect to win) for this
game is found by multiplying each outcome's probability by its payout
and summing over all outcomes.
This gives us
(1/2) × $1 + (1/4) × $2 + (1/8) × $4 + (1/16) × $8 + . . .
= 1/2 + 1/2 + 1/2 + 1/2 + . . .
Since this series has infinitely many terms, each equal to 1/2, the sum diverges: the expectation for this game is infinite. The paradox lies in the fact that an infinite expectation doesn't make sense, since it is only possible to win a finite amount of money on any play. The infinite expectation does represent the mean amount that you would win per play if you could play an infinite number of times, but it doesn't correspond to any possible outcome of a single play. Furthermore, virtually the entire sum comes from terms representing extremely unlikely events, and you would have to play for a very long time before any of them paid off. As you keep playing, your average winnings will continue to increase without end, but they will do so very, very slowly.
Something to try:
Obviously, this is optional, but it's highly instructive. Write a computer program to simulate this game. Extend it to play a large number of games (try 1,000, 10,000, 100,000, and 1,000,000) and output the average payout. Then, extend it to simulate 100 series each of 1, 10, 100, 1,000, 10,000, 100,000, and 1,000,000 games. For each 100-series set, find the median of the mean amount won. Plot a graph of length of game versus median of mean amount won. If possible, plot the points on a logarithmic scale. Once you've finished doing so, come back and continue reading.
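If you'd like to compare notes afterward, here is one possible sketch in Python (function names are my own; the series sizes are kept small so it runs quickly, and you can extend the tuple for the full exercise):

```python
import random
import statistics

def play_once(rng):
    """Play one game: flip until heads; the payout doubles for each tails."""
    payout = 1
    while rng.random() < 0.5:  # this flip came up tails, so keep flipping
        payout *= 2
    return payout  # $2^(n-1) if heads first appeared on toss n

def mean_payout(num_plays, rng):
    """Average winnings over a series of num_plays games."""
    return sum(play_once(rng) for _ in range(num_plays)) / num_plays

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so runs are reproducible
    # Extend this tuple (e.g. to 1_000_000) for the full exercise.
    for p in (100, 1_000, 10_000):
        med = statistics.median(mean_payout(p, rng) for _ in range(100))
        print(f"{p:>7} plays: median mean payout ${med:.2f}")
```

Taking the median of each set of 100 means, rather than the overall mean, keeps one freak jackpot in a single series from swamping the picture.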
If you did the exercise above, you may have noticed that the average amount won increases logarithmically with the number of times played. This amount appears to be about 3 + log2(p), where p is the number of plays. A logarithmic function does grow without bound, but it approaches infinity much more slowly than most other functions. So, to obtain an average win of, say, $25, based on the above, you'd have to play 2^22 times (over four million!). That's a lot of times, and you're still not really making that much, on the average, per play.
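Taking the rule of thumb above at face value, we can back out how many plays a given average requires. A quick sanity check:

```python
# Rule of thumb from the text: mean winnings ≈ 3 + log2(p) after p plays.
# Solving 3 + log2(p) = 25 for p gives p = 2^(25 - 3).
target = 25
plays = 2 ** (target - 3)
minutes_per_year = 60 * 24 * 365.25
print(f"{plays:,} plays")  # 4,194,304
print(f"{plays / minutes_per_year:.1f} years at one play per minute")
```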
This leads to another consideration that also appeared in the previous probability problem. In that problem, a strategy for winning games of chance, which worked perfectly in theory, failed when real-world limitations were applied to it. The same is true here. To play over four million times at one play per minute, you would need almost eight years. The human lifespan just isn't long enough to ever be able to obtain long streaks of heads that are needed for the big payoffs. You could flip coins for the rest of your life and never get, for example, 30 heads in a row. Bonus problem: If you flipped coins at the rate of 1 per second without stopping, how long would it take to get 30 heads in a row, on average?
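If you'd like to check your answer to the bonus problem (spoiler ahead), there is a standard result for a fair coin: the expected number of flips before the first run of k consecutive heads is 2^(k+1) - 2, which follows from the recurrence E(k) = 2 × (E(k-1) + 1) with E(0) = 0. A quick calculation:

```python
def expected_flips_for_streak(k):
    """Expected flips of a fair coin until the first run of k heads in a row.

    Standard result: E(k) = 2**(k + 1) - 2, from E(k) = 2 * (E(k - 1) + 1).
    """
    return 2 ** (k + 1) - 2

seconds = expected_flips_for_streak(30)  # one flip per second
years = seconds / (60 * 60 * 24 * 365.25)
print(f"{seconds:,} seconds, or about {years:.0f} years")
```

At one flip per second, around the clock, the average wait for 30 heads in a row works out to roughly 68 years — most of a lifetime.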
Furthermore, in real life, the person running the game won't have an infinite amount of money to pay out. Assuming that he's only able to pay a prize of, say, $1 million, the expectation for the game drops from infinity to about $11: the cap takes effect after the twentieth toss, leaving twenty terms of 1/2 plus a tail worth less than a dollar. Even if he could pay tremendous sums of money, and you lucked out and got an enormously long string of heads, the astronomical sums of money would lose their meaning. While $1,000,000,000,000,000,000 is 1,000 times greater than $1,000,000,000,000,000, to most people these are just incomprehensible sums of money. Besides, what could you buy for $1,000,000,000,000,000,000 that you couldn't buy for $1,000,000,000,000,000? This is related to the economic concept of utility.
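To see where the capped figure comes from, note that under the payout scheme above ($2^(n-1) on toss n), every toss up to the twentieth contributes 1/2, and all later tosses simply pay the cap. A sketch of the computation:

```python
def capped_expectation(cap, terms=200):
    """Expectation when the banker can pay at most `cap` dollars.

    Toss n pays min(2**(n - 1), cap) with probability 2**-n; the
    truncated tail beyond `terms` tosses is added back in closed form.
    """
    ev = sum(min(2 ** (n - 1), cap) * 2.0 ** -n for n in range(1, terms + 1))
    return ev + cap * 2.0 ** -terms  # all payouts beyond `terms` equal cap

print(f"${capped_expectation(1_000_000):.2f}")  # about $10.95
```

Doubling the banker's bankroll adds only about fifty cents to the expectation — another way of seeing how slowly the value of this game grows.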