Mathematical fallacies are errors in a mathematical proof or argument, typically committed with some intent to deceive. A fallacy in an argument doesn't necessarily mean that its conclusion is incorrect, only that the argument itself is invalid. Fallacious arguments can, however, lead to surprising conclusions, as shown below:
| Step | Operation | Result |
| --- | --- | --- |
| 1 | Let | x = 1 |
| 2 | Multiply each side by x | x² = x |
| 3 | Subtract 1 from both sides | x² − 1 = x − 1 |
| 4 | Divide both sides by x − 1 | (x² − 1)⁄(x − 1) = 1 |
| 5 | Simplify | (x − 1)(x + 1)⁄(x − 1) = 1 |
|   |   | x + 1 = 1 |
| 6 | Subtract 1 from both sides | x = 0 |
| 7 | Substitute the value of x from step 1 | 1 = 0 |
The fallacy here is subtle. In step 2, multiplying both sides by x introduces the extraneous solution x = 0: the equation x² = x is satisfied by both x = 1 and x = 0. Then, in step 4, both sides are divided by x − 1, which is an illegal operation because x − 1 = 0 and division by zero is undefined. This illegal division discards the original solution x = 1 and leaves the extraneous solution x = 0 as the only solution to the equation.
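If you want to watch the extraneous solution appear, here is a minimal sketch in Python, assuming the SymPy library is available (the code is purely illustrative):

```python
from sympy import Eq, S, solveset, symbols

x = symbols('x')

# Step 1: the original equation x = 1 has exactly one solution.
print(solveset(Eq(x, 1), x, domain=S.Reals))       # {1}

# Step 2: multiplying both sides by x gives x**2 = x, which
# picks up the extraneous solution x = 0.
print(solveset(Eq(x**2, x), x, domain=S.Reals))    # {0, 1}

# Steps 4-5: after the illegal division by x - 1, only x + 1 = 1 remains,
# and the extraneous solution x = 0 is the sole survivor.
print(solveset(Eq(x + 1, 1), x, domain=S.Reals))   # {0}
```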
You can use a similar method to "show" that any number is equal to any other number. For example, to show that 7 = −4:
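Here is one way such an argument could go, sketched in the same style as the table above (this particular version is just an illustration): multiplying by x + 4 plants the extraneous root, and dividing by x − 7 is the illegal step.

| Step | Operation | Result |
| --- | --- | --- |
| 1 | Let | x = 7 |
| 2 | Multiply each side by x + 4 | x² + 4x = 7x + 28 |
| 3 | Subtract 7x + 28 from both sides | x² − 3x − 28 = 0 |
| 4 | Factor the left side | (x − 7)(x + 4) = 0 |
| 5 | Divide both sides by x − 7 | x + 4 = 0 |
| 6 | Subtract 4 from both sides | x = −4 |
| 7 | Substitute the value of x from step 1 | 7 = −4 |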
If you've taken mathematics in high school or in first-year university or college, you may have learned that an infinite geometric series has a sum only if it is convergent; that is, only if the common ratio between each term and the previous term is greater than −1 and less than 1. Geometric series that are not convergent cannot be summed. If you ignore this, you can come up with all kinds of strange results. For example, consider the series

S = 1 − 1 + 1 − 1 + 1 − 1 + …

Grouping the terms in pairs gives

S = (1 − 1) + (1 − 1) + (1 − 1) + … = 0 + 0 + 0 + … = 0

However, we could also group the series differently:

S = 1 + (−1 + 1) + (−1 + 1) + (−1 + 1) + … = 1 + 0 + 0 + 0 + … = 1

Since S is equal to both 0 and 1, it follows that 1 = 0.
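A quick numerical check, a small Python sketch and nothing more, makes the trouble visible: the partial sums of this series never settle on any value.

```python
# Partial sums of 1 - 1 + 1 - 1 + ... oscillate between 1 and 0,
# so the series has no limit and therefore no sum in the ordinary sense.
partial_sum = 0
for n in range(8):
    partial_sum += (-1) ** n              # terms: +1, -1, +1, -1, ...
    print(f"after {n + 1} terms: {partial_sum}")   # 1, 0, 1, 0, ...
```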
Here's another example. Consider the geometric series

S = 1 + 2 + 4 + 8 + 16 + …

Subtract 1 from both sides of the equation:

S − 1 = 2 + 4 + 8 + 16 + …

Now, we can multiply the original equation by 2, getting:

2S = 2 + 4 + 8 + 16 + …

The second and third equations have the same right-hand side, so:

S − 1 = 2S

Solving gives S = −1. But it is absurd that the sum of this series could be negative, since all of its terms are positive. The fallacy lies in assuming that a divergent series has a sum at all; this series has common ratio 2, so it is not convergent.
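The same kind of sketch shows what the partial sums of 1 + 2 + 4 + 8 + … actually do: they grow without bound and are never anywhere near −1.

```python
# Partial sums of 1 + 2 + 4 + 8 + ... with common ratio 2.
# They grow without bound, so the series diverges and "S = -1" is meaningless.
partial_sum = 0
for n in range(12):
    partial_sum += 2 ** n                 # terms: 1, 2, 4, 8, ...
    print(f"after {n + 1} terms: {partial_sum}")   # 1, 3, 7, 15, 31, ...
```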
10¢ is equal to 1⁄10 of a dollar. If we square both quantities, we appear to get 100¢ = 1⁄100 of a dollar. But 1⁄100 of a dollar is just 1¢! Where did the other 99¢ go? The answer is that we didn't treat the units properly. When we square 10¢, the units are not ¢ and $ but ¢² (cents squared) and $² (dollars squared), and the conversion factor between square dollars and square cents is 100² = 10,000, not 100. With the right conversion factor, 100¢² really is 1⁄100 of a square dollar, and the paradox vanishes.
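A small Python sketch that keeps track of the units explicitly (the constant name is just for illustration) shows the bookkeeping:

```python
from fractions import Fraction

# Track units explicitly: squaring a quantity also squares its unit.
CENTS_PER_DOLLAR = 100

cents = 10                                      # 10 cent
dollars = Fraction(cents, CENTS_PER_DOLLAR)     # 1/10 dollar

cents_squared = cents ** 2                      # 100 cent^2
dollars_squared = dollars ** 2                  # 1/100 dollar^2

# The conversion factor between dollar^2 and cent^2 is 100**2 = 10,000,
# so the two squared amounts agree and no money goes missing.
print(dollars_squared * CENTS_PER_DOLLAR ** 2)  # 100  (cent^2)
print(cents_squared)                            # 100  (cent^2)
```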
You may also be interested in logical fallacies, in mathematical paradoxes, or in a "proof" that you never go to school.