When I was in elementary school, my school had several Commodore 64 computers (I know, I'm dating myself here). The Commodore 64 was the first computer that I ever programmed. The first program of any real size, written in BASIC, that I can recall writing by myself was a number guessing game. As I recall, the program would randomly select a number between 1 and a number that the user input. The user would then make guesses as to what the randomly selected number was. The program would tell the user if the guess was too high or too low, or if it was correct. Of course, you could also play this game with another person instead of a computer.

Say that you were playing this game, and you were guessing a number between 1 and 1,000. Would one number be a better first guess than another? How many guesses should it take to guess the number?

If you think about it, a first guess of 1 probably isn't a good idea. Unless the program happened to select the number 1, it would simply tell you that your guess was too low, eliminating just that one possibility. A guess like 500 would be much better: whether it turns out to be too low or too high, you've eliminated around 500 possibilities either way.

Let's say your first guess of 500 is too low. For your next guess, you could choose a number around 750, which eliminates half of the remaining numbers. Continuing this halving pattern guarantees that you'll find any number between 1 and 1,000 in at most 10 guesses. For example, if the number chosen were 1,000, you could find it in 10 guesses: 500, 750, 875, 938, 969, 985, 993, 997, 999, 1000.
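As a sketch, the halving strategy looks like this in Python (my original, of course, was in Commodore BASIC). Here the "too high/too low" feedback comes from a simple comparison rather than a real opponent:

```python
def guess_number(secret, low=1, high=1000):
    """Return the list of guesses made while homing in on the secret."""
    guesses = []
    while True:
        guess = (low + high) // 2  # always guess the midpoint
        guesses.append(guess)
        if guess == secret:
            return guesses
        elif guess < secret:
            low = guess + 1   # too low: discard the bottom half
        else:
            high = guess - 1  # too high: discard the top half

print(guess_number(1000))
# → [500, 750, 875, 938, 969, 985, 993, 997, 999, 1000]
```

Note that the midpoint guesses reproduce exactly the sequence above, and no secret between 1 and 1,000 takes more than 10 guesses.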

What if the game were changed so that you had to guess a number between 1 and 10,000? Or between 1 and 1,000,000? How many guesses would you need then?
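One way to answer these questions: each guess with the halving strategy cuts the remaining candidates roughly in half, so the worst case is the number of times you can halve the range, which works out to the base-2 logarithm of the range size, rounded up:

```python
import math

def worst_case_guesses(n):
    """Worst-case guesses for the halving strategy on a range of n numbers."""
    return math.ceil(math.log2(n))

print(worst_case_guesses(1_000))      # → 10
print(worst_case_guesses(10_000))     # → 14
print(worst_case_guesses(1_000_000))  # → 20
```

So widening the range a thousandfold only costs about ten extra guesses.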

What if we changed the information that's given? For example, take the game "Bagels," which I found on page 61 of the book Math for Smarty Pants (the game may have been published elsewhere before, but I don't know offhand). In this game, one player chooses a three-digit number. Whenever the other player makes a guess, the player who chose the number gives clues as follows:

- **Pico** means that one of the guessed digits is correct but is in the wrong place.
- **Fermi** means that one of the guessed digits is correct and is in the correct place.
- **Bagels** means that none of the guessed digits are correct.
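The clue-giving logic can be sketched like so. This is my reading of the rules, and it assumes the secret and the guess are three-digit strings with no repeated digits (repeats make the clue rules ambiguous, so I exclude them here):

```python
def bagels_clues(secret, guess):
    """Return the sorted list of clues for a guess, assuming distinct digits."""
    clues = []
    for position, digit in enumerate(guess):
        if digit == secret[position]:
            clues.append("Fermi")  # right digit, right place
        elif digit in secret:
            clues.append("Pico")   # right digit, wrong place
    return sorted(clues) or ["Bagels"]  # no digit correct at all

print(bagels_clues("123", "135"))  # → ['Fermi', 'Pico']
print(bagels_clues("123", "456"))  # → ['Bagels']
```

Sorting the clues matters: the clue-giver shouldn't reveal *which* guessed digit earned each clue.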

On this site's sister site All Fun and Games, I've created a version of "Bagels" that you can try (although I've replaced the clue names with blue and orange stars).

In this game, how many guesses would you need to be guaranteed to guess the number? Play the game for a while and make a guess as to what the minimum required number of guesses might be. I've found that, if I think carefully about each guess, I can almost always guess the number in 8 guesses.

It turns out that this question has been studied for a closely related game. Bagels is similar in principle to Mastermind, a game created in 1970. Mastermind is played as follows: one player selects a sequence of four colours, where the colour for each position may be chosen from among six colours. The second player then tries to guess the sequence and is given hints of the same sort as in Bagels. In 1977, Donald Knuth showed that there is a strategy that allows the player to always determine the sequence of colours in no more than 5 guesses. In 1996, Zhixiang Chen and others published an optimal strategy for Mastermind with any number of colours and positions. Getting back to Bagels: if you permitted leading zeroes, the problem would boil down to solving a generalized Mastermind with 10 colours and 3 positions.
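To get a feel for why such strategies exist, here's a sketch of the simplest reasonable approach for the 10-colour, 3-position case: after each guess, keep only the candidates consistent with every clue received so far, and always guess one of them. (This is *not* Knuth's minimax strategy, which picks guesses that minimize the worst-case surviving candidates; it's just the naive consistent-guessing approach.)

```python
from itertools import product

def score(secret, guess):
    """Mastermind-style feedback: (right place, right digit wrong place)."""
    fermi = sum(s == g for s, g in zip(secret, guess))
    common = sum(min(secret.count(d), guess.count(d)) for d in set(guess))
    return fermi, common - fermi

def solve(secret, digits="0123456789", length=3):
    """Guess repeatedly, discarding candidates inconsistent with feedback.

    Returns the number of guesses taken to find the secret.
    """
    candidates = ["".join(p) for p in product(digits, repeat=length)]
    guesses = 0
    while True:
        guess = candidates[0]
        guesses += 1
        if guess == secret:
            return guesses
        feedback = score(secret, guess)
        candidates = [c for c in candidates
                      if score(c, guess) == feedback]
```

Even this naive strategy finds any three-digit secret quickly, because each round of feedback typically eliminates the vast majority of the 1,000 candidates.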

Sources used (see bibliography page for titles corresponding to numbers): 32, 54.