Suppose you want to slice a cake, and you want to know how many pieces you get with 1 cut, or 2 cuts, or 3 cuts, etc. You don't have to start slicing from the middle, and each piece of cake need not be the same size, but you have to slice all the way through the cake.
So you get slices, and pieces of cake, like this:
The number below each circle is the number of pieces of cake. Each dot is where your knife enters or leaves the edge of the cake.
For n dots on the edge of the cake, with a cut connecting every pair of dots, how many pieces of cake will you get?
For a small number of dots, it's easy to count up the pieces:
At this point most people would notice a pattern, and guess/assume the number of pieces of cake is 2^(n-1).
But, surprisingly, that's wrong.
For n=6, the number of pieces is 31, not 32 as one would likely guess.
For n=7 the number of pieces is 57, not 2^6 = 64. And so on: 1, 2, 4, 8, 16, 31, 57, 99, 163, 256...
That's kind of wonderful -- it shows you shouldn't trust a pattern or rule inferred from a limited number of trials.
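These counts can be checked without guessing at a pattern: build the figure and count regions with Euler's formula for planar graphs, V - E + F = 2. Below is a minimal Python sketch; the slightly perturbed angles are an assumption to keep the dots in "general position," so that no three cuts pass through a single interior point.

```python
from itertools import combinations
import math

def regions(n):
    # n dots on the unit circle, perturbed so no three chords are concurrent
    angles = [2 * math.pi * i / n + 0.01 * i * i for i in range(n)]
    pts = [(math.cos(a), math.sin(a)) for a in angles]
    chords = list(combinations(range(n), 2))

    def crosses(c1, c2):
        # chords sharing an endpoint never cross in the interior
        if set(c1) & set(c2):
            return False
        (a, b), (c, d) = (pts[c1[0]], pts[c1[1]]), (pts[c2[0]], pts[c2[1]])
        def orient(p, q, r):
            return (q[0]-p[0]) * (r[1]-p[1]) - (q[1]-p[1]) * (r[0]-p[0])
        # segments cross strictly iff each straddles the other's line
        return (orient(a, b, c) * orient(a, b, d) < 0 and
                orient(c, d, a) * orient(c, d, b) < 0)

    I = sum(crosses(c1, c2) for c1, c2 in combinations(chords, 2))
    V = n + I                   # boundary dots + interior crossings
    E = len(chords) + 2 * I + n # chord segments + boundary arcs
    F = 2 - V + E               # Euler's formula: V - E + F = 2
    return F - 1                # drop the face outside the circle

print([regions(n) for n in range(1, 8)])  # [1, 2, 4, 8, 16, 31, 57]
```

Because the crossings are found geometrically, this is an independent check of the counts, not a restatement of any formula.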
In mathematics this is called Moser's circle problem, and the slices are called "chords." The formula for the number of pieces is

$$\binom{n}{4} + \binom{n}{2} + 1,$$

where the objects in parentheses are called combinations -- the first is the number of combinations of n objects taken 4 at a time, without regard to order, and so on. Here are a couple of ways of solving the problem.
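Before getting to the derivations, the formula is easy to check numerically; a short Python sketch using the standard library's math.comb:

```python
from math import comb

def pieces(n):
    # regions formed by cuts between every pair of n dots in general position
    return comb(n, 4) + comb(n, 2) + 1

print([pieces(n) for n in range(1, 11)])
# [1, 2, 4, 8, 16, 31, 57, 99, 163, 256]
```

The powers of 2 hold only through n=5, then the binomial terms fall behind the doubling.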
This was proved in 1987, showing that not all simple but interesting math problems were solved, or even proposed, 300 years ago. (Moser formulated another seemingly simple problem, the worm problem, in 1966, and it's not solved yet.)
And it reminds us to be careful in our thinking, more careful than most of us usually are.