Tags: Math
This "paradox" is trivially solved with basic probability and expectation calculations, but it's still very cool:

Assume a casino roulette wheel. In the US, a roulette wheel has 38 numbers: 0, 00, 18 black numbers and 18 red numbers. Let's assume for simplicity that our wheel has 50 red, 50 black and 1 green (0) numbers.

Now, on one hand, it's a well-known fact that the casino wins at roulette. Say, for instance, that I bet \$100 on red. There's a 50/101 chance that I win, and a 51/101 chance that I lose. In the long run, I lose and the casino wins.

But let's consider the following strategy:

I bet \$1 on red. If I win, I've won a dollar. If I lose, I double my bet - that is, I bet \$2 on red. If I win, I pocket my winnings and again start betting with \$1; if I lose, I double my bet again - \$4.

Think about it... eventually, red will appear (the chance is a little smaller than 1/2 on each spin). When it does, I've won exactly \$1, because my current bet is the sum of all my previous bets + 1:

2^N - 1 = Sum{n = 0 to N-1} 2^n

In more familiar terms, think of a binary tree: in a full binary tree, the number of leaves is always the number of non-leaf nodes + 1.
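The identity is easy to verify numerically; here is a quick Python check:

```python
# A winning bet of 2^N pays its own size in profit, which equals
# the total of all N previous losing bets (2^0 + ... + 2^(N-1)) plus $1.
for N in range(1, 11):
    assert 2**N - 1 == sum(2**n for n in range(N))
print("2^N - 1 == 2^0 + 2^1 + ... + 2^(N-1) holds for N = 1..10")
```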

In short, in each series of betting \$1, then \$2, then \$4, and so on UNTIL I WIN, I will eventually earn exactly \$1. And hey, the chance that NO red appears is very low even after just 4-5 spins.
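The series described above can be sketched as a small simulation (a sketch, assuming the simplified 50/101 red probability from earlier; `martingale_series` is a name I made up):

```python
import random

def martingale_series(p_red=50/101, rng=random):
    """Play one series: bet $1, double after each loss, stop at the
    first win. Returns the net profit of the series."""
    bet, spent = 1, 0
    while True:
        spent += bet
        if rng.random() < p_red:   # red comes up: the current bet wins
            return 2 * bet - spent # payout minus everything wagered so far
        bet *= 2                   # black or green: double and spin again

random.seed(0)
profits = [martingale_series() for _ in range(10_000)]
print(set(profits))  # every completed series nets exactly $1
```

Since the total wagered before a win at bet 2^N is 2^(N+1) - 1 and the winning payout is 2^(N+1), every series that completes nets exactly \$1, regardless of how long the losing streak was.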

So, why can't I rob the casino this way :-) ?
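One sketch of the "basic expectancy calculation" mentioned at the top: the catch is that my bankroll is finite. Assuming (my illustrative numbers, not from the post) a bankroll of \$1023, which covers exactly 10 doubled bets, the expectation on the simplified 50/101 wheel comes out negative:

```python
# Expected profit of one doubling series with a finite $1023 bankroll,
# which covers exactly 10 bets: 1 + 2 + ... + 512 = 1023.
p_lose = 51/101                 # a single spin comes up black or green
q = p_lose ** 10                # all 10 bets lose: the whole bankroll is gone
ev = (1 - q) * 1 + q * (-1023)  # usually win $1, rarely lose $1023
print(f"P(ruin) = {q:.5f}, expected profit per series = ${ev:.4f}")
```

Each series almost always wins \$1, but the rare ruin wipes out the entire \$1023, and that rare event is just costly enough to keep the expectation on the casino's side.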