This week's Riddler presents us with an interesting casino game:

Suppose a casino invents a new game that you must pay $250 to play. The game works like this: The casino draws random numbers between 0 and 1, from a uniform distribution. It adds them together until their sum is greater than 1, at which time it stops drawing new numbers. You get a payout of $100 each time a new number is drawn.

For example, suppose the casino draws 0.4 and then 0.7. Since the sum is greater than 1, it will stop after these two draws, and you receive $200. If instead it draws 0.2, 0.3, 0.3, and then 0.6, it will stop after the fourth draw and you will receive $400. Given the $250 entrance fee, should you play the game?

Specifically, what is the expected value of your winnings?

To answer the first part of the question ("should you play?"), let's think about it like this: the largest number that can be drawn on a single turn is 1, and the game doesn't end until the sum of the draws is *greater than* 1, so every game lasts at least two turns. And, in fact, by symmetry there's exactly a 50% chance that the sum of the first two draws exceeds 1, ending the game right there. So half the time we'll collect $200 and lose exactly $50, and the other half of the time the game lasts three or more turns and we'll win *at least* $50. Sometimes we'll win more than $50, and those cases (when the game lasts 4+ turns) push our expected profit into the black.

So we know we should play. How much do we actually expect to win by playing?
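Before working out the exact answer, we can sanity-check the game by brute force. Here's a minimal Monte Carlo sketch in Python (the function name and game count are my own choices):

```python
import random

def play_game(rng=random):
    """Simulate one game: count draws until the running sum exceeds 1."""
    total, draws = 0.0, 0
    while total <= 1:
        total += rng.random()  # uniform draw on [0, 1)
        draws += 1
    return draws

random.seed(42)
n_games = 200_000
avg_draws = sum(play_game() for _ in range(n_games)) / n_games
print(f"average draws:   {avg_draws:.3f}")
print(f"average profit: ${100 * avg_draws - 250:.2f}")
```

Running this, the average number of draws lands a little above 2.7, which hints at where the exact answer is headed.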

I think there are at least a few different ways to approach this problem. Instead of drawing from a continuous distribution, let's imagine that we divide the range from 0 to 1 evenly into n discrete numbers, each equally likely. For example, we'll let n = 5, so the pool of numbers we're drawing from is [0.2, 0.4, 0.6, 0.8, 1.0].

Now let's work backwards. Imagine we have already accumulated a sum of exactly 1. How many more draws do we expect before the game ends? Obviously just one, since any number we draw will put us over the needed total.

Let E[s] denote the expected number of draws we'll need to exceed 1, starting from a sum of s. Thus, E[1] = 1.

What about E[0.8]? We'll always need at least one more draw. In addition, 1/5 of the time that draw will be 0.2, leaving the sum at exactly 1 (not over it), and we'll end up in the E[1] case. So E[0.8] = 1 + (1/5)·E[1] = 1 + 1/5 = 6/5.

And for E[0.6]? We'll always need at least one more draw. 1/5 of the time this will put us in the E[0.8] case, and another 1/5 of the time we'll be in the E[1] case. Thus E[0.6] = 1 + (1/5)·E[0.8] + (1/5)·E[1] = 1 + (1/5)(6/5) + (1/5)(1) = 36/25 = (6/5)^2.

And E[0.4] = 1 + (1/5)·(E[0.6] + E[0.8] + E[1]) = 1 + (1/5)((6/5)^2 + 6/5 + 1) = 216/125 = (6/5)^3.

There's a pattern emerging here. In general, for a given n, E[k/n] = (1 + 1/n)^(n − k).

Ultimately we're interested in the case E[0], so letting k = 0: E[0] = (1 + 1/n)^n. With n = 5, that's (6/5)^5 = 2.48832 expected draws.
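As a sanity check, here's a short Python sketch (the helper name is hypothetical) that computes each E[k/n] by the same backward recursion: E[n/n] = 1, and E[k/n] = 1 + (1/n)·(E[(k+1)/n] + … + E[n/n]).

```python
def expected_draws(n):
    """Compute E[k/n] for k = n down to 0 by working backwards."""
    E = [0.0] * (n + 1)
    E[n] = 1.0  # from a sum of 1, exactly one more draw ends the game
    for k in range(n - 1, -1, -1):
        # One guaranteed draw, plus 1/n chance each of landing in a
        # later (still-unfinished) state j/n for j = k+1 .. n.
        E[k] = 1.0 + sum(E[k + 1:]) / n
    return E

E = expected_draws(5)
print(E)     # each E[k] matches (6/5)^(5 - k)
print(E[0])  # ≈ 2.48832, i.e. (6/5)^5
```

Every entry matches the closed form (1 + 1/n)^(n − k), confirming the pattern.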

Finally, to go back to the continuous case of the original puzzle, let n → ∞: E[0] = lim(n→∞) (1 + 1/n)^n = e ≈ 2.71828. The expected number of draws is e.
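A quick numerical check in Python that (1 + 1/n)^n does approach e as n grows:

```python
import math

# E[0] for the discretized game is (1 + 1/n)^n, which converges to e.
for n in (5, 100, 10_000, 1_000_000):
    print(f"n = {n:>9}: (1 + 1/n)^n = {(1 + 1/n) ** n:.6f}")
print(f"e             = {math.e:.6f}")
```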

So if it costs $250 to play and we win $100 for each draw, we expect to win $100e − $250 ≈ $21.83 each time we play.