I’m not much of a mathematician, but I do like to go to Atlantic City, so this is my take on the paradox. I suspect this approach has been taken by someone else before, and I would appreciate any feedback.
It appears to me that whatever fee one decides to pay, the odds of losing that fee (the investment) are 50%, since in the version of the game I have in mind a tails on the very first flip pays nothing.
What are the odds of at least breaking even?
If your fee is $2, the odds are 50%.
If your fee is $4, the odds are 25%.
If your fee is $8, the odds are 12.5%.
If your fee is $16, the odds are 6.25%.
If your fee is $32, the odds are 3.125%.
If your fee is $64, the odds are about 1.6% (1/64 exactly).
If your fee is $128, the odds are about 0.78% (1/128 exactly).
Therefore the odds of coming out behind on a $128 fee are about 99%, a very bad bet.
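To double-check the list above, here is a minimal Python sketch, assuming (as the figures imply) a version of the game where a tails on the very first flip pays nothing and the pot otherwise doubles: $2, $4, $8, … for one, two, three, … heads in a row. The function name and the fee range are just illustrative choices.

```python
# Exact odds behind the list above, assuming a version of the game where a
# tails on the very first flip pays nothing and the pot otherwise doubles:
# $2, $4, $8, ... for one, two, three, ... heads in a row.

def prob_break_even(fee):
    """Probability the payout is at least `fee`, for fee = $2, $4, $8, ..."""
    # The pot reaches the fee only if the coin shows log2(fee) heads in a row,
    # which happens with probability (1/2) ** log2(fee).
    heads_needed = fee.bit_length() - 1   # log2 of a power of two
    return 0.5 ** heads_needed

for fee in (2, 4, 8, 16, 32, 64, 128):
    print(f"fee ${fee:>3}: odds of at least breaking even = {prob_break_even(fee):.4%}")

# Odds of coming out behind (not breaking even) on a $128 fee:
print(f"odds of losing money at $128: {1 - prob_break_even(128):.1%}")
```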
Since the odds of losing one’s fee are never in one’s favor, a reasonable person would not play the game at $2. If you don’t play, there is a 100% chance of keeping your $2; if you play, there is only a 50% chance of ending up with at least $2.
Playing at a higher fee is even more unreasonable….
…. on the other hand, if you pay only $1 you still have a 50% chance of losing that dollar, but you also have a 50% chance of at least DOUBLING your investment. Paying less keeps you at the same 50/50 split, but the upside grows: at a lower fee you can triple, quadruple… your investment.
So I think a reasonable person could play for less than $2.
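For anyone who would rather test the “play for less than $2” idea at the keyboard than at the casino, here is a quick Monte Carlo sketch under the same assumed rules (a first-flip tails pays $0, otherwise the pot doubles from $2); the trial count and the fees tried are arbitrary choices for illustration.

```python
# Quick Monte Carlo check of the "play for less than $2" idea, under the same
# assumed rules: a first-flip tails pays $0, otherwise the pot doubles from $2.
# The trial count and the fees tried are arbitrary choices for illustration.
import random

def play_once():
    """Return one game's payout: $0 on an immediate tails, else $2, $4, $8, ..."""
    payout = 0
    pot = 2
    while random.random() < 0.5:   # heads with probability 1/2
        payout = pot
        pot *= 2
    return payout

trials = 100_000
for fee in (1, 2, 4):
    outcomes = [play_once() for _ in range(trials)]
    behind = sum(p < fee for p in outcomes) / trials
    doubled = sum(p >= 2 * fee for p in outcomes) / trials
    print(f"fee ${fee}: behind {behind:.1%} of the time, "
          f"doubled or better {doubled:.1%} of the time")
```

At a $1 fee the run should come out behind about half the time and at least double about half the time; at $2 the chance of doubling falls to roughly 25%, and at $4 the chance of falling behind climbs to about 75%.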
I think utility theory comes into play only when you change the starting amount of the game. As a middle-class, cautious gambler, I would be comfortable paying a $20 fee for a game that started at $20, and a $50 fee for a game that started at $100.
I would not be comfortable paying a $100 fee for a game that started at $100 or $200.
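On the utility point, here is a rough sketch of the classic treatment (Bernoulli’s log utility) applied to the same assumed game; the $10,000 bankroll, the 200-term cutoff, and the one-cent search step are illustrative assumptions, not figures from the comment above.

```python
# Rough sketch of Bernoulli's log-utility answer for the same assumed game
# (a first-flip tails pays $0, otherwise $2, $4, $8, ...): the largest fee a
# player with a given bankroll would pay. The $10,000 bankroll, 200-term
# cutoff, and one-cent search step are illustrative assumptions only.
import math

def expected_log_wealth(wealth, fee, terms=200):
    """Expected log of final wealth after paying `fee` to play once."""
    total = 0.5 * math.log(wealth - fee)        # first-flip tails: payout $0
    for heads in range(1, terms + 1):
        payout = 2 ** heads                     # $2, $4, $8, ...
        prob = 0.5 ** (heads + 1)               # that many heads, then tails
        total += prob * math.log(wealth - fee + payout)
    return total

wealth = 10_000.0
no_play = math.log(wealth)                      # utility of not playing at all
fee = 0.0
while expected_log_wealth(wealth, fee + 0.01) >= no_play:
    fee += 0.01                                 # raise the fee until playing stops being worth it
print(f"largest fee worth paying on a ${wealth:,.0f} bankroll: about ${fee:.2f}")
```

On this view the acceptable fee grows with the bankroll and with the game’s starting pot, which seems to line up with the instinct that a bigger starting amount justifies a bigger fee.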