weaselman Joined: Jul 11, 2010
• Posts: 2349
September 5th, 2010 at 9:22:55 AM permalink
Quote: jfalk

This problem has indeed stumped a lot of people, but the answer is fairly simple. The reason you can't do an expected value calculation here is that there is no proper (i.e., integrates to 1) distribution with the feature that the probability of 2x equals the probability of x, except for the somewhat silly distribution which is zero with probability 1.

This is exactly what I thought before, but the discussion in this thread convinced me that this "solution" is wrong.
First of all, the probabilities do not have to be equal. As long as the probability of 2x is more than 1/3, it is advantageous to switch. An example of such a distribution is putting 2^n and 2^(n+1) into the envelopes with probability 2^n/3^(n+1). These probabilities do add up to 1, and they make the expected value of switching (11/10)x regardless of the value of x.
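A quick sanity check of this construction (a sketch of my own, in Python, using exact rational arithmetic):

```python
from fractions import Fraction

# Pair n contains (2^n, 2^(n+1)) with probability 2^n / 3^(n+1), n = 0, 1, 2, ...
# First, confirm the probabilities sum to 1 (the partial sums converge to 1).
partial = sum(Fraction(2**n, 3**(n+1)) for n in range(200))
print(float(partial))  # essentially 1.0

# Given you observe x = 2^k (with k >= 1; k = 0 means you surely hold the
# smaller envelope), your envelope is either the small half of pair k or the
# large half of pair k-1.  The posterior odds are
#   P(pair k) : P(pair k-1) = (2/3)^k : (2/3)^(k-1) = 2 : 3
p_double = Fraction(2, 5)   # other envelope holds 2x
p_half   = Fraction(3, 5)   # other envelope holds x/2

# Expected value of switching, in units of x:
ev_switch = p_double * 2 + p_half * Fraction(1, 2)
print(ev_switch)  # -> 11/10
```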

An even better argument against your solution is a slight modification of the problem. Suppose you are offered the chance to switch envelopes BEFORE opening yours. If you suppose that your envelope contains x dollars, the probability that the other one has 2x is 1/2, regardless of the actual distribution used to stuff the envelopes. So probability theory tells you to switch the envelope even before opening it.

It looks like decision theory really does break down on this one. It is not a trick, and there is no flaw we are missing here; it's just a case where it does not work.
"When two people always agree one of them is unnecessary"
Wizard Joined: Oct 14, 2009
• Posts: 23444
September 5th, 2010 at 9:55:12 AM permalink
Quote: MrPogle

...Before you open the envelope pick a random value you would like to win. If the envelope contains at least that amount, keep it but if it doesn't swap for the second one. There are three scenarios.

You're absolutely right that such a strategy would result in an expected value greater than the average of the two envelopes. Another such strategy is to switch with probability c/(c+x) where x is the amount in the first envelope. Set c to the amount you think the host would likely put in the average envelope. Granted this can be hard to judge, but picking any c>0 will improve your odds.
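A rough Monte Carlo check of this claim (a sketch of my own; the host's uniform distribution below is an arbitrary assumption made for illustration):

```python
import random

def play(c, trials=200_000):
    """Average winnings using the switch-with-probability-c/(c+x) rule."""
    total = 0.0
    for _ in range(trials):
        small = random.uniform(1, 100)        # assumed host distribution
        envelopes = [small, 2 * small]
        random.shuffle(envelopes)
        x = envelopes[0]                      # amount in the envelope you opened
        if c > 0 and random.random() < c / (c + x):
            x = envelopes[1]                  # switch more often when x is small
        total += x
    return total / trials

random.seed(1)
# Baseline: never switch (c = 0) earns the average of the two envelopes.
print(play(0))    # about 75.75 (= 1.5 * 50.5)
print(play(75))   # a few dollars higher than the baseline
```

The gain comes from the switching probability c/(c+x) being strictly decreasing in x: you keep the large amount more often than the small one, whatever c > 0 you pick.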

Quote: jfalk

I think the answer to your question is: "No." Although if you want to be technical about it, you don't actually need integral calculus, since the same logic applies for Riemann integration...You can't fix x before you make the EV calculation, since x represents a point along the distribution: you have to integrate across all x to solve the problem. If there was a distribution for which density(x) = density(2x) for all x, then that step is unneeded, so you leave it out... that's the "divide by zero" part.

I don't dispute a word you said. However, what if Bill Gates were the host and the first envelope has $100? Do you switch? Since Bill Gates can clearly afford $200, I don't think the impossible-integral argument helps much.

I've said this before, but I can't shake the feeling that there is a simpler answer. Suppose somebody asked this before integral calculus and Riemann integration (http://en.wikipedia.org/wiki/Riemann_integration) were invented. Would they have had to just throw up their hands in frustration? I say no. I still say it is a fundamental abuse that confuses the issue, much like the Missing Dollar problem.

Let me pose this question. Suppose I run a mutual fund. In my prospectus I include an independent actuarial certification that says my average growth rate per day was 25%, defined as the average of the daily gains. For example, if I made a 100% profit on day 1 and lost 50% on day 2, then my average daily gain was 25%. Moreover, I can prove I've maintained this average for years. Do you invest? If not, why not?
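The arithmetic in that prospectus can be checked in a few lines (a sketch of my own, using the +100%/-50% example above):

```python
# A fund that alternates +100% and -50% days has an *arithmetic* average
# daily gain of 25%, but the compound (geometric) growth is zero.
daily_returns = [1.00, -0.50] * 5   # ten days: +100%, -50%, repeating

arithmetic_avg = sum(daily_returns) / len(daily_returns)
print(arithmetic_avg)               # 0.25, i.e. the advertised 25% per day

balance = 1.0
for r in daily_returns:
    balance *= (1 + r)
print(balance)                      # 1.0 -- after ten days you have exactly what you started with
```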
It's not whether you win or lose; it's whether or not you had a good bet.
jfalk Joined: Sep 2, 2010
• Posts: 29
September 5th, 2010 at 10:50:05 AM permalink
There is indeed another perspective which eliminates the need for calculus: adopt a traditional frequentist view. Let x be the amount of gain from switching. Then when you switch you either gain x (if you got the small envelope) or lose x (if you had the big envelope). Expected gain: 0. Simple enough? The error in the other approach still comes from thinking of x, the amount in the envelope you hold, as a random variable. If you want to think of it that way, you have to integrate over the distribution of x to solve the problem, and you can't.

To your second question, my answer is the same one the government makes them say: past returns are no guarantee of future results. To make this concrete (and give another Wikipedia reference), look up the Peso Problem: http://en.wikipedia.org/wiki/Taleb_distribution
Wizard Joined: Oct 14, 2009
• Posts: 23444
September 5th, 2010 at 2:16:12 PM permalink
Quote: jfalk

Another way is to adopt a traditional frequentist perspective: x is the amount of gain for switching. Then when you switch you either gain x (if you got the small envelope) or lose x (if you had the big envelope). Expected gain: 0. Simple enough?

It isn't very satisfying to me to say that the flaw in the 1.25x argument is that you should be asking about the expected additional money, not the expected additional rate. If you try to say that to a layman, he will ask why he can't just average the possible rates of change.

I think my investment analogy is apropos. Suppose there were a casino game where you doubled your bet 50% of the time and lost half of it the other 50%. Indeed, the player advantage would be 25%. However, if the dealer cheated and alternated wins and losses, the player advantage would be 0%. So the fair game and the rigged game both approach 50% doubles over time, but one returns 125% and the other 100%. Why? I suggest it is because in the cheating game you are always applying the 200% return to small bets and the -50% to big bets. The same thing happens in the envelope game: you're either doubling off the small amount or halving the big one. For the same reason there is no gain with the cheating dealer, there is no gain with the envelope offer. No calculus required.
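That comparison can be sketched numerically (my own illustration, with the bet parlayed so each result compounds on the previous one):

```python
import random

rng = random.Random(7)

def fair_parlay(rounds):
    """Let it ride: each round the stake doubles or halves with equal chance."""
    stake = 1.0
    for _ in range(rounds):
        stake *= 2.0 if rng.random() < 0.5 else 0.5
    return stake

def rigged_parlay(rounds):
    """The cheating dealer alternates a double with a halving."""
    stake = 1.0
    for i in range(rounds):
        stake *= 2.0 if i % 2 == 0 else 0.5
    return stake

trials = 200_000
fair_avg = sum(fair_parlay(10) for _ in range(trials)) / trials
print(fair_avg)           # close to 1.25**10, about 9.3
print(rigged_parlay(10))  # exactly 1.0: every double lands on a small stake
```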
It's not whether you win or lose; it's whether or not you had a good bet.
weaselman Joined: Jul 11, 2010
• Posts: 2349
September 5th, 2010 at 3:47:10 PM permalink
Quote: jfalk

There is indeed another perspective which eliminates the need for calculus. Another way is to adopt a traditional frequentist perspective: x is the amount of gain for switching. Then when you switch you either gain x (if you got the small envelope) or lose x (if you had the big envelope). Expected gain: 0. Simple enough?

No, you gain x, but you lose only x/2.
The amount in the envelope is indeed a random variable. The problem is that it has an infinite mean. Apparently, decision theory breaks down in that case.
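For the 2^n/3^(n+1) construction mentioned earlier in the thread, the divergence of the mean can be checked directly (my own sketch):

```python
from fractions import Fraction

# Pair n holds (2^n, 2^(n+1)) with probability 2^n/3^(n+1), so a randomly
# chosen envelope from pair n contains (2^n + 2^(n+1))/2 on average.
# Each term of the mean works out to (1/2)*(4/3)^n, so the partial sums
# grow without bound.
def partial_mean(terms):
    return sum(Fraction(2**n, 3**(n+1)) * Fraction(2**n + 2**(n+1), 2)
               for n in range(terms))

for t in (10, 20, 40):
    print(float(partial_mean(t)))  # grows roughly like (4/3)^t
```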
"When two people always agree one of them is unnecessary"
jfalk Joined: Sep 2, 2010
• Posts: 29
September 5th, 2010 at 8:00:23 PM permalink
I agree that if you think of the amount of money in the envelope as a random variable, you have exactly the problem I talked about above and that you cite here (technically it's not an infinite expectation; it's really undefined). But I didn't say x was the amount in the envelope; I said it's the amount you win. The amount you win if you switch and had the small envelope is exactly the amount you lose if you switch and had the big envelope. That's what I'm calling x: an amount equal to half the big envelope, or all of the small one. Since those numbers are the same by definition, I can call them x.
MathExtremist Joined: Aug 31, 2010
• Posts: 6526
September 6th, 2010 at 8:23:55 AM permalink
I think you can highlight the infinite switching problem by reversing the statement. Suppose you know the structure of the envelopes (X and 2X), and I hand you one envelope. Then *I* open the other one, revealing $100. Does keeping your envelope mean you have an EV of $125? If you continue to "keep" the envelope, does it increase in value at the rate of 125% each time you "keep" it? Of course not.

If you and I play a coin-flip game, we both know the outcome distribution (heads/tails). Same with dice, same with cards, etc. That's how we can properly compute the EV for those games. For example, suppose I offered you a bet on a coin flip. You bet $100. Heads wins $100, tails loses $50. Do you have the edge? Of course -- and it's the same edge as has been erroneously computed for the value of the other envelope.

But in the envelope problem as stated, nobody's flipping a coin or using some other 50/50 process to determine the value of what's in the other envelope. That value has already been determined in a 100% deterministic manner. You just don't know what it is, nor do you know the probability distributions for those values.

Let's call X the amount in the small envelope. This is very different from calling X the value in the envelope you (or I) hold, as in the earlier examples, which is what leads to the value of the other envelope being 1.25X. (As above, if you *are* playing a coin-flip game, the other envelope *is* worth 1.25X.)

If X is the value in the small envelope, 2X is the value of the large one. There is no X/2 in this setup. You just don't know which envelope is which, so if you have ordered pairs of envelopes, you have either [X, 2X] or [2X, X]. Because you don't know which is which, the EV of both envelopes is 1.5X. When you pick an envelope and reveal $100, that means you're holding either X or 2X, but you *still* don't know which is which. If you switch, you may get $200 or you may get $50, depending on whether $100 = X or $100 = 2X, but it's not a coin-flip at that point since the other value has already been determined. Intuitively turning "I don't know" into a coin-flip is how you get to the improper 1.25 result.
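A small simulation of this fixed-pair setup (my own sketch; X = 100 is an arbitrary choice) shows that switching neither helps nor hurts:

```python
import random

rng = random.Random(3)
X = 100                  # amount in the small envelope, fixed by the host
pair = [X, 2 * X]

def average(switch, trials=100_000):
    """Average result when you always switch, or never do."""
    total = 0
    for _ in range(trials):
        pick = rng.randrange(2)              # you choose an envelope at random
        total += pair[1 - pick] if switch else pair[pick]
    return total / trials

print(average(switch=False))  # about 150 = 1.5 * X
print(average(switch=True))   # also about 150 -- switching gains nothing
```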
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
Wizard Joined: Oct 14, 2009
• Posts: 23444
September 6th, 2010 at 9:16:44 AM permalink
Quote: MathExtremist

When you pick an envelope and reveal $100, that means you're holding either X or 2X, but you *still* don't know which is which. If you switch, you may get $200 or you may get $50, depending on whether $100 = X or $100 = 2X, but it's not a coin-flip at that point since the other value has already been determined. Intuitively turning "I don't know" into a coin-flip is how you get to the improper 1.25 result.

I'm not saying you're wrong. However, if it isn't 50/50 between $50 and $200, then what are the odds of each? To make the EV of the other envelope equal to $100, we would need pr($50) = 2/3 and pr($200) = 1/3. However, that seems arbitrary.

I think you would agree that before you opened your envelope, your odds of it being the higher one were 50%. Why is it different after you open it? What is the probability the other envelope is $200 now? Just playing the devil's advocate here, mind you.
It's not whether you win or lose; it's whether or not you had a good bet.
MathExtremist Joined: Aug 31, 2010
• Posts: 6526
September 6th, 2010 at 9:36:04 AM permalink
Quote: Wizard

I'm not saying you're wrong. However, if it isn't 50/50 between $50 and $200, then what are the odds of each?

That's precisely the point: it's impossible to know. This isn't a math problem; it's an epistemological one. The problem comes when you try to solve it with math by injecting an assumption (50/50) into the mix. But once you accept that it's impossible to know whether the $100 is the smaller or larger amount, you have no rational basis for switching. Assigning a 50% probability to the other envelope holding $200 is improper: the likelihood of the other envelope containing $200 is either 100% or 0%, and you have no other information.
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
weaselman Joined: Jul 11, 2010