“There are two envelopes in front of you, each containing a non-zero amount of money. You will receive an amount of money equal to the final envelope you choose. You are informed one has twice as much money as the other. You are then allowed to select either envelope. After you select one, and before opening it, you are given the option to change your mind and switch to the other one. You think to yourself that if your envelope has x dollars, there is a 50% chance the other one has x/2 dollars and a 50% chance it has 2x dollars. The expected return, you compute, is .5[.5x + 2x] = 1.25x, which seems like a favorable gamble. Do you switch, and why? Assume you are neither risk averse nor risk prone; in other words, you will take any good gamble and avoid any bad one.”

The solution presented is that there is no point in switching, which is correct. But the explanation of why is rather convoluted, and doesn't really get to the heart of the issue, I think. It discusses extreme scenarios where the entire wealth of the world is involved. But the amount of money involved doesn't matter for showing why switching doesn't increase your expected return. Here's the best explanation I could come up with for why switching doesn't increase your expected return, and why it isn't appropriate to use the formula stated in the problem to determine expected return.

It is not appropriate to apply the “.5[.5x + 2x]=1.25x” formula to determine expected return in this problem, because fundamentally that formula covers 3 possibilities and 3 distinct returns: 1) you don’t switch and get $X; 2) you switch and maybe get $2X; or 3) you switch and maybe get $0.5X.

Whereas in the actual problem there are 4 possibilities but only two distinct returns: 1) you’re holding $X, don’t switch, and get $X; 2) you’re holding $X, switch, and get $2X; 3) you’re holding $2X, don’t switch, and get $2X; or 4) you’re holding $2X, switch, and get $X. So your expected return if you don't switch (covering 1 and 3) is 0.5[X + 2X] = 1.5X, and your expected return if you switch (covering 2 and 4) is 0.5[2X + X] = 1.5X. So there's no point in switching.
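The four-cases argument above is easy to check numerically. Here's a minimal Monte Carlo sketch, assuming the smaller envelope holds an arbitrary amount (say $100) and the initial pick is a fair coin flip:

```python
import random

# Smaller amount is an arbitrary choice; the argument is scale-free.
y = 100
trials = 100_000

def play(switch):
    """One round: shuffle the two envelopes, pick the first, optionally switch."""
    envelopes = [y, 2 * y]
    random.shuffle(envelopes)
    held, other = envelopes
    return other if switch else held

stay = sum(play(switch=False) for _ in range(trials)) / trials
swap = sum(play(switch=True) for _ in range(trials)) / trials

print(stay)  # ~150, i.e. 1.5y
print(swap)  # ~150 as well: switching changes nothing
```

Both averages converge to 1.5 times the smaller amount, matching the 0.5[X + 2X] = 1.5X calculation for either strategy.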

Now, let’s say a different game is played. In this game, a man gives you an envelope and tells you there is $X inside. He then shows you two more envelopes, tells you that one holds $2X and the other holds $0.5X, and says you may switch your envelope for one of the other two. Do you switch?

Yes, because the “.5[.5x + 2x]=1.25x” formula applies in this case – i.e., you can keep what you have and claim $X, switch and maybe claim $0.5X, or switch and maybe claim $2X. So your expected return if you switch is $1.25X. The key difference here is that there are 3 possible returns, not 2 as in the actual problem.
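This second game can be sketched the same way. The assumption here is that the switcher can't tell the $2X and $0.5X envelopes apart, so switching means taking one of the two at random; X = $100 is an arbitrary stand-in:

```python
import random

X = 100          # arbitrary known amount in your envelope
trials = 100_000

def play(switch):
    """Keep $X, or trade it for one of {$0.5X, $2X} chosen blind."""
    if not switch:
        return X
    return random.choice([0.5 * X, 2 * X])

stay = sum(play(False) for _ in range(trials)) / trials
swap = sum(play(True) for _ in range(trials)) / trials

print(stay)  # exactly 100, i.e. X
print(swap)  # ~125, i.e. 1.25X: here switching really does pay
```

Unlike the original game, this one has three distinct reachable returns, so the 1.25X formula is the right expectation for switching.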

What bothers me about your explanation is that once an envelope is in your hand, the amount isn't random any longer. What is in there is what is in there.

Agreed, it isn't random. Once the envelope is in your hand, you either have $X or $2X. Those are the only two possibilities, and there are only two possibilities after switching: $X or $2X. There is no way for all three of 0.5X, X, and 2X to be reachable in this problem. You can only end up with X or 2X (or 0.5X or X, depending on how you define X). Whereas the formula treats all three of 0.5X, X, and 2X as possible outcomes. To me, that's a very simple explanation of why it is flawed. The Caltech guy may have a more complicated explanation that reaches the same conclusion, but that doesn't mean it's the only explanation.

Yes, X can be any arbitrary amount, but what you're holding isn't random – you possess specific information about what is in the two envelopes to start with, which can be applied to determining expected return. That (in my thought process here) removes the concept of "randomness" from the problem.

If a man gave you an envelope with $X in it, then offered you a 2nd envelope to switch with, and told you there was an equal probability that it contained $0.5X or $2X, then using the .5[.5x + 2x]=1.25x formula to determine expected return would be correct. But that is clearly a different problem from the one stated.

Quote:MathGentMaybe the fundamental flaw in using the .5[.5x + 2x]=1.25x formula is that it assigns a single variable "X" to represent (in this particular problem) 2 distinct starting amounts – let's call them Y and 2Y, so we don't confuse variables. It's clear (at least to me) that we have to consider each starting point separately in order to determine the expected return, which is what I showed in the 5th paragraph of my original 7-paragraph post. This formula instead tries to make one variable "X" stand for either of the two starting amounts. The flaw: if X=Y, then the formula states there is a 50% chance the return is 0.5X = 0.5Y, which is impossible in the problem. Similarly, if X=2Y, then the formula states there is a 50% chance the return is 2X = 4Y, which is also impossible in the problem.
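The Y / 2Y analysis can be made concrete by brute-force enumeration. This sketch (with Y = 1 as an arbitrary unit) lists every reachable outcome and confirms the formula's 0.5Y and 4Y branches never occur:

```python
# Enumerate every (starting envelope, decision) pair using the
# distinct amounts Y and 2Y.  Y = 1 is an arbitrary unit.
Y = 1
outcomes = set()
for held, other in [(Y, 2 * Y), (2 * Y, Y)]:
    for switch in (False, True):
        outcomes.add(other if switch else held)

print(outcomes)             # {1, 2}: only Y and 2Y are reachable
print(0.5 * Y in outcomes)  # False: the 0.5X-with-X=Y branch never occurs
print(4 * Y in outcomes)    # False: nor does the 2X-with-X=2Y (= 4Y) branch
```

All four cases land on Y or 2Y, which is exactly why averaging over them gives 1.5Y for both strategies rather than the formula's 1.25X.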

Your logic makes perfect sense to me, and it has from the beginning.

So, if someone says, ".5(.5x + 2x) = 1.25x", and therefore you should switch bets, and you switch bets and end up with x instead of the 2x that you originally had, is it okay to then say, "YOUR MATH F_ING SUCKS"? Just curious if that would be an appropriate response or not :)

Quote:CrystalMathYour logic makes perfect sense to me, and it has from the beginning.

Thanks!

Quote:JyBrd0403So, if someone says, ".5(.5x + 2x) = 1.25x", and therefore you should switch bets, and you switch bets and end up with x instead of the 2x that you originally had, is it okay to then say, "YOUR MATH F_ING SUCKS"? Just curious if that would be an appropriate response or not :)

Not sure about appropriate, but understandable. Like it would be if you bet big on the flop in Hold'em and then lost to runner-runner Aces. :)

Quote:JyBrd0403"Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality" (Nikola Tesla)

I would skip all of the x/X's, which serve both as general multipliers of the amounts in the envelopes and as specific multipliers between the two amounts in the envelopes.

Figure envelopes A and B apart - without reference to a variable x for the amount in one of A, or B - to obtain the expected value of 1.50; figure envelopes A and B together - with reference to a constant X, from which the amounts derive - to obtain the expected value of 1.25.

Figure envelopes A and B apart as figure A and B together - with reference to a constant X for variable x - to lock in the "nutty" expected value of 1.25x/X; figure envelopes A and B together as figure A and B apart - with reference to a variable x for constant X - to lock in the "nutty" expected value of 1.50x/X.

The natural parts even out alongside the "nutty" corresponding parts' duality of referencing apart as together.

I wonder, might the "nutty" be a mechanism for the dual (quantum) "bootstrap" (to pull itself up out of the nothingness)? Not quite there yet.

Haven't really thought any of this through. Offhand, not sure about visualizing the 1.50X part.

When you choose the first envelope, you have a 50/50 chance of choosing either 1x or 2x.

If you switch you will still have the same 50/50 chance of opening either 1x or 2x, so there can be no advantage to switching.

There are only two possible outcomes no matter how many times you switch or shuffle envelopes.

Each possible outcome has an equal probability of occurring.
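That "no matter how many times you switch" point can be sketched too. Assuming each switch simply flips you to the other envelope, the final distribution is the same 50/50 regardless of how many switches k you make:

```python
import random

def final_amount(k, small=1):
    """Pick an envelope at random, then switch exactly k times."""
    envelopes = [small, 2 * small]
    idx = random.randrange(2)  # initial pick is a fair coin flip
    idx = (idx + k) % 2        # each switch flips to the other envelope
    return envelopes[idx]

trials = 100_000
for k in (0, 1, 5):
    mean = sum(final_amount(k) for _ in range(trials)) / trials
    print(k, mean)  # ~1.5 for every k: switching never helps or hurts
```

Since a switch is a deterministic flip of a uniformly random starting pick, the result stays uniform over the two amounts for any k.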