Quote:NareedPerhaps the spirit does, but the letter of the question does not ...

Actually, the expression X!!!... makes no sense mathematically. There is no such thing as doing an operation infinitely many times for integers. To say (11)!!!!!! .... (whatever) = infinity, for example, does not make sense on either side of the = sign.

If one defines F(n) = (11)!!!!...!!! (n factorials) then certainly lim (n->infinity) F(n) = infinity. But, then you need to invoke limits into your expression as well.

And to say a limit equals infinity is just to say that for any N there exists a value M such that for n > M, F(n) > N. The limit definition itself makes no mention of "infinity" except as a notational convenience.

--Dorothy

Quote:DorothyGaleIf one defines F(n) = (11)!!!!...!!! (n factorials) then certainly lim (n->infinity) F(n) = infinity. But, then you need to invoke limits into your expression as well.

You know, back in high school I figured out that, for many exam questions, I could substitute the multiple-choice answers for the given variables, thereby getting a much better grade in math than I deserved.

It rather shows, doesn't it?

The problem is from an old, old, old issue of Discover magazine's brain teasers page. I've no idea how come I remembered it for so many years.

Quote:konceptumMy reasoning for this is based loosely upon Pascal's Wager, and the fact that there is nothing to lose.

It's not a real wager if there is "nothing to lose". E.g., just try to imagine how much time can be wasted worshiping something "up there", instead of just moving on with one's life. (If everyone played his or her own game(s) well, there would be no more global wars, or who knows what else bad; handled with finesse, a skirmish here and there would suffice.)

Most such themes are as much fantasy as reality, in one way or another; there will always be something to win or lose from each.

Quote:weaselmanRight, but that's exactly where the logical flaw leading to the paradox is hidden. It is not possible to create a distribution satisfying such a property for any amount in the original envelope, because you cannot pick a random value from an infinite range such that all values are equally likely to appear.

Try picking any rule you like of distributing the money between the envelopes, and you won't be able to satisfy the requirement that for any amount found in the first envelope, finding double and half that in the other one is equally likely. For any given rule, there will be situations when it is better to switch, and those when it is not.

Someone smarter than me showed that this is not true. There do exist proper prior distributions of values such that it is always correct to switch. The Wikipedia page on this problem suggests one where the envelopes contain the values 2^n and 2^(n+1) with probability 2^n/3^(n+1) for all integer values of n = 0, 1, 2, ...

That is, there is probability 1/3 the envelopes contain 1 and 2, probability 2/9 they contain 2 and 4, and so on. It can be shown you should always switch after looking in the first envelope.
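This distribution can be checked with a short script. Below is a minimal sketch (the function names and the use of exact rational arithmetic are mine, not from the thread): it verifies that the pair probabilities 2^n/3^(n+1) sum to 1, and that for any observed amount 2^k with k >= 1, the conditional expected value of switching is 11/10 of what you hold.

```python
from fractions import Fraction

# Probability that the envelope pair is (2^n, 2^(n+1)), for n = 0, 1, 2, ...
def pair_prob(n):
    return Fraction(2**n, 3**(n + 1))

# The probabilities form a geometric series that sums to 1 (a proper prior).
partial_sum = sum(pair_prob(n) for n in range(200))
assert abs(float(partial_sum) - 1.0) < 1e-12

# If you open an envelope and see 2^k with k >= 1, it is either the larger
# value of pair k-1 or the smaller value of pair k.
def switch_ratio(k):
    p_lower_pair = pair_prob(k - 1)  # other envelope holds 2^(k-1) = x/2
    p_upper_pair = pair_prob(k)      # other envelope holds 2^(k+1) = 2x
    total = p_lower_pair + p_upper_pair
    x = Fraction(2**k)
    ev_switch = (p_lower_pair / total) * (x / 2) + (p_upper_pair / total) * (2 * x)
    return ev_switch / x

# For every k >= 1, switching is worth 11/10 of the amount you hold
# (and for k = 0 you hold the minimum, so switching is a sure doubling).
assert all(switch_ratio(k) == Fraction(11, 10) for k in range(1, 30))
```

Note that the conditional probability the other envelope is larger works out to 2/5, not 1/2, yet switching still has positive expectation because the doubling outweighs the halving.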

Here is my a-ha moment: the flaw in the 125% logic is applying an EV calculation to variables with distributions that effectively have a mean of infinity! The only way it can always be correct to switch is if the weighted average of all possible envelope values is actually infinite. When you apply an EV calculation to a variable with infinite mean, wonky things happen.

I'm not saying that if you switch back and forth, the money in the envelope magically grows. I'm saying that it is POSSIBLE to play a game where it IS always correct to switch, and when you do open that envelope, it could actually contain infinite money. That's why the 125% argument 'fails': in practice there is an upper bound on the amount of money in the envelope, so there is a point when the probability of doubling your money is LESS THAN 50%.

If there really is no upper bound on the amount of money in the envelopes, it doesn't matter if you switch or not, because doubling infinity or halving infinity gives you... the same thing?!

Paradox resolved. Sort of.

Look at the EVs from the point of view that one envelope contains $100 and the other envelope has a 50% chance of containing either $50 or $200.

Possibility 1: 50% chance you choose the envelope with $100

50% chance you will gain $100 by opening the other envelope

50% chance you will lose $50 by opening the other envelope

Total EV weight for Possibility 1:

25% chance you will gain $100 (50% * 50%)

25% chance you will lose $50

Possibility 2: 25% chance that you chose an envelope with $200

100% chance you will lose $100 by opening the other envelope

Total EV weight for Possibility 2:

25% chance you will lose $100 (25% * 100%)

Possibility 3: 25% chance that you will choose an envelope with $50

100% chance you will gain $50 by opening the other envelope

Total EV weight for Possibility 3:

25% chance you will gain $50 (25% * 100%)

The summary of the EVs for all possible cases of switching are:

25% chance you will gain $100

25% chance you will lose $50

25% chance you will lose $100

25% chance you will gain $50

Total EV (in terms of expected gain) for switching envelopes: $0
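The summary above can be tallied in a few lines. A minimal sketch, where the table of outcomes simply transcribes the four cases listed:

```python
# Four equally likely switching outcomes, assuming one envelope is fixed at
# $100 and the other is $50 or $200 with equal chance.
outcomes = [
    (0.25, +100),  # chose the $100 envelope, the other held $200
    (0.25, -50),   # chose the $100 envelope, the other held $50
    (0.25, -100),  # chose the $200 envelope, the other held $100
    (0.25, +50),   # chose the $50 envelope, the other held $100
]
assert sum(p for p, _ in outcomes) == 1.0

# Expected gain from switching is zero.
ev = sum(p * gain for p, gain in outcomes)
assert ev == 0.0
```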

If I DID know there was a 50-50 split of being in either situation, and I DIDN'T look in the envelope (neither of which is a valid assumption in the original problem), your EV calcs are right.

But if I DO look in the envelope and see $100, and have prior knowledge of the distribution, I should still switch.

If you have no knowledge of the distribution, and all you know is that one envelope contains $X, and the other contains either $2X or $.5X, then:

You have a 50% chance of choosing the envelope that has $X. If you are lucky enough to randomly choose this envelope you do, indeed, have an EV of +$.25X in switching.

But, you also have a 50% chance of choosing the envelope that has either $2X or $.5X. If you are unlucky enough to randomly choose this envelope then you have, unfortunately, an EV of -$.25X in switching.

Even after you open the envelope, you have no way of knowing which one you have chosen. For example, if you open the envelope and it contains $600, you have no way of knowing if it was the envelope that could have contained $150 or $600, or if the other envelope might contain $300/$1200.

X = amount in envelope that cannot change

Y = 2*X = possible amount in other envelope (50% probability)

Z = .5X = other possible amount in other envelope (50% probability)

When you open an envelope, you have:

50% chance of finding X

25% chance of finding Y

25% chance of finding Z

As a result, by switching, you have:

25% chance of winding up with: Y - X (you get Y, but give up X)

25% chance of winding up with: Z - X (you get Z, but give up X)

25% chance of winding up with: X - Z (you get X, but give up Z)

25% chance of winding up with: X - Y (you get X, but give up Y)

Total gain/loss = (Y-X) + (Z-X) + (X-Z) + (X-Y) = 0

There is also a corollary to the paradox: assuming X is a positive integer, then in order to have a zero EV, X must also be an even integer.

If you were not assured that X was an even number, then you would receive information just by opening the envelope. If the envelope contained a whole dollar amount, then there is a -.5X EV if you switch the envelope. If the amount in the envelope is a fractional dollar amount (e.g. $3.50), then there is a +.5X EV if you switch the envelope. So, if you knew that X was a whole dollar amount, but you were not assured that X was an even number, you would always keep the envelope if it contains a whole dollar amount, and always switch if it contains a fractional dollar amount.

Quote:dwheatley

Someone smarter than me showed that this is not true. There do exist proper prior distributions of values such that it is always correct to switch. The Wikipedia page on this problem suggests one where the envelopes contain the values 2^n and 2^(n+1) with probability 2^n/3^(n+1) for all integer values of n = 0, 1, 2, ...

I don't see how this means that what I said "is not true". Obviously, the probabilities of finding 2x and x/2 in the second envelope are not equal to each other, and depend on x. For example, if your envelope contains 1, the probability that the other one has 2x is 1, and the probability of x/2 is 0.

For other cases, the probability of x/2 is 3/5, and that of 2x is 2/5 - still not equal.

But, still, this is an impressive example. I did not think of this possibility ...

Quote:DorothyGaleActually, the expression X!!!... makes no sense mathematically.--Dorothy

Yes, it does. It means, "X is coming! RUN!!!"

This is an exception to my previously posted statement that mathematical equations are largely incapable of illustrating abstract concepts.

Generate a random outcome between 0 and 1 to represent a value placed in one of the evelopes (x).

Based on that random outcome create another value that is twice that amount to represent the other envelope (2x).

Generate a second random outcome between 0 and 1 representing the initial choice of the player, where if that outcome is >0.5 the first envelope is chosen and if it is <0.5, the second envelope is chosen.

Subtract the unchosen envelope from the chosen one for the change in wealth.

Loop that a substantial number of times counting each decision and divide the accumulated wealth by the number of decisions.

Should trend to 0.
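The procedure described above might look something like this in Python (a rough Monte Carlo sketch; the trial count and seed are arbitrary choices of mine):

```python
import random

random.seed(42)

trials = 200_000
total_diff = 0.0
for _ in range(trials):
    x = random.random()          # value placed in the first envelope
    envelopes = (x, 2 * x)       # the other envelope holds twice as much
    pick = random.randrange(2)   # fair choice between the two envelopes
    chosen, unchosen = envelopes[pick], envelopes[1 - pick]
    total_diff += chosen - unchosen

average = total_diff / trials
# Each trial's difference is +x or -x with equal probability,
# so the average should trend to 0.
assert abs(average) < 0.01
```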

Seems to me that this would simply be 0.5(x-2x)+0.5(2x-x) = 0.

Note, by the way, that if the distribution is finite, the math works out fine, since you now have values in the envelope in which the probability of 2 times that value is zero. (Any value greater than half the finite limit on the distribution). I leave as an exercise for the reader to show that in this case switching envelopes is a wash.

Quote:jfalk...The reason you can't do an expected value calculation here is that there is no proper (ie integrates to 1) distribution with the feature that the probability of 2x is equal to the probability of x ...

I've read that argument before. You've stated it well, and I agree that is a flaw in the EV=1.25x argument. However, is an understanding of integral calculus necessary to see the light? My gut tells me that there should be a way to debunk the EV argument with just simple algebra.

p.s. Welcome to the forum, I hope you'll stick around (I know this guy, he is very smart).

Quote:WizardI've read that argument before. You've stated it well, and I agree that is a flaw in the EV=1.25x argument. However, is an understanding of integral calculus necessary to see the light? My gut tells me that there should be a way to debunk the EV argument with just simple algebra.

p.s. Welcome to the forum, I hope you'll stick around (I know this guy, he is very smart).

You can also debunk it with third grade arithmetic, or simple common sense, in that if it were possible to add EV by switching, it would also be possible to increase the value of each envelope infinitely by infinitely switching. I would imagine that this intuitive conclusion is mirrored by the GIGO effect that happens when you plug "zero" into one of those complex calculations.

I do have an interesting question (which I am posting on a new thread) that is probably best solved by complex analysis, because the obvious answer seems counterintuitive.

Quote:mkl654321Yes, it does. It means, "X is coming! RUN!!!"

STOP!!!...

--Dorothy

Quote:mkl654321You can also debunk it with third grade arithmetic, or simple common sense, in that if it were possible to add EV by switching, it would also be possible to increase the value of each envelope infinitely by infinitely switching. I would imagine that this intuitive conclusion is mirrored by the GIGO effect that happens when you plug "zero" into one of those complex calculations.

I do have an interesting question (which I am posting on a new thread) that is probably best solved by complex analysis, because the obvious answer seems counterintuitive.

Yes, that makes perfect sense, but it is the easy way out. The question is: where is the flaw in the EV = .5*(2x + 0.5x) = 1.25x argument?

I look forward to sinking my teeth into your complex analysis problem.

Before you open the envelope, pick a random value you would like to win. If the envelope contains at least that amount, keep it; if it doesn't, swap for the second one. There are three scenarios.

1) Both envelopes contain more than your target. You never switch, but this does not affect your overall return.

2) Both envelopes contain less than your target. You always switch, but this does not affect your overall return.

3) One envelope has more than your target and one less. In this case you will (without knowing) keep the 2x envelope if you pick it and swap the x envelope if you pick it (winning 2x every time).

Because some of the time 3) will occur, your overall return on the game should be fractionally better than 3/2x.

Shouldn't it?
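This threshold strategy can be simulated. The sketch below assumes, purely for concreteness, that the smaller amount is uniform on (0, 1) and the random target is uniform on (0, 2); those distributions are my own illustrative assumptions, not part of the puzzle:

```python
import random

random.seed(7)

trials = 200_000
strategy_total = 0.0
baseline_total = 0.0
for _ in range(trials):
    x = random.random()              # smaller amount (assumed uniform)
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    target = random.uniform(0, 2)    # threshold chosen before looking
    chosen, other = envelopes
    kept = chosen if chosen >= target else other
    strategy_total += kept
    baseline_total += chosen         # never switching

# Never switching averages 1.5 times the expected smaller amount;
# the threshold rule should beat it, thanks to scenario 3 above.
assert strategy_total / trials > baseline_total / trials
```

Whenever the target happens to fall between x and 2x, the rule ends with 2x no matter which envelope was picked, which is where the extra return comes from.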

Quote:jfalkThis problem has indeed stumped a lot of people, but the answer is fairly simple. The reason you can't do an expected value calculation here is that there is no proper (ie integrates to 1) distribution with the feature that the probability of 2x is equal to the probability of x except for the somewhat silly distribution which is zero with probability 1.

This is exactly what I thought before, but the discussion in this thread convinced me that this "solution" is wrong.

First of all, the probabilities do not have to be equal. As long as the probability of 2x is more than 1/3, it is advantageous to switch. An example of a distribution like this is putting 2^n and 2^(n+1) into the envelopes with probability 2^n/3^(n+1). These probabilities do add up to 1, and they make the expected value of switching (11/10)x regardless of what the x value is.

An even better argument against your solution is a slight modification to the problem. Suppose you are offered the chance to switch the envelope BEFORE opening it. If you suppose that your envelope contains x dollars, the probability that the other one has 2x is 1/2, regardless of the actual distribution used to stuff the envelopes. So, probability theory tells you to switch the envelope even before opening it.

It looks like decision theory really does break down on this one. It is not a trick, and there is no flaw we are missing here; it's just a case where it does not work.

Quote:MrPogle...Before you open the envelope pick a random value you would like to win. If the envelope contains at least that amount, keep it but if it doesn't swap for the second one. There are three scenarios.

You're absolutely right that such a strategy would result in an expected value greater than the average of the two envelopes. Another such strategy is to switch with probability c/(c+x) where x is the amount in the first envelope. Set c to the amount you think the host would likely put in the average envelope. Granted this can be hard to judge, but picking any c>0 will improve your odds.

Quote:jfalkI think the answer to your question is: "No." Although if you want to be technical about it, you don't actually need integral calculus, since the same logic applies for Riemann integration...You can't fix x before you make the EV calculation, since x represents a point along the distribution: you have to integrate across all x to solve the problem. If there was a distribution for which density(x) = density(2x) for all x, then that step is unneeded, so you leave it out... that's the "divide by zero" part.

I don't dispute a word you said. However, what if Bill Gates was the host and the first envelope has $100. Do you switch? Since Bill Gates can clearly afford $200, I don't think the impossible integral argument helps much.

I've said this before, but I can't shake the feeling that there is a simpler answer. Suppose somebody asked this before integral calculus and Riemann integration (http://en.wikipedia.org/wiki/Riemann_integration) were invented. Would they have had to just throw up their hands in frustration? I say no. I still say it is fundamentally an abuse that confuses the issue, much like the Missing Dollar problem.

Let me pose this question. Suppose I run a mutual fund. In my prospectus I include an independent actuarial certification that says that my average growth rate per day was 25%, defined as the average of the daily gains. For example, if I made 100% profit on day 1 and lost 50% on day 2, then my average daily gain is 25%. However, I can prove I've maintained this average for years. Do you invest? If not, why not?

To your second question, my answer is the same one the government makes them say: past returns are no guarantee of future results. To make this concrete (and give another Wikipedia reference), look up the Peso Problem: http://en.wikipedia.org/wiki/Taleb_distribution

Quote:jfalkAnother way is to adopt a traditional frequentist perspective: x is the amount of gain for switching. Then when you switch you either gain x (if you got the small envelope) or lose x (if you had the big envelope). Expected gain: 0. Simple enough?

It isn't very satisfying to me to say that the flaw in the 1.25x argument is that you should be asking about the expected additional money, not the expected additional rate. If you try to say that to a layman, he will ask why he can't average the possible rates of change.

I think my investment analogy is apropos. Suppose there was a casino game where you doubled your bet 50% of the time, and lost half the other 50%. Indeed, the player advantage would be 25%. However, if the dealer cheated and alternated wins and losses, then the player advantage would be 0%. So a fair game and the rigged game both approach 50% doubles over time, but one returns 125% and the other 100%. Why? I suggest it is because in the cheating game you are always applying the 200% return to small bets, and the -50% to big bets. The same is happening with the envelope game, you're either doubling off the small amount, or halving the big one. For the same reason there is no gain with the cheating dealer, there is no gain with the envelope offer. No calculus required.
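The fair-versus-rigged comparison above reduces to arithmetic on multipliers; a minimal sketch:

```python
# Per-round multipliers: double on a win, lose half on a loss.
win, lose = 2.0, 0.5

# The arithmetic average of the multipliers suggests a 25% edge per round...
assert (win + lose) / 2 == 1.25

# ...but if the dealer alternates wins and losses on a compounding bankroll,
# every win-loss pair multiplies the bankroll by exactly 2 * 0.5 = 1.
bankroll = 100.0
for round_number in range(10):
    bankroll *= win if round_number % 2 == 0 else lose
assert bankroll == 100.0
```

The 200% return is always applied to the small bankroll and the -50% to the big one, so the compounded return is 100% even though the averaged per-round rate is 125%.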

Quote:jfalkThere is indeed another perspective which eliminates the need for calculus. Another way is to adopt a traditional frequentist perspective: x is the amount of gain for switching. Then when you switch you either gain x (if you got the small envelope) or lose x (if you had the big envelope). Expected gain: 0. Simple enough?

No, you gain x, but you lose only x/2.

The amount in the envelope is indeed a random variable. The problem is that it has infinite mean. Apparently, decision theory breaks down in that case.

If you and I play a coin-flip game, we both know the outcome distribution (heads/tails). Same with dice, same with cards, etc. That's how we can properly compute the EV for those games. For example, suppose I offered you a bet on a coin flip. You bet $100. Heads wins $100, tails loses $50. Do you have the edge? Of course -- and it's the same edge as has been erroneously computed for the value of the other envelope.

But in the envelope problem as stated, nobody's flipping a coin or using some other 50/50 process to determine the value of what's in the other envelope. That value has already been determined in a 100% deterministic manner. You just don't know what it is, nor do you know the probability distributions for those values.

Let's call X the amount in the small envelope. This is very different than calling X the value you get (or I get) in the other envelope examples so far, and which leads to the value of the other envelope being 1.25X. (As above, if you *are* playing a coin-flip game, the other envelope *is* worth 1.25X).

If X is the value in the small envelope, 2X is the value of the large one. There is no X/2 in this setup. You just don't know which envelope is which, so if you have ordered pairs of envelopes, you have either [X, 2X] or [2X, X]. Because you don't know which is which, the EV of both envelopes is 1.5X. When you pick an envelope and reveal $100, that means you're holding either X or 2X, but you *still* don't know which is which. If you switch, you may get $200 or you may get $50, depending on whether $100 = X or $100 = 2X, but it's not a coin-flip at that point since the other value has already been determined. Intuitively turning "I don't know" into a coin-flip is how you get to the improper 1.25 result.

Quote:MathExtremistWhen you pick an envelope and reveal $100, that means you're holding either X or 2X, but you *still* don't know which is which. If you switch, you may get $200 or you may get $50, depending on whether $100 = X or $100 = 2X, but it's not a coin-flip at that point since the other value has already been determined. Intuitively turning "I don't know" into a coin-flip is how you get to the improper 1.25 result.

I'm not saying you're wrong. However, if it isn't 50/50 between $50 and $200, then what are the odds of each? To make the EV of the other envelope equal to $100, we would need pr(50) = 2/3 and pr(200) = 1/3. However, that seems arbitrary.

I think you would agree that before you opened your envelope your odds of it being the higher one were 50%. Why is it different after you open it? What is the probability the other envelope is $200 now? Just playing the devil's advocate here, mind you.

Quote:WizardI'm not saying you're wrong. However, if it isn't 50/50 between $50 and $200, then what are the odds of each?

That's precisely the point - it's impossible to know. This isn't a math problem, it's an epistemological one. The problem comes when you try to solve it using math by injecting an assumption (50/50) into the mix. But once you accept that it's impossible to know whether the $100 is the smaller or larger amount, you have no rational basis for switching. Assigning the likelihood of the other envelope holding $200 a 50% probability is improper: the likelihood of the other envelope containing $200 is either 100% or 0%, and you have no other information.

Quote:MathExtremistThat's precisely the point - it's impossible to know.

What if it was possible? Like it was mentioned earlier in this thread (I used to think exactly what you are arguing now, and that argument convinced me that I was wrong), suppose the envelopes are stuffed with 2^n and 2^(n+1) dollars with probability 2^n/3^(n+1). The probability of finding more money in the other envelope is always 2/5, and the expectation of switching is 1.1x.

Quote:weaselmanWhat if it was possible? Like it was mentioned earlier in this thread (I used to think exactly what you are arguing now, and that argument convinced me that I was wrong), suppose the envelopes are stuffed with 2^n and 2^(n+1) dollars with probability 2^n/3^(n+1). The probability of finding more money in the other envelope is always 2/5, and the expectation of switching is 1.1x.

Alternatively, consider the switching decision BEFORE you open the envelope. In that case, the probability of the other one having twice as much as yours is known and it is 50% regardless of the actual original distribution.

Adding additional information as to the distribution of values changes the situation. In the OP, you don't know the prior probability distribution of values, so the additional knowledge that one envelope contains $100 doesn't give you any way to determine the posterior probability. It certainly doesn't make it 50/50, but that's what the assumption is. The fact that you *don't* know how the initial values were distributed is, in fact, information that you have to consider.

Quote:MathExtremistAdding additional information as to the distribution of values changes the situation.

Right. I am just saying that it's the cleaner way to state the paradox. In this formulation, you still have to switch regardless of how much you have found in the envelope, and the "unknown probability" explanation does not work here.

Quote:In the OP, you don't know the prior probability distribution of values, so the additional knowledge that one envelope contains $100 doesn't give you any way to determine the posterior probability.

In this case, since you did not gain any additional information, the probability of the outcome cannot change. Since it was 50% before you opened the envelope, it has to stay 50% after the fact as well.

Quote:weaselmanWhat if it was possible? Like it was mentioned earlier in this thread (I used to think exactly what you are arguing now, and that argument convinced me that I was wrong), suppose the envelopes are stuffed with 2^n and 2^(n+1) dollars with probability 2^n/3^(n+1). The probability of finding more money in the other envelope is always 2/5, and the expectation of switching is 1.1x.

I think jfalk would agree that the problem is absurd because it would require an infinite amount of money. However, I can't shake the feeling that we don't have to blame either situation on infinity. Can someone tell me exactly where this train of logic becomes flawed?

1. Suppose there are two envelopes, each with an unknown amount of money.
2. The larger amount is twice the smaller amount.
3. I pick an envelope, but don't open it.
4. The odds I chose the smaller envelope are 50%.
5. The odds the other envelope has twice the money of mine are 50%.
6. The odds the other envelope has half the money of mine are 50%.
7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
8. The expected money in the other envelope is 1.25 times my envelope.
9. I should switch, because the other envelope has 25% more money, on average.

I say you can safely go through step 7. Why you can't jump to 8 is the question.

Quote:WizardI think jfalk would agree that the problem is absurd because it would require an infinite amount of money. However, I can't shake the feeling that we don't have to blame either situation on infinity. Can someone tell me exactly where this train of logic becomes flawed?

1. Suppose there are two envelopes, each with an unknown amount of money.
2. The larger amount is twice the smaller amount.
3. I pick an envelope, but don't open it.
4. The odds I chose the smaller envelope are 50%.
5. The odds the other envelope has twice the money of mine are 50%.
6. The odds the other envelope has half the money of mine are 50%.
7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
8. The expected money in the other envelope is 1.25 times my envelope.
9. I should switch, because the other envelope has 25% more money, on average.

I say you can safely go through step 7. Why you can't jump to 8 is the question.

No, it's step 4 to 5 that's the problem. Once you've chosen the envelope, the chance that the other envelope has twice the money is not 50%. It's either 100% or 0%. This problem is unlike Monty Hall, where revealing an outcome gives you extra information because you know the value of the outcomes by inspection. That is, a goat is worth less than a car, and you can recognize the goat. Here, you don't know whether the value of any particular envelope is high or low, so opening the envelope can't possibly give you more information. The *only* thing that can give you more information is knowledge of how the value in the first envelope was selected (distributed).

Quote:MathExtremist

No, it's step 4 to 5 that's the problem. Once you've chosen the envelope, the chance that the other envelope has twice the money is not 50%. It's either 100% or 0%.

In Deal or No Deal let's say there are two suitcases left. One has $1,000,000 and one has $1. What is the probability YOUR suitcase has $1,000,000?

Quote:WizardIn Deal or No Deal let's say there are two suitcases left. One has $1,000,000 and one has $1. What is the probability YOUR suitcase has $1,000,000?

That's a great distinction. In Deal or No Deal, you know (a priori) that the range of values in the N suitcases goes between $1 and $1M. Having eliminated all N-2 other possibilities *randomly*, you know (a posteriori) that the suitcase you're holding has either $1M or $1, and therefore there's a 50% chance. (Monty Hall, on the other hand, does not eliminate the other possibilities randomly in all cases.)

In this envelope case, you do not know the range of values for the envelopes. If you did, you'd have extra information by knowing one was $100.

Let's go back to the original assertion, and the opposite but equally-valid assertion:

1) You opened your envelope and found K. EV(Keep) = K. Then EV(Switch) = (K/2 + 2K)/2 = 1.25K. Or, EV(Switch) = 1.25 * EV(Keep).

2) The other envelope was opened to find S. EV(Switch) = S. Then EV(Keep) = (S/2 + 2S)/2 = 1.25S. Or, EV(Keep) = 1.25 * EV(Switch).

The only situations where both of these can be true are if EV(Switch) = EV(Keep) = 0, or if EV(Switch) = EV(Keep) = infinity. jfalk posted a few days back about how there are no *proper* probability distributions that satisfy this condition which integrate to 1, and you asked whether there was a more intuitive way to think about it that didn't require calculus. I think you get there like this:

1) If the EVs actually are infinite, then it's just fine to say each EV = 1.25 * the other EV. There's no paradox, so let's ignore that scenario.

2) Otherwise, the EVs are finite, which means there is a maximal value for an envelope.

3) Given #2, there is also a half-way point in the distribution of envelope values such that, for any envelope containing one of those values, the other value *cannot* be 2X (because if it were, it would be beyond the maximal value).

4) If you hold an envelope containing a value in the upper half of the distribution, you know that EV(Keep) = X and EV(Switch) = X/2. In other words, you know you hold the higher value, and switching will always cut your amount in half.

5) Similarly, if you hold a value in the lower half of the distribution, it may be the larger of the two values in the envelopes, but it is far more likely than 50% to be the smaller of the two.

6) Therefore, EV(Switch) is *not* always 1.25*EV(Keep). Rather, it depends on EV(Keep).

Using the Deal/No Deal example above, this starts to make intuitive sense if you think of the values in the envelopes ranging between $1 and $1M. If you hold an envelope with $800K, you know for a fact that the other one holds $400K. It doesn't matter *what* the range of values is, just that there is one. That knowledge is sufficient to show that some values of K have an EV(Switch) of exactly K/2, which disproves that EV(Switch) = 1.25K for all K.

QED?
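Steps 2 through 6 above can be checked numerically. The sketch below assumes a bounded distribution (smaller amount uniform on [1, 500], my own choice for illustration, so no envelope exceeds 1000) and compares always switching with never switching:

```python
import random

random.seed(11)

trials = 200_000
keep_total = 0.0
switch_total = 0.0
for _ in range(trials):
    small = random.uniform(1, 500)   # smaller amount, bounded above
    envelopes = [small, 2 * small]
    random.shuffle(envelopes)
    keep_total += envelopes[0]       # keep the first envelope
    switch_total += envelopes[1]     # always switch to the other

keep_avg = keep_total / trials
switch_avg = switch_total / trials
# Both strategies average 1.5 times the smaller amount; switching adds nothing.
assert abs(keep_avg - switch_avg) / keep_avg < 0.02
```

With any finite upper bound, the gains from doubling in the lower half of the distribution are exactly offset by the guaranteed halving in the upper half, so always-switching is a wash.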

Quote:WizardI think jfalk would agree that the problem is absurd because it would require an infinite amount of money. However, I can't shake the feeling that we don't have to blame either situation on infinity. Can someone tell me exactly where this train of logic becomes flawed?

1. Suppose there are two envelopes, each with an unknown amount of money.
2. The larger amount is twice the smaller amount.
3. I pick an envelope, but don't open it.
4. The odds I chose the smaller envelope are 50%.
5. The odds the other envelope has twice the money of mine are 50%.
6. The odds the other envelope has half the money of mine are 50%.
7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
8. The expected money in the other envelope is 1.25 times my envelope.
9. I should switch, because the other envelope has 25% more money, on average.

I say you can safely go through step 7. Why you can't jump to 8 is the question.

The problem with this logic is in steps 4, 5 & 6.

How can you have a situation where the odds add up to 150% on the initial choice?

The fact is that you have a 50% chance of choosing the envelope with the fixed amount. You only have a 25% chance of choosing an envelope with the larger amount, and a 25% chance of choosing an envelope with the smaller amount.

The 50% odds of choosing either the smaller or larger amount only come into play IF YOU ASSUME YOU HAVE CHOSEN THE FIXED-AMOUNT ENVELOPE. If you make that assumption, then - of course - your EV is 125% for switching.

Here is the explanation I offered earlier:

X = amount in envelope that cannot change

Y = 2*X = possible amount in other envelope (50% probability)

Z = .5X = other possible amount in other envelope (50% probability)

When you open an envelope, you have:

50% chance of finding X

25% chance of finding Y

25% chance of finding Z

As a result, by switching, you have:

25% chance of winding up with: Y - X (you get Y, but give up X)

25% chance of winding up with: Z - X (you get Z, but give up X)

25% chance of winding up with: X - Z (you get X, but give up Z)

25% chance of winding up with: X - Y (you get X, but give up Y)

Expected gain from switching = 0.25*(Y-X) + 0.25*(Z-X) + 0.25*(X-Z) + 0.25*(X-Y) = 0
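That tally can be verified mechanically. A quick sketch, using an arbitrary X = 100 (any positive X gives the same cancellation):

```python
# Each of the four switch outcomes occurs with probability 1/4.
X = 100.0          # arbitrary positive amount; any value works
Y = 2 * X          # larger alternative amount in the other envelope
Z = 0.5 * X        # smaller alternative amount in the other envelope

outcomes = [Y - X,  # held X, switched to Y
            Z - X,  # held X, switched to Z
            X - Z,  # held Z, switched to X
            X - Y]  # held Y, switched to X

net_ev = sum(0.25 * gain for gain in outcomes)
print(net_ev)  # 0.0 -- the gains and losses cancel exactly
```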

Quote:WizardSorry, but I'm not buying the argument that the logic breaks down at step 4. I think you guys are over-thinking it. It is not abstract to say that I could put $x and $2x dollars in two envelopes. Maybe x=0.01; I can afford it. You then pick one and you DON'T OPEN IT. It seems obvious to me the probability your envelope is the smaller/larger one is 50%. What if I suddenly tell you the amounts are $1 and $2. Does that suddenly change the odds?

No, but that's an entirely different statement than before. If you tell me the amounts are $1 and $2, then the EV of your unknown envelope is $1.50 and so is the EV of the other unknown envelope. That's not paradoxical at all.

The problem arises when you say "one envelope has X while the other has X/2 or 2X with 50% odds each." That statement is not true when X > X_max/2, and as I showed earlier, X_max (the upper bound) must exist for any finite distribution of X.
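A sketch of that claim, assuming (for illustration only) that the smaller amount is drawn uniformly from [1, X_MAX/2]: whenever the value you observe exceeds X_MAX/2, the other envelope can never hold double.

```python
import random

random.seed(2)

X_MAX = 1000.0
doubles = 0      # times the other envelope held double what we saw
observed = 0     # times our envelope held a value in the top half

for _ in range(200_000):
    small = random.uniform(1, X_MAX / 2)
    pair = [small, 2 * small]
    random.shuffle(pair)             # 50/50 pick of an envelope
    mine, other = pair
    if mine > X_MAX / 2:             # only the larger envelope can hold this
        observed += 1
        doubles += (other == 2 * mine)

print(doubles / observed)
```

The printed conditional frequency is 0, not 50%, so the "other envelope is 50/50 half or double" claim fails for the top half of any bounded range.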

Quote:MathExtremistThe problem arises when you say "one envelope has X while the other has X/2 or 2X with 50% odds each." That statement is not true when X > X_max/2, and as I showed earlier, X_max (the upper bound) must exist for any finite distribution of X.

Okay, suppose I have a fair coin and I say "I wrote one positive number on each side. The larger number is twice the smaller number." I then flip the coin. What is the probability it lands on the side with the larger number?

Quote:WizardOkay, suppose I have a fair coin and I say "I wrote one positive number on each side. The larger number is twice the smaller number." I then flip the coin. What is the probability it lands on the side with the larger number?

50%, and it doesn't matter what's written on the coin. But "positive numbers" are infinite...

Quote:MathExtremist50%, and it doesn't matter what's written on the coin. But "positive numbers" are infinite...

Since the "positive numbers" part bothers you, what if I change it to say "I wrote one positive number on each side. The larger number is twice the smaller number. The larger amount is less than the amount of money I own."

Quote:WizardSince the "positive numbers" part bothers you, what if I change it to say "I wrote one positive number on each side. The larger number is twice the smaller number. The larger amount is less than the amount of money I own."

It's a fair coin, so it's still 50% to land on the larger value (on 2X, just for consistency).

Quote:MathExtremistIt's a fair coin, so it's still 50% to land on the larger value (on 2X, just for consistency).

Okay, good, we've come this far. Now suppose instead of writing the amounts on a coin, I write them on two pieces of paper. Then I put each paper in an envelope. Then I glue the envelopes together. Then I throw the set of envelopes off the roof of a 12-story building. After landing, what are the odds that the face-up envelope has the larger (2x) amount of money inside?

Quote:WizardOkay, good, we've come this far. Now suppose instead of writing the amounts on a coin, I write them on two pieces of paper. Then I put each paper in an envelope. Then I glue the envelopes together. Then I throw the set of envelopes off the roof of a 12-story building. After landing, what are the odds that the face-up envelope has the larger (2x) amount of money inside?

I'm not disagreeing that there's a 50% chance of picking the envelope with 2X. I'm disagreeing that that means the *other* envelope has an equal chance (50/50) of being half or double the value of the first one. Going back to your list:

Quote:Wizard5. The odds the other envelope has twice the money of mine is 50%

...is not true for all possible values that can be in your envelope.

Quote:MathExtremistI'm not disagreeing that there's a 50% chance of picking the envelope with 2X. I'm disagreeing that that means the *other* envelope has an equal chance (50/50) of being half or double the value of the first one. Going back to your list: step 5 is not true for all possible values that can be in your envelope.

Good, this narrows down the point of departure some more.

So, if you agree there is a 50% chance you chose the larger (2x) envelope before opening it, then would you agree the odds are 50% that you chose the smaller envelope (x) as well?

Quote:WizardGood, this narrows down the point of departure some more.

So, if you agree there is a 50% chance you chose the larger (2x) envelope before opening it, then would you agree the odds are 50% that you chose the smaller envelope (x) as well?

Of course. Maybe substitute "instead" for "as well", but I know what you meant.

Quote:MathExtremistOf course. Maybe substitute "instead" for "as well", but I know what you meant.

So, I think we can agree that:

1. The probability your envelope is the big one is 50%.

2. The probability your envelope is the small one is 50%.

3. The large envelope has twice the amount of the small envelope.

Now, can we say that if you chose the smaller envelope, then the other one is the larger envelope?

Quote:WizardSo, I think we can agree that:

1. The probability your envelope is the big one is 50%.

2. The probability your envelope is the small one is 50%.

3. The large envelope has twice the amount of the small envelope.

Now, can we say that if you chose the smaller envelope, then the other one is the larger envelope?

Yes.

In order to simulate the paradox, let's say we have 100 coins. On each coin we write "X" on one side and either "2X" or ".5X" on the other side. To properly simulate the true probabilities, 50 of our coins have "2X" on one side, and 50 of our coins have ".5X" on one side. We then put the coins into a jar, reach in and choose one at random.

The coin represents a random set of two envelopes that are presented to us. We then flip the coin to choose the envelope. The envelope we choose is represented by what comes up on the coin.

There is a 50% chance that we will see an "X" as the result.

There is a 50% chance that we will NOT see an "X". In the event we do not see the "X", then there is a 50% chance we will see ".5X", and a 50% chance we will see "2X".

Therefore, the overall probability of what we will see on the coin is:

50% we will see "X"

25% we will see ".5X"

25% we will see "2X"

Now, what is the expected value of not accepting this result, and taking the value of the other side of the coin?

If you are looking at the "X", then it is true that the expected net gain from taking the other side is 25% of X. But this only applies IF THE "X" SIDE IS THE SIDE THAT CAME UP. This is the fundamental source of the flaw: you can't assume that "X" comes up on every flip.

If the ".5X" side comes up (25% chance this will happen), then the EV of taking the other side is 50% of X (turning .5X into X).

If the "2X" side comes up (25% chance this will happen), then the EV of taking the other side is -100% of X (turning 2X into X).

So, overall, you have a 50% chance of an expected gain of 25% of X.

You have a 25% chance of an expected gain of 50% of X.

You have a 25% chance of an expected gain of -100% of X.

Total net EV for all cases: (25% * .5) + (50% * .25) + (-100% * .25) = 0% EV
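The coin-jar model is easy to simulate directly. A minimal sketch, using an arbitrary X = 100:

```python
import random

random.seed(3)

X = 100.0
# 50 coins read "X"/"2X", 50 coins read "X"/".5X"
jar = [(X, 2 * X)] * 50 + [(X, 0.5 * X)] * 50

net_gain = 0.0
trials = 200_000
for _ in range(trials):
    coin = random.choice(jar)           # random pair of envelopes
    up = random.randrange(2)            # which side lands face up
    net_gain += coin[1 - up] - coin[up] # always take the hidden side instead

print(net_gain / trials)  # ~0: switching has no expected value
```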

*******************************

Now, let's look at this in terms of the steps outlined by the Wizard:

4) The odds I chose the smaller envelope is 50%.

5) The odds the other envelope has twice the money of mine is 50%.

6) The odds the other envelope has half the money of mine is 50%.

Taken on their own, these are accurate statements. But they are inconsistent with the wording of the steps that follow them in the sequence. The correct wording should be:

4) The odds I chose the smaller envelope is 50%.

5) The odds I CHOSE the envelope with twice the money is 50%.

6) The odds I CHOSE the envelope with half the money is 50%.

In this context (the correct context), you can't have the odds add up to 150%. The total odds of your choice can only be 100%. The correct assumptions have to be:

4) The odds I chose the smaller envelope is 50%.

5) The odds I CHOSE the envelope with twice the money is 25%.

6) The odds I CHOSE the envelope with half the money is 25%.

The odds of the other envelope being 50% with twice the money, and 50% with half the money, only apply IF the fixed-amount (X) envelope has been chosen, which is only true 50% of the time.

The flaw is in assuming that you chose the fixed-amount envelope 100% of the time, or, in the example of the coin, that "X" is the side that comes up on every flip.