scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 10:46:35 AM permalink
Quote: Wizard

So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope is 50%.
  5. The odds the other envelope has twice the money of mine is 50%.
  6. The odds the other envelope has half the money of mine is 50%.
  7. The expected ratio of the other envelope to my envelope is .5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.



Here's another way to look at it.

The problem, as I see it, is the definition of "mine". If you have chosen the smaller envelope, then "mine" equals the smaller amount. If you have chosen the larger envelope, then "mine" equals the larger amount.

You cannot then turn around and state the EV in terms of "mine" and then assign the same EV to both envelopes.

The correct logic is:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount. We call the smaller amount X and the larger amount 2X.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope (X) is 50%.
  5. The odds I chose the larger envelope (2X) is 50%.
  6. The odds I chose the smaller envelope (X) and the other envelope has 2X is 50%.
  7. The odds I chose the larger envelope (2X) and the other envelope has X is 50%.
  8. The odds I chose the smaller envelope (X) and the other envelope has .5X is 0%.
  9. The odds I chose the larger envelope (2X) and the other envelope has 4X is 0%.
  10. The expected gain of the other envelope over my envelope is (.5 * (+1X)) + (.5 * (-1X)) + (0 * (-.5X)) + (0 * (+2X)) = 0.
  11. The expected money in the other envelope is the same as the expected money in my envelope.
  12. It then makes no difference if I switch, because the other envelope has 0% more money, on average.
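
For what it's worth, a minimal simulation bears this out. The sketch below is only an illustration: X = 100 and the trial count are arbitrary choices, not part of the argument.

# Sketch: simulate the two-envelope game for one fixed pair (X, 2X) and check
# that the average gain from switching is zero, as step 10 concludes.
# X = 100 and 1,000,000 trials are arbitrary choices for illustration.
import random

X = 100
TRIALS = 1_000_000

total_gain = 0
for _ in range(TRIALS):
    mine, other = random.choice([(X, 2 * X), (2 * X, X)])  # 50/50 which envelope I hold
    total_gain += other - mine                              # what switching would gain
print(total_gain / TRIALS)  # hovers around 0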
Prediction is very difficult, especially about the future. - Niels Bohr
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1387
  • Posts: 23298
September 7th, 2010 at 10:55:18 AM permalink
Quote: scotty81

No, we are not agreed. Smaller than what?



My comment, "So you agreed that there is a 50% chance of picking the smaller envelope" was made to MathExtremist, not you. I think the conversations are getting confused.
It's not whether you win or lose; it's whether or not you had a good bet.
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1387
  • Posts: 23298
September 7th, 2010 at 11:00:33 AM permalink
Quote: scotty81

Here's another way to look at it.



I agree, that is the correct way to look at it. I mentioned this correct way, in other words, in my Ask the Wizard answer.

What is hard to explain is this: how can the expected gain on a percentage basis (relative to the first envelope) be 25%, but the expected gain on an absolute basis be 0?

By the way, if we can get to 42 pages, this will become the longest thread in the forum.
It's not whether you win or lose; it's whether or not you had a good bet.
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 11:14:53 AM permalink
Quote: Wizard

I agree, that is the correct way to look at it. I mentioned this correct way, in other words, in my Ask the Wizard answer.

What is hard to explain is this: how can the expected gain on a percentage basis (relative to the first envelope) be 25%, but the expected gain on an absolute basis be 0?

By the way, if we can get to 42 pages, this will become the longest thread in the forum.



It's not hard to explain at all. Look at what you have said. The expected gain - RELATIVE TO THE FIRST ENVELOPE is 25%.

You are then assuming that the expected gain - RELATIVE TO THE SECOND ENVELOPE is also 25%. You are assuming the same EV for both envelopes.

That simply isn't the case.

Look at it in these terms:

The odds you chose the smaller envelope (X) and the other envelope has 2X is 50%.
The odds you chose the smaller envelope (X) and the other envelope has .5X is 0%.

The EV of switching, in terms of the contents of the smaller envelope, is +100% of X.

The odds you chose the larger envelope (2X) and the other envelope has X is 50%.
The odds you chose the larger envelope (2X) and the other envelope has 4X is 0%.

The EV of switching, in terms of the contents of the larger envelope, is -50% of 2X.

You can't then just add +100% and -50% and divide by two. They are apples and oranges: percentages of two different amounts.

However.....

The EV of switching from the larger envelope, restated in terms of the contents of the smaller envelope, is -100% of X.

Now, you can average the two percentages. ((+100%) + (-100%)) / 2 = zero.
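
A small numeric sketch of that accounting, with X as the common yardstick (X = 100 is an arbitrary stand-in for the unknown smaller amount):

# Sketch: express every gain or loss as a fraction of X, the common yardstick,
# instead of as a fraction of whatever happens to be in "mine".
# X = 100 is an arbitrary stand-in for the unknown smaller amount.
X = 100

gain_holding_small = (2 * X - X) / X   # +1.0, i.e. +100% of X
gain_holding_large = (X - 2 * X) / X   # -1.0, i.e. -100% of X

# Each case occurs half the time, so the average gain is zero.
print(0.5 * gain_holding_small + 0.5 * gain_holding_large)  # 0.0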
Prediction is very difficult, especially about the future. - Niels Bohr
MathExtremist
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 7th, 2010 at 12:12:13 PM permalink
Quote: Wizard

So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope is 50%.
  5. The odds the other envelope has twice the money of mine is 50%.
  6. The odds the other envelope has half the money of mine is 50%.
  7. The expected ratio of the other envelope to my envelope is .5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.


"the money of mine" is effectively equivalent to opening the envelope and finding a value in it. It's a variable. Let's call it V, for value. Let's also be consistent and say the small value is X, since that's what I've been using so far.

P(L) is the probability of picking the large envelope, and P(S) is the probability of picking the small one. From step 4, these are both 50%.

But then you pick an envelope with value v. Now, when you compute the expectation in step 7, you're actually using P(L|V=v) and P(S|V=v). Those aren't the same conceptually as P(L) and P(S). They're also not 50% each:

Bayes says P(z|V=v) = P(V=v|z) * P(z) / P(V=v), where z is either S or L depending on whether you're talking about the small or large envelope. P(V=v|S), the chance of seeing value v given that you picked the small envelope, is the same as P(X=v) -- I defined X to be the small value above. Similarly, the chance of seeing value v given that you picked the large envelope, P(V=v|L), is the same as the chance that the small value X is v/2, which is P(X=v/2). So P(L|V=v) = P(X=v/2) * P(L) / P(V=v).

So P(S|V=v) = P(X=v) * P(S) / P(V=v). P(S) is 50% from above. But P(S|V=v) can only be equal to P(S) (and 50%) if P(X=v)/P(V=v) = 1, or in other words if P(X=v) and P(V=v) are equal. Let's assume they are:

P(L|V=v) + P(S|V=v) = 1, which means
(P(X=v/2) * P(L) / P(V=v)) + (P(X=v) * P(S) / P(V=v)) = 1. Multiply through and get
P(X=v/2)*P(L) + P(X=v)*P(S) = P(V=v), and substituting:
P(X=v/2)*50% + P(X=v)*50% = P(V=v). Under our assumption, P(V=v) = P(X=v). That would mean:
P(X=v/2)*50% + P(X=v)*50% = P(X=v). In other words, P(X=v) has to be equal to P(X=v/2), and since v was arbitrary, every possible value for the small envelope has to be exactly as likely as every other. That can only happen with a uniform distribution over an infinite set of values, like we discussed before. And there's no such thing.

Therefore, P(V=v) cannot be equal to P(X=v). Therefore, P(S|V=v) does not equal 50%, and Steps 5/6 are false.

Q.E.D.

The real EV is computed as P(L|V=v)*v/2 + P(S|V=v)*2v, but you don't know what those posterior probabilities actually are so you don't have sufficient information to figure EV.
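
To make the posterior concrete, here is a sketch under an assumed finite prior on the small value X. The uniform prior on 1 through 6 is purely an illustrative assumption; the argument above does not depend on it.

# Sketch: compute P(S | V = v) under an assumed finite prior: X uniform on 1..6
# (an arbitrary illustrative choice). P(S) = P(L) = 1/2, as in step 4.
from fractions import Fraction

p_x = {x: Fraction(1, 6) for x in range(1, 7)}  # P(X = x)

def p_small_given_v(v):
    # Bayes: P(S | V = v) = P(X = v) * P(S) / P(V = v)
    p_v_and_small = Fraction(1, 2) * p_x.get(v, Fraction(0))
    p_v_and_large = Fraction(1, 2) * p_x.get(Fraction(v, 2), Fraction(0))
    total = p_v_and_small + p_v_and_large       # P(V = v)
    return p_v_and_small / total if total else None

for v in (1, 2, 3, 4, 6, 8, 12):
    print(v, p_small_given_v(v))
# v = 1 or 3: posterior is 1 (it must be the small envelope)
# v = 2, 4 or 6: posterior is 1/2
# v = 8 or 12: posterior is 0 (it must be the large envelope)
# So P(S | V = v) is not 50% for every v, which is the point above.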
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
MathExtremist
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 7th, 2010 at 12:34:56 PM permalink
Quote: scotty81

Quote: Wizard

So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope is 50%.
  5. The odds the other envelope has twice the money of mine is 50%.
  6. The odds the other envelope has half the money of mine is 50%.
  7. The expected ratio of the other envelope to my envelope is .5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.



Here's another way to look at it.

The problem, as I see it, is the definition of "mine". If you have chosen the smaller envelope, then "mine" equals the smaller amount. If you have chosen the larger envelope, then "mine" equals the larger amount.

You cannot then turn around and state the EV in terms of "mine" and then assign the same EV to both envelopes.

The correct logic is:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount. We call the smaller amount X and the larger amount 2X.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope (X) is 50%.
  5. The odds I chose the larger envelope (2X) is 50%.
  6. The odds I chose the smaller envelope (X) and the other envelope has 2X is 50%.
  7. The odds I chose the larger envelope (2X) and the other envelope has X is 50%.
  8. The odds I chose the smaller envelope (X) and the other envelope has .5X is 0%.
  9. The odds I chose the larger envelope (2X) and the other envelope has 4X is 0%.
  10. The expected gain of the other envelope over my envelope is (.5 * (+1X)) + (.5 * (-1X)) + (0 * (-.5X)) + (0 * (+2X)) = 0.
  11. The expected money in the other envelope is the same as the expected money in my envelope.
  12. It then makes no difference if I switch, because the other envelope has 0% more money, on average.




At a more basic level, the distribution of values of "mine" isn't uniform, even if the underlying distribution of *small* values is. Consider a small set of envelopes: (1, 2), (2, 4), (3, 6), (4, 8), (5, 10), (6, 12). Small values are uniformly distributed, but the total values aren't. Only 1/4 of the total values you could pick are above the midpoint. That alone should tell you that the chances of picking a particular value, and then having the other envelope be higher or lower than it, aren't 50/50.
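
A sketch that enumerates exactly that six-pair set (reading "the midpoint" as 6.5, the middle of the 1-to-12 range of possible picks):

# Sketch: enumerate the six pairs above and check two things: the values you
# could be holding are not uniformly distributed, and the chance that the other
# envelope is higher depends on which value you hold.
from fractions import Fraction

pairs = [(1, 2), (2, 4), (3, 6), (4, 8), (5, 10), (6, 12)]

# Every (my value, other value) outcome, all equally likely.
outcomes = [(a, b) for a, b in pairs] + [(b, a) for a, b in pairs]
picked = sorted(mine for mine, other in outcomes)

print(picked)  # 2, 4 and 6 appear twice, the rest once: not uniform
print(sum(v > 6.5 for v in picked), "of 12 picks are above the midpoint")  # 3 of 12

for v in sorted(set(picked)):
    cases = [other > mine for mine, other in outcomes if mine == v]
    print(v, Fraction(sum(cases), len(cases)))  # P(other is higher | I hold v)
# 1, 3, 5 -> 1; 2, 4, 6 -> 1/2; 8, 10, 12 -> 0. It is not 50/50 for every value.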

Maybe it's as simple as suggesting that, since X is the small value and you don't know what it is, 50% of the time you have the small envelope and you gain X by switching, while 50% of the time you have the large envelope and you lose X by switching, and overall it's a wash. That's basically what your #10 says, just with fewer lines. :)

The fallacy/paradox comes in when you suggest that 50% of the time you gain Y (where Y isn't the *small* value but the value in your envelope), but the other 50% of the time you only lose Y/2. If that's true, you *do* have the edge.
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 12:44:06 PM permalink
Quote: MathExtremist

Maybe it's as simple as suggesting that, since X is the small value and you don't know what it is, 50% of the time you have small and you gain X by switching, while 50% of the time have large so you lose X by switching, and overall it's a wash. That's basically what your #10 says, just with fewer lines. :)

The fallacy/paradox comes in when you suggest that 50% of the time you gain Y (where Y isn't the *small* value but the value in your envelope), but the other 50% of the time you only lose Y/2. If that's true, you *do* have the edge.



Well said.

The gain/loss must be expressed in constant terms across both envelopes, not in terms of their individual contents.
Prediction is very difficult, especially about the future. - Niels Bohr
MathExtremist
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 7th, 2010 at 12:46:01 PM permalink
Here's looking at it in reverse. We know (intuitively and logically) that the EV of switching or keeping has to be the same. That means if I pick an envelope with value V, I can keep it and have V, or switch and have EV(switch) = V = A*2V + B*V/2, where A+B = 1. That leads to A = 1/3, B = 2/3. In short, the chances of switching to a higher envelope are only 1/3. The question, dear reader, is why?
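
A quick check of that algebra; the numeric V below is just an arbitrary test value.

# Sketch: check that A = 1/3, B = 2/3 solves V = A*2V + B*V/2 with A + B = 1.
# Dividing by V: 2A + B/2 = 1 with B = 1 - A gives 2A + (1 - A)/2 = 1, so A = 1/3.
from fractions import Fraction

A, B = Fraction(1, 3), Fraction(2, 3)
V = 100  # arbitrary test value
print(A * 2 * V + B * V / 2)  # prints 100, i.e. exactly V, so switching is EV-neutral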
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
MathExtremist
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 7th, 2010 at 2:44:32 PM permalink
Quote: MathExtremist

Bayes says ...[snip]


Let me put this into English, or at least try. The assertion in Wizard's step 4 is that, not knowing anything, picking the larger or the smaller of two envelopes is a 50/50 chance. The assertions in steps 5/6, on the other hand, say that once you've picked an envelope with some value in it, the chance that the other envelope is larger or smaller than yours is also 50/50. What the conditional probability calculations demonstrate is that the only way steps 5/6 *can* be a 50/50 chance is if every single possible value for the envelopes is equally probable. The only way *that* happens is if the distribution of those values is both equiprobable and infinite, which can't ever happen (the total probability would sum to infinity, not 1).

Since the distribution, whatever it is, can't be an infinite equiprobable one, the chances of the other envelope being higher or lower than yours cannot be 50/50 for every value you might be holding. Therefore, steps 5/6 are wrong. However, 50/50 is what was used in the EV equation of step 7, so that's wrong too.
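
One concrete way to see it, sketched with an arbitrary bounded set of pairs: whenever you happen to be holding the largest value that could possibly be in play, the other envelope can only be smaller, so the 50/50 claim already fails there.

# Sketch: with a bounded set of pairs (an arbitrary example), whenever you happen
# to hold the largest possible value, the other envelope is never the bigger one.
import random

pairs = [(x, 2 * x) for x in (1, 2, 3, 4, 5)]  # largest possible holding is 10
held_max = other_bigger = 0
for _ in range(100_000):
    a, b = random.choice(pairs)
    mine, other = random.choice([(a, b), (b, a)])
    if mine == 10:
        held_max += 1
        other_bigger += other > mine
print(other_bigger, "out of", held_max)  # always 0 out of roughly 10,000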
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 3:36:27 PM permalink
This problem is, in fact, a perfect application of conditional probability. The probability of what is in the 2nd envelope is conditional on what you find in the 1st envelope. The envelopes do not represent independent events, unlike Roulette, where conditional probability has no place since every outcome is an independent event.

It sucks that this didn't occur to me (to put it in terms of conditional probability).

I guess that's why my Dad was the Math professor of the family (taught college level math for 30 years), and I was just the jerk off son.
Prediction is very difficult, especially about the future. - Niels Bohr
