MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 6th, 2010 at 10:06:23 AM permalink
Quote: weaselman

What if it was possible? As was mentioned earlier in this thread (I used to think exactly what you are arguing now, and that argument is what convinced me I was wrong), suppose the envelopes are stuffed with 2^n and 2^(n+1) dollars with probability 2^n/3^(n+1). Then the probability of finding more money in the other envelope is 2/5 for any amount above the $1 minimum (and 1 at the minimum itself), and the expectation of switching is always at least 1.1x.

Alternatively, consider the switching decision BEFORE you open the envelope. In that case, the probability that the other envelope holds twice as much as yours is known: it is 50%, regardless of the actual original distribution.



Adding information about the distribution of values changes the situation. In the OP, you don't know the prior probability distribution of values, so the additional knowledge that one envelope contains $100 doesn't give you any way to determine the posterior probability. It certainly doesn't make it 50/50, yet that is exactly what the flawed argument assumes. The fact that you *don't* know how the initial values were distributed is, in fact, information that you have to consider.
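
Incidentally, weaselman's numbers check out. Here's a quick Python sketch (my own throwaway code, with his 2^n/3^(n+1) prior restated) that computes the posterior exactly:

    from fractions import Fraction

    # weaselman's prior: the pair (2^n, 2^(n+1)) occurs with
    # probability 2^n / 3^(n+1), for n = 0, 1, 2, ...
    def pair_prob(n):
        return Fraction(2**n, 3**(n + 1))

    # Finding 2^k (k >= 1) in your envelope means you drew either the
    # smaller member of pair n = k or the larger member of pair n = k-1;
    # within a pair, each member is a 50/50 draw.
    for k in (1, 2, 5, 10):
        p_small = pair_prob(k) * Fraction(1, 2)
        p_large = pair_prob(k - 1) * Fraction(1, 2)
        p_other_bigger = p_small / (p_small + p_large)
        ev_ratio = p_other_bigger * 2 + (1 - p_other_bigger) * Fraction(1, 2)
        print(k, p_other_bigger, ev_ratio)   # 2/5 and 11/10 for every k >= 1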
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
September 6th, 2010 at 10:17:01 AM permalink
Quote: MathExtremist

Adding information about the distribution of values changes the situation.


Right. I am just saying that this is the cleaner way to state the paradox. In that formulation, you should still switch no matter how much you find in the envelope, and the "unknown probability" explanation no longer works.

Quote:

In the OP, you don't know the prior probability distribution of values, so the additional knowledge that one envelope contains $100 doesn't give you any way to determine the posterior probability.


In that case, since you gained no additional information, the probability of the outcome cannot change. It was 50% before you opened the envelope, so it has to stay 50% afterward as well.
"When two people always agree one of them is unnecessary"
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1390
  • Posts: 23444
September 6th, 2010 at 10:30:50 AM permalink
Quote: weaselman

What if it was possible? As was mentioned earlier in this thread (I used to think exactly what you are arguing now, and that argument is what convinced me I was wrong), suppose the envelopes are stuffed with 2^n and 2^(n+1) dollars with probability 2^n/3^(n+1). Then the probability of finding more money in the other envelope is 2/5 for any amount above the $1 minimum (and 1 at the minimum itself), and the expectation of switching is always at least 1.1x.



I think jfalk would agree that this problem is absurd because it would require an infinite amount of money. However, I can't shake the feeling that we don't have to blame either situation on infinity. Can someone tell me exactly where this train of logic becomes flawed?

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope are 50%.
  5. The odds the other envelope has twice the money of mine are 50%.
  6. The odds the other envelope has half the money of mine are 50%.
  7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I say you can safely go through step 7. Why you can't jump to 8 is the question.
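
To make it concrete, here's a quick Python simulation of the setup (my own toy code; the x = $100 below is arbitrary):

    import random

    x = 100.0             # the smaller amount; the other envelope holds 2x
    trials = 1_000_000
    ratio_sum = mine_sum = other_sum = 0.0

    for _ in range(trials):
        envelopes = [x, 2 * x]
        random.shuffle(envelopes)
        mine, other = envelopes
        ratio_sum += other / mine   # the ratio from step 7
        mine_sum += mine
        other_sum += other

    print(ratio_sum / trials)   # about 1.25: step 7 holds
    print(mine_sum / trials)    # about 150
    print(other_sum / trials)   # also about 150: step 8's extra 25% never appears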
It's not whether you win or lose; it's whether or not you had a good bet.
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 6th, 2010 at 2:06:35 PM permalink
Quote: Wizard

I think jfalk would agree that this problem is absurd because it would require an infinite amount of money. However, I can't shake the feeling that we don't have to blame either situation on infinity. Can someone tell me exactly where this train of logic becomes flawed?

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope are 50%.
  5. The odds the other envelope has twice the money of mine are 50%.
  6. The odds the other envelope has half the money of mine are 50%.
  7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I say you can safely go through step 7. Why you can't jump to 8 is the question.



No, it's the step from 4 to 5 that's the problem. Once you've chosen the envelope, the chance that the other envelope has twice the money is not 50%. It's either 100% or 0%. This problem is unlike Monty Hall, where revealing an outcome gives you extra information because you can rank the outcomes by inspection: a goat is worth less than a car, and you can recognize the goat. Here, you don't know whether the value in any particular envelope is high or low, so opening the envelope can't possibly give you more information. The *only* thing that can give you more information is knowledge of how the amount in the first envelope was selected (that is, its distribution).
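
To illustrate with made-up numbers (both priors below are purely hypothetical), the same observed $100 yields different posteriors under different priors:

    from fractions import Fraction

    # Two hypothetical priors over the smaller amount s in the pair (s, 2s).
    prior_a = {50: Fraction(1, 2), 100: Fraction(1, 2)}
    prior_b = {50: Fraction(9, 10), 100: Fraction(1, 10)}

    def p_other_is_double(prior, seen=100):
        # Seeing `seen` means you drew the smaller of (seen, 2*seen) or the
        # larger of (seen/2, seen); each draw is 50/50 within a pair.
        p_small = prior.get(seen, Fraction(0)) * Fraction(1, 2)
        p_large = prior.get(seen // 2, Fraction(0)) * Fraction(1, 2)
        return p_small / (p_small + p_large)

    print(p_other_is_double(prior_a))   # 1/2
    print(p_other_is_double(prior_b))   # 1/10: same $100, different answer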
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1390
  • Posts: 23444
September 6th, 2010 at 2:21:45 PM permalink
Quote: MathExtremist


No, it's the step from 4 to 5 that's the problem. Once you've chosen the envelope, the chance that the other envelope has twice the money is not 50%. It's either 100% or 0%.



In Deal or No Deal let's say there are two suitcases left. One has $1,000,000 and one has $1. What is the probability YOUR suitcase has $1,000,000?
It's not whether you win or lose; it's whether or not you had a good bet.
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 6th, 2010 at 7:15:50 PM permalink
Quote: Wizard

In Deal or No Deal let's say there are two suitcases left. One has $1,000,000 and one has $1. What is the probability YOUR suitcase has $1,000,000?



That's a great distinction. In Deal or No Deal, you know (a priori) that the values in the N suitcases range from $1 to $1M. Having eliminated the N-2 other possibilities *randomly*, you know (a posteriori) that the suitcase you're holding has either $1M or $1, and therefore there's a 50% chance of each. (Monty Hall, on the other hand, does not eliminate the other possibilities randomly.)
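
Here's that distinction as a Python sketch (my own simulation, simplified to a single big prize):

    import random

    trials = 500_000

    # Deal or No Deal: n cases, one holds the $1M. You hold case 0, and
    # n-2 of the other cases are opened at random. Keep only the trials
    # where the $1M was NOT revealed along the way.
    def random_elimination(n=26):
        survived = had_million = 0
        for _ in range(trials):
            million_at = random.randrange(n)
            others = list(range(1, n))
            random.shuffle(others)
            opened = others[:-1]      # all but one other case get opened
            if million_at in opened:
                continue              # the $1M was revealed; discard this trial
            survived += 1
            had_million += (million_at == 0)
        return had_million / survived

    # Monty Hall: the host KNOWS where the prize is and never reveals it,
    # so no trials get discarded and your door keeps its original 1/3.
    def monty_hall_stick():
        wins = sum(random.randrange(3) == 0 for _ in range(trials))
        return wins / trials

    print(random_elimination())   # about 0.50
    print(monty_hall_stick())     # about 0.33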

In this envelope case, you do not know the range of values for the envelopes. If you did, you'd have extra information by knowing one was $100.

Let's go back to the original assertion, and the opposite but equally valid assertion:
1) You opened your envelope and found K. EV(Keep) = K. Then EV(Switch) = (K/2 + 2K)/2 = 1.25K. Or, EV(Switch) = 1.25 * EV(Keep).
2) The other envelope was opened to find S. EV(Switch) = S. Then EV(Keep) = (S/2 + 2S)/2 = 1.25S. Or, EV(Keep) = 1.25 * EV(Switch).

The only situations where both of these can be true are EV(Switch) = EV(Keep) = 0 and EV(Switch) = EV(Keep) = infinity. jfalk posted a few days back that no *proper* probability distribution (one that integrates to 1) satisfies this condition, and you asked whether there was a more intuitive way to think about it that didn't require calculus. I think you get there like this:

1) If the EVs actually are infinite, then it's just fine to say each EV = 1.25 * the other EV. There's no paradox, so let's ignore that scenario.
2) Otherwise, the EVs are finite. For the intuitive version of the argument, suppose the distribution of envelope values is also bounded, so there is a maximal value an envelope can hold.
3) Given #2, there is a halfway point in the distribution of envelope values such that, for any envelope holding a value above it, the other envelope *cannot* hold 2X (because 2X would exceed the maximal value).
4) If you hold an envelope with a value above that halfway point, you know that EV(Keep) = X and EV(Switch) = X/2. In other words, you know you hold the higher value, and switching will always cut your amount in half.
5) Similarly, if you hold a value in the lower half of the distribution, it may be the larger of the two values in the envelopes, but on balance it is more likely to be the smaller.
6) Therefore, EV(Switch) is *not* always 1.25*EV(Keep). Rather, it depends on EV(Keep).

Using the Deal/No Deal example above, this starts to make intuitive sense if you think of the values in the envelopes ranging between $1 and $1M. If you hold an envelope with $800K, you know for a fact that the other one holds $400K. It doesn't matter *what* the range of values is, just that there is one. That knowledge is sufficient to show that some values of K have an EV(Switch) of exactly K/2, which disproves that EV(Switch) = 1.25K for all K.
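
To put numbers on it, here's a toy bounded prior in Python (the distribution is made up purely for illustration):

    from fractions import Fraction

    # Hypothetical prior: the smaller amount s is 1, 2, 4, or 8 with equal
    # probability, so envelope values range from 1 up to a maximum of 16.
    prior = {s: Fraction(1, 4) for s in (1, 2, 4, 8)}

    def ev_switch_given(k):
        # Holding k means you drew the smaller of (k, 2k) or the larger of (k/2, k).
        p_small = prior.get(k, Fraction(0)) * Fraction(1, 2)
        p_large = prior.get(k // 2, Fraction(0)) * Fraction(1, 2)
        total = p_small + p_large
        return (p_small * 2 * k + p_large * Fraction(k, 2)) / total

    for k in (1, 2, 4, 8, 16):
        print(k, ev_switch_given(k))
    # Prints 2k at the bottom (k=1), 1.25k in the middle (k=2, 4, 8),
    # and k/2 at the top (k=16): EV(Switch) depends on K.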

QED?
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 6th, 2010 at 8:02:32 PM permalink
Quote: Wizard

I think jfalk would agree that this problem is absurd because it would require an infinite amount of money. However, I can't shake the feeling that we don't have to blame either situation on infinity. Can someone tell me exactly where this train of logic becomes flawed?

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope are 50%.
  5. The odds the other envelope has twice the money of mine are 50%.
  6. The odds the other envelope has half the money of mine are 50%.
  7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I say you can safely go through step 7. Why you can't jump to 8 is the question.



The problem with this logic is in steps 4, 5 & 6.

How can you have a situation where the odds add up to 150% on the initial choice?

The fact is that you have a 50% chance of choosing the envelope with the fixed amount. You only have a 25% chance of choosing an envelope with the larger amount, and a 25% chance of choosing an envelope with the smaller amount.

The 50% odds of choosing either the smaller or larger amount only come into play IF YOU ASSUME YOU HAVE CHOSEN THE FIXED-AMOUNT ENVELOPE. If you make that assumption, then - of course - your EV is 125% for switching.

Here is the explanation I offered earlier:

X = amount in the envelope that cannot change
Y = 2*X = possible amount in the other envelope (50% probability)
Z = 0.5*X = the other possible amount in the other envelope (50% probability)

When you open an envelope, you have:

50% chance of finding X
25% chance of finding Y
25% chance of finding Z

As a result, by switching, you have:

25% chance of winding up with: Y - X (you get Y, but give up X)
25% chance of winding up with: Z - X (you get Z, but give up X)
25% chance of winding up with: X - Z (you get X, but give up Z)
25% chance of winding up with: X - Y (you get X, but give up Y)

Expected gain/loss = 0.25*[(Y-X) + (Z-X) + (X-Z) + (X-Y)] = 0
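
Or in code, with an arbitrary X (the $100 below is just a placeholder):

    X = 100.0
    Y = 2 * X     # possible larger amount in the other envelope
    Z = 0.5 * X   # possible smaller amount in the other envelope

    # The four equally likely switching outcomes listed above:
    outcomes = [Y - X, Z - X, X - Z, X - Y]
    expected_gain = sum(0.25 * g for g in outcomes)
    print(expected_gain)   # 0.0: switching neither gains nor loses on average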
Prediction is very difficult, especially about the future. - Niels Bohr
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1390
  • Posts: 23444
September 6th, 2010 at 8:13:05 PM permalink
Sorry, but I'm not buying the argument that the logic breaks down at step 4. I think you guys are over-thinking it. It is not abstract to say that I could put $x and $2x in two envelopes. Maybe x = 0.01; I can afford it. You then pick one and you DON'T OPEN IT. It seems obvious to me that the probability your envelope is the smaller/larger one is 50%. What if I suddenly tell you the amounts are $1 and $2? Does that change the odds?
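
If anyone wants to check, here's a quick throwaway simulation in Python:

    import random

    trials = 1_000_000
    picked_smaller = 0
    for _ in range(trials):
        envelopes = [1, 2]          # the $1 and $2 envelopes
        random.shuffle(envelopes)
        picked_smaller += (envelopes[0] == 1)
    print(picked_smaller / trials)  # about 0.5, whether or not you know the amounts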
It's not whether you win or lose; it's whether or not you had a good bet.
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 6th, 2010 at 8:24:33 PM permalink
Quote: Wizard

Sorry, but I'm not buying the argument that the logic breaks down at step 4. I think you guys are over-thinking it. It is not abstract to say that I could put $x and $2x in two envelopes. Maybe x = 0.01; I can afford it. You then pick one and you DON'T OPEN IT. It seems obvious to me that the probability your envelope is the smaller/larger one is 50%. What if I suddenly tell you the amounts are $1 and $2? Does that change the odds?


No, but that's an entirely different statement from the one before. If you tell me the amounts are $1 and $2, then the EV of your unknown envelope is $1.50, and so is the EV of the other unknown envelope. That's not paradoxical at all.

The problem arises when you say "one envelope has X while the other has X/2 or 2X with 50% odds each." That statement is not true when X > X_max/2, and as I showed earlier, X_max (the upper bound) must exist for any bounded distribution of X.
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1390
  • Posts: 23444
September 6th, 2010 at 9:22:11 PM permalink
Quote: MathExtremist

The problem arises when you say "one envelope has X while the other has X/2 or 2X with 50% odds each." That statement is not true when X > X_max/2, and as I showed earlier, X_max (the upper bound) must exist for any bounded distribution of X.



Okay, suppose I have a fair coin and I say "I wrote one positive number on each side. The larger number is twice the smaller number." I then flip the coin. What is the probability it lands on the side with the larger number?
It's not whether you win or lose; it's whether or not you had a good bet.
