Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1387
  • Posts: 23298
September 7th, 2010 at 6:30:23 AM permalink
Quote: MathExtremist

Yes.



Good. Now we have:

1. The odds your envelope is the smaller one are 50%.
2. If your envelope is the small one, then the other one must be the large one.
3. The large envelope has twice as much money as the small one.

Can we put 2 and 3 together to say that if you chose the small envelope, then the other one has twice as much as yours?
It's not whether you win or lose; it's whether or not you had a good bet.
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1387
  • Posts: 23298
September 7th, 2010 at 6:40:43 AM permalink
Quote: scotty81


50% we will see "X"
25% we will see ".5X"
25% we will see "2X"



I agree that is a good way to see that something must be amiss with the 1.25 argument. However, what specifically is wrong with it? That is the tough question.

Quote: scotty81

In this context (the correct context), you can't have the odds add up to 150%. The total odds of your choice can only be 100%. The correct assumptions have to be:



I think you know this, but I was trying to provide a false proof, and ask where it goes wrong. It was not a list of mutually exclusive possibilities.
It's not whether you win or lose; it's whether or not you had a good bet.
weaselman
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
September 7th, 2010 at 6:55:32 AM permalink
There is no mistake in the 50/50 logic. The proof is not "false". This is just a case where decision theory does not work.
There are many (well, let's say, several) cases like this in math.
Some famous examples are the barber who shaves everyone who does not shave himself (does he shave himself?), or, equivalently, the set of all sets that do not contain themselves (does it contain itself?), or the set of all possible sets (does it include all the subsets of the set of its own subsets?), or simply the statement "I lie".

The existence of these examples is an illustration of Gödel's incompleteness theorem: it is not possible to create a complete, consistent theory containing elementary arithmetic.
"When two people always agree one of them is unnecessary"
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 7:08:51 AM permalink
Quote: Wizard

I agree that is a good way to see that something must be amiss with the 1.25 argument. However, what specifically is wrong with it? That is the tough question.



What specifically is wrong is that the 1.25 only applies to one of the envelopes. If you happen to be lucky enough to choose that envelope, then your EV is a gain of 25%.

But the argument ignores the possibility of choosing the other envelope, which has an expected gain of -25%.

In fact, the EV is different for each envelope, but you are only formulating the argument for one of them.

You have a 50% chance of choosing the envelope with an EV of 1.25. You also have a 50% chance of choosing the envelope with an EV of only .75.

So, specifically, what is wrong with the 1.25 argument is that it only has a 50% chance of occurring. Equally likely is the possibility of an EV of only .75. The 1.25 argument ignores the fact that this possibility even exists.

The 1.25 argument ASSUMES that both envelopes have the SAME EV. They don't. That's the flaw.
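
For what it's worth, here is a minimal arithmetic check of that claim, assuming the model scotty81 seems to be using: one envelope (A) holds a fixed amount X, the other (B) holds 0.5X or 2X with equal probability, and you are equally likely to have picked either. The stake of 100 is arbitrary and purely for illustration.

```python
# Minimal check, assuming envelope A holds a fixed amount X and envelope B
# holds 0.5*X or 2*X with equal probability (illustrative model, not a given).
X = 100.0  # arbitrary stake; the percentages do not depend on it

# Expected gain from switching out of A (trade X for B's contents):
gain_from_A = 0.5 * (2 * X - X) + 0.5 * (0.5 * X - X)   # = +0.25 * X

# Expected gain from switching out of B (trade B's contents for X):
gain_from_B = 0.5 * (X - 0.5 * X) + 0.5 * (X - 2 * X)   # = -0.25 * X

# You are equally likely to have started with A or B:
overall = 0.5 * gain_from_A + 0.5 * gain_from_B

print(gain_from_A / X)   #  0.25  -> +25% of X
print(gain_from_B / X)   # -0.25  -> -25% of X
print(overall / X)       #  0.0   -> blind switching is neutral
```

Under that model the two conditional gains cancel exactly, which is the sense in which the 1.25 figure only tells half the story.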
Prediction is very difficult, especially about the future. - Niels Bohr
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1387
  • Posts: 23298
September 7th, 2010 at 7:28:01 AM permalink
Quote: scotty81

What specifically is wrong is that the 1.25 only applies to one of the envelopes. If you happen to be lucky enough to choose that envelope, then your EV is a gain of 25%.

But the argument ignores the possibility of choosing the other envelope, which has an expected gain of -25%.



Sorry, but I'm not following you. I'm not sure where to begin. Perhaps you can explain which envelope has an EV of -25% by switching, and how you arrive at that figure.
It's not whether you win or lose; it's whether or not you had a good bet.
MathExtremist
MathExtremist
Joined: Aug 31, 2010
  • Threads: 88
  • Posts: 6526
September 7th, 2010 at 8:03:28 AM permalink
Quote: Wizard

Good. Now we have:

1. The odds your envelope is the smaller one are 50%.
2. If your envelope is the small one, then the other one must be the large one.
3. The large envelope has twice as much money as the small one.

Can we put 2 and 3 together to say that if you chose the small envelope, then the other one has twice as much as yours?


Right, all of this is true.

What we don't know is how the values in the small envelope X are distributed over their range. What's the range of X? Between epsilon and half the money you own, according to your criteria from earlier. So if I pick an envelope with a value outside the range of X, I know it must be the larger of the two envelopes, and then the EV of switching is *not* 1.25X.

What if I don't know the range of X? Well, I still know it's finite, which means that there is at least one value for which EV(switch) < EV(keep). That's sufficient to disprove the unqualified statement that EV(switch) = 1.25*EV(keep).

Also, this entire discussion is based on the unfounded assumption that not only is there no bound on the range of X, but that X is uniformly distributed over that range. If X is *not* uniformly distributed, then there is by definition at least one value Xa with a different probability than another value Xb. Consider the envelope pairs (Xa, 2Xa) and (Xb, 2Xb) where 2Xa = Xb, and suppose Xb is the value found in the "keep" envelope. The statement that EV(switch) = 1.25*EV(keep) is then also false, because the probability that the pair is (Xa, 2Xa) (so Xb is the larger value) is not equal to the probability that the pair is (Xb, 2Xb) (so Xb is the smaller value). That is, it's not 50/50.

For example, suppose the distribution of small values X is:
1, with p = 3/4
2, with p = 1/4.
The distribution of envelope-pairs is then
(1, 2), (1, 2), (1, 2), (2, 4).

You are handed an envelope with a 2 in it. Do you switch? EV(keep) = 2. EV(switch) = (1 * 3/4) + (4 * 1/4) = 1.75, so you keep.
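
As a sanity check on that 1.75 figure, here is a small sketch that enumerates the example exactly: the smaller amount is 1 with probability 3/4 or 2 with probability 1/4, a fair coin decides which envelope you are handed, and we condition on having seen a 2. The variable names are mine, purely for illustration.

```python
from fractions import Fraction as F

# Prior on the smaller amount, per the example above.
prior = {1: F(3, 4), 2: F(1, 4)}

# Enumerate (my value, other value, probability): pick a pair (small, 2*small)
# from the prior, then get handed either envelope with probability 1/2.
outcomes = []
for small, p in prior.items():
    pair = (small, 2 * small)
    for i in (0, 1):
        outcomes.append((pair[i], pair[1 - i], p * F(1, 2)))

# Condition on having been handed a 2.
seen = 2
cond = [(mine, other, p) for mine, other, p in outcomes if mine == seen]
total = sum(p for _, _, p in cond)

ev_keep = sum(mine * p for mine, _, p in cond) / total
ev_switch = sum(other * p for _, other, p in cond) / total

print(ev_keep)    # 2
print(ev_switch)  # 7/4 = 1.75, so keeping beats switching when you see a 2
```

Running the same conditioning with seen = 1 gives the opposite answer (switching is better), and averaging over the whole process, always-switch and always-keep come out identical; the unconditional "1.25 times whatever you hold" never materializes.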

The paradox/fallacy is a combination of several assumptions that are being made. Once the assumptions are exposed and set aside, the paradox vanishes.
"In my own case, when it seemed to me after a long illness that death was close at hand, I found no little solace in playing constantly at dice." -- Girolamo Cardano, 1563
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 9:28:09 AM permalink
Quote: Wizard

Sorry, but I'm not following you. I'm not sure where to begin. Perhaps you can explain which envelope has an EV of -25% by switching, and how you arrive at that figure.



I thought I had, but let me give it another shot.

There is a fundamental difference between the two envelopes. Let's call them envelope A and envelope B. And, let's even resurrect Monty Hall as the guy who puts the contents into the envelopes.

Can we agree that one envelope has X, and the other envelope has .5X or 2X? And that the other envelope has a 50% chance of containing .5X and a 50% chance of containing 2X? These are fundamental assumptions which, if not agreed to, make what follows moot.

Let's now assume that Monty is going to fill the envelopes. He is going to put "X" into envelope A, and either .5X or 2X into envelope B.

Monty's choice of the contents of envelope A is random. He can choose any amount he wants.

However, the contents of envelope B are NOT random. They are dependent upon envelope A. Envelope A can contain anything. Envelope B MUST contain either .5 of A, or 2 times A.

Let's look at the EV for each of these two envelopes.

Envelope A: Envelope A contains X. Switching to Envelope B will get you either .5X, or 2X. The EV gain for switching from Envelope A is 25% of X ((-.5X + 1.0X) / 2). If you know you have selected Envelope A, then you should always switch.

Envelope B: Since there is a dependent relationship between A and B, you can't say that there is a random chance of A containing one value or another. If B contains .5X, then A MUST contain X. If B contains 2X, then A also MUST contain X. Another way to look at this is that if B contains .5X, then there is ZERO chance that A contains .25X (50% of B). If B contains 2X, then there is ZERO chance that A contains 4X.

For envelope B, there is a 50% chance that it will contain .5X, and a 50% chance it will contain 2X. If it contains .5X, then by switching you are trading your .5X for X, thus gaining 50% of X. There is an equally likely chance that envelope B will contain 2X. If you switch in this circumstance, you will be trading 2X for X, thus losing 100% of X.

So, for envelope B, you have a 50% chance of gaining 50% of X, and a 50% chance of losing 100% of X for a "net" expectation of -25% of X. If you know you have envelope B, you should definitely not switch.

Since you don't know whether you have selected envelope A or envelope B, and they have equal and offsetting EVs, it doesn't matter whether you switch or not.

Part of the problem is that you can't express the EV for each envelope in terms of its own contents. The EV must be expressed in terms of a constant X across both envelopes. So, even though envelope B contains either .5X or 2X, the EV must be expressed in terms of the expected gain in relation to the contents of envelope A (X). If you try to express the EV of envelope B in terms of the contents of envelope B, then you won't get offsetting percentages.

I don't know any clearer way to present it.
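
In case a simulation helps, here is a rough Monte Carlo sketch of the setup described above: Monty puts X into envelope A and, on a coin flip, 0.5X or 2X into envelope B; the player, not knowing which is which, either always keeps or always switches. The range X is drawn from is made up purely for illustration; nothing depends on it.

```python
import random

# Monte Carlo sketch of the setup above: A holds X, B holds 0.5*X or 2*X.
TRIALS = 1_000_000
keep_total = 0.0
switch_total = 0.0

for _ in range(TRIALS):
    X = random.uniform(1, 100)   # arbitrary illustrative range for Monty's choice
    A = X
    B = 0.5 * X if random.random() < 0.5 else 2 * X
    # The player picks an envelope at random, not knowing which is A or B.
    mine, other = (A, B) if random.random() < 0.5 else (B, A)
    keep_total += mine
    switch_total += other

print(keep_total / TRIALS)    # average of always keeping
print(switch_total / TRIALS)  # average of always switching; the two agree
```

Both averages come out the same, consistent with the conclusion above that blind switching gains nothing, even though switching out of A alone gains 25% of X on average and switching out of B alone loses 25% of X.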
Prediction is very difficult, especially about the future. - Niels Bohr
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1387
  • Posts: 23298
September 7th, 2010 at 9:34:03 AM permalink
Quote: MathExtremist

Right, all of this is true.



So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope are 50%.
  5. The odds the other envelope has twice as much money as mine are 50%.
  6. The odds the other envelope has half as much money as mine are 50%.
  7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.
It's not whether you win or lose; it's whether or not you had a good bet.
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 10:15:08 AM permalink
Quote: Wizard

So you agreed that there is a 50% chance of picking the smaller envelope



No, we are not agreed. Smaller than what?

Smaller than X?

Well, we have a 50% chance of choosing exactly X, and a 25% chance of choosing an envelope smaller than X.

Smaller than 2 times X?

Well, we have a 75% chance of choosing an envelope smaller than 2 times X.

No, we are not agreed on your basic premise.
Prediction is very difficult, especially about the future. - Niels Bohr
scotty81
scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
September 7th, 2010 at 10:29:20 AM permalink
Quote: Wizard

So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:

  1. Suppose there are two envelopes each with an unknown amount of money.
  2. The larger amount is twice the smaller amount.
  3. I pick an envelope, but don't open it.
  4. The odds I chose the smaller envelope are 50%.
  5. The odds the other envelope has twice as much money as mine are 50%.
  6. The odds the other envelope has half as much money as mine are 50%.
  7. The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
  8. The expected money in the other envelope is 1.25 times my envelope.
  9. I should switch, because the other envelope has 25% more money, on average.


I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.



OK, I see where you are coming from. I'm thinking in terms of X, .5X, and 2X, and you are thinking solely in terms of one envelope being larger than the other.

Give me a few minutes to formulate a response in those terms.
Prediction is very difficult, especially about the future. - Niels Bohr
