Quote: MathExtremist
Yes.
Good. Now we have:
1. The odds your envelope is the smaller one are 50%.
2. If your envelope is the small one, then the other one must be the large one.
3. The large envelope has twice as much money as the small one.
Can we put 2 and 3 together to say that if you chose the small envelope, then the other one has twice as much as yours?
Quote: scotty81
50% we will see "X"
25% we will see ".5X"
25% we will see "2X"
I agree that is a good way to see that something must be amiss with the 1.25 argument. However, what specifically is wrong with it? That is the tough question.
Quote: scotty81
In this context (the correct context), you can't have the odds add up to 150%. The total odds of your choice can only be 100%. The correct assumptions have to be:
I think you know this, but I was trying to provide a false proof, and ask where it goes wrong. It was not a list of mutually exclusive possibilities.
There are many (well, let's say, several) cases like this in math.
Some famous examples are the barber that shaves everybody who does not shave themselves (does he shave himself?), or, equivalently, a set of all sets that do not contain themselves (does it contain itself?), or a set of all possible sets (does it include all the subsets of the set of its own subsets?), or simply a statement "I lie".
The existence of these examples is an illustration of Gödel's incompleteness theorem: it is not possible to create a complete, consistent theory containing elementary arithmetic.
Quote: Wizard
I agree that is a good way to see that something must be amiss with the 1.25 argument. However, what specifically is wrong with it? That is the tough question.
What specifically is wrong is that the 1.25 only applies to one of the envelopes. If you happen to be lucky enough to choose that envelope, then your EV is a gain of 25%.
But the argument ignores the possibility of choosing the other envelope, which has a total expected gain of -25%.
In fact, the EV is different for each envelope, but you are only formulating the argument for one of them.
You have a 50% chance of choosing the envelope with an EV of 1.25. You also have a 50% chance of choosing the envelope with an EV of only .75.
So, specifically, what is wrong with the 1.25 argument is that it only has a 50% chance of occurring. Equally likely is the possibility of an EV of only .75. The 1.25 argument ignores the fact that this possibility even exists.
The 1.25 argument ASSUMES that both envelopes have the SAME EV. They don't. That's the flaw.
Quote: scotty81
What specifically is wrong is that the 1.25 only applies to one of the envelopes. If you happen to be lucky enough to choose that envelope, then your EV is a gain of 25%.
But the argument ignores the possibility of choosing the other envelope, which has a total expected gain of -25%.
Sorry, but I'm not following you. I'm not sure where to begin. Perhaps you can explain which envelope has an EV of -25% by switching, and how you arrive at that figure.
Quote: Wizard
Good. Now we have:
1. The odds your envelope is the smaller one are 50%.
2. If your envelope is the small one, then the other one must be the large one.
3. The large envelope has twice as much money as the small one.
Can we put 2 and 3 together to say that if you chose the small envelope, then the other one has twice as much as yours?
Right, all of this is true.
What we don't know is how the values in the small envelope X are distributed over their range. What's the range of X? Between epsilon and half the money you own, according to your criteria from earlier. So if I pick an envelope with a value outside the range of X, I know it must be the larger of the two envelopes, and then the EV of switching is *not* 1.25X.
What if I don't know the range of X? Well, I still know it's finite, which means that there is at least one value for which EV(switch) < EV(keep). That's sufficient to disprove the unqualified statement that EV(switch) = 1.25*EV(keep).
Also, this entire discussion is based on the unfounded assumption that not only is there no bound on the range of X, but that X is uniformly distributed over that range. If X is *not* uniformly distributed, then there is by definition at least one Xa with a different probability than some Xb. For the envelope pairs (Xa, 2Xa) and (Xb, 2Xb) where 2Xa = Xb and Xb is the value found in the "keep" envelope, the statement that EV(switch) = 1.25*EV(keep) is also false, because p(Xb is the larger value of its pair) != p(Xb is the smaller value of its pair). That is, it's not 50/50.
For example, suppose the distribution of small values X is:
1, with p = 3/4
2, with p = 1/4.
The distribution of envelope-pairs is then
(1, 2), (1, 2), (1, 2), (2, 4).
You are handed an envelope with a 2 in it. Do you switch? EV(keep) = 2. EV(switch) = (1 * 3/4) + (4 * 1/4) = 1.75, so you keep.
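This arithmetic is easy to verify by brute force. Here is a minimal Python sketch (the setup and names are mine, assuming the 3:1 weighting of pairs described above):

```python
from fractions import Fraction

# The example's envelope pairs, weighted 3:1 as described above.
pairs = [(1, 2), (1, 2), (1, 2), (2, 4)]

# Enumerate every equally likely way to end up holding a 2,
# recording what the other envelope holds in each case.
others = []
for small, large in pairs:
    for seen, other in ((small, large), (large, small)):
        if seen == 2:
            others.append(other)

ev_switch = Fraction(sum(others), len(others))
print(ev_switch)  # 7/4 = 1.75 < 2, so you keep, matching the post
```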
The paradox/fallacy is a combination of several assumptions that are being made. Once the assumptions are exposed and set aside, the paradox vanishes.
Quote: Wizard
Sorry, but I'm not following you. I'm not sure where to begin. Perhaps you can explain which envelope has an EV of -25% by switching, and how you arrive at that figure.
I thought I had, but let me give it another shot.
There is a fundamental difference between the two envelopes. Let's call them envelope A and envelope B. And, let's even resurrect Monty Hall as the guy who puts the contents into the envelopes.
Can we agree that one envelope has X, and the other envelope has .5X or 2X? And that the other envelope has a 50% chance of containing .5X and a 50% chance of containing 2X? These are fundamental assumptions which, if not agreed to, make what follows moot.
Let's now assume that Monty is going to fill the envelopes. He is going to put X into envelope A, and either .5X or 2X into envelope B.
Monty's choice of the contents of envelope A is random. He can choose any amount he wants.
However, the contents of envelope B are NOT random. They are dependent upon envelope A. Envelope A can contain anything. Envelope B MUST contain either .5 of A, or 2 times A.
Let's look at the EV for each of these two envelopes.
Envelope A: Envelope A contains X. Switching to Envelope B will get you either .5X, or 2X. The EV gain for switching from Envelope A is 25% of X ((-.5X + 1.0X) / 2). If you know you have selected Envelope A, then you should always switch.
Envelope B: Since there is a dependent relationship between A and B, you can't say that there is a random chance of A containing one value or another. If B contains .5X, then A MUST contain X. If B contains 2X, then A also MUST contain X. Another way to look at this is that if B contains .5X, then there is ZERO chance that A contains .25X (50% of B). If B contains 2X, then there is ZERO chance that A contains 4X.
For envelope B, there is a 50% chance that it will contain .5X, and a 50% chance it will contain 2X. If it contains .5X, then by switching you are trading your .5X for X, thus gaining 50% of X. There is an equally likely chance that envelope B will contain 2X. If you switch in this circumstance, you will be trading 2X for X, thus losing 100% of X.
So, for envelope B, you have a 50% chance of gaining 50% of X, and a 50% chance of losing 100% of X for a "net" expectation of -25% of X. If you know you have envelope B, you should definitely not switch.
Since you don't know whether you have selected envelope A or envelope B, and they have equal and offsetting EVs, it doesn't matter if you switch or not.
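A rough simulation of the setup exactly as described here (a sketch; it assumes Monty fixes envelope A at X and flips a fair coin for B, and the variable names are mine):

```python
import random

random.seed(1)
trials = 10**6
gain_from_a = 0.0  # total gain, in units of X, from switching A -> B
gain_from_b = 0.0  # total gain, in units of X, from switching B -> A

for _ in range(trials):
    a = 1.0                        # envelope A always holds X (take X = 1)
    b = random.choice((0.5, 2.0))  # envelope B holds .5X or 2X, 50/50
    gain_from_a += b - a
    gain_from_b += a - b

print(gain_from_a / trials)  # ~ +0.25: switching from A gains 25% of X
print(gain_from_b / trials)  # ~ -0.25: switching from B loses 25% of X
```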
Part of the problem is that you can't express the EV for each envelope in terms of its own contents. The EV must be expressed in terms of a constant X across both envelopes. So, even though envelope B contains either .5X or 2X, the EV must be expressed in terms of the expected gain in relation to the contents of envelope A (X). If you try to express the EV of envelope B in terms of the contents of envelope B, then you won't get offsetting percentages.
I don't know any clearer way to present it.
Quote: MathExtremist
Right, all of this is true.
So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope are 50%.
- The odds the other envelope has twice the money of mine are 50%.
- The odds the other envelope has half the money of mine are 50%.
- The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
- The expected money in the other envelope is 1.25 times my envelope.
- I should switch, because the other envelope has 25% more money, on average.
I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.
Quote: Wizard
So you agreed that there is a 50% chance of picking the smaller envelope
No, we are not agreed. Smaller than what?
Smaller than X?
Well, we have a 50% chance of choosing exactly X, and a 25% chance of choosing smaller than X.
Smaller than 2 times X?
Well, we have a 75% chance of choosing an envelope smaller than 2 * X.
No, we are not agreed on your basic premise.
Quote: Wizard
So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope are 50%.
- The odds the other envelope has twice the money of mine are 50%.
- The odds the other envelope has half the money of mine are 50%.
- The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
- The expected money in the other envelope is 1.25 times my envelope.
- I should switch, because the other envelope has 25% more money, on average.
I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.
OK. I see where you are coming from. I'm thinking in terms of X, .5X and 2X and you are thinking solely in terms that one envelope is larger than the other.
Give me a few minutes to formulate a response in those terms.
Quote: Wizard
So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope are 50%.
- The odds the other envelope has twice the money of mine are 50%.
- The odds the other envelope has half the money of mine are 50%.
- The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
- The expected money in the other envelope is 1.25 times my envelope.
- I should switch, because the other envelope has 25% more money, on average.
I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.
Here's another way to look at it.
The problem, as I see it, is the definition of "mine". If you have chosen the smaller envelope, then "mine" equals the smaller amount. If you have chosen the larger envelope, then "mine" equals the amount of the larger envelope.
You cannot then turn around and state the EV in terms of "mine" and then assign the same EV to both envelopes.
The correct logic is:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount. We call the smaller amount X and the larger amount 2X.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope (X) are 50%.
- The odds I chose the larger envelope (2X) are 50%.
- The odds I chose the smaller envelope (X) and the other envelope has 2X are 50%.
- The odds I chose the larger envelope (2X) and the other envelope has X are 50%.
- The odds I chose the smaller envelope (X) and the other envelope has .5X are 0%.
- The odds I chose the larger envelope (2X) and the other envelope has 4X are 0%.
- The expected gain of the other envelope over my envelope is (.5 * (+1X)) + (.5 * (-1X)) + (0 * (-.5X)) + (0 * (+2X)) = 0.
- The expected money in the other envelope is, on average, the same as in my envelope.
- It then makes no difference if I switch, because the other envelope has 0% more money, on average.
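As a sanity check on this list, here is a minimal simulation (a sketch; X is fixed at 1 since the gain scales linearly with it):

```python
import random

random.seed(2)
trials = 10**6
x = 1.0  # the smaller amount; the larger is 2x
total_gain = 0.0

for _ in range(trials):
    # 50/50: my envelope is x and the other is 2x, or vice versa.
    mine, other = random.choice(((x, 2 * x), (2 * x, x)))
    total_gain += other - mine

print(total_gain / trials)  # ~ 0: switching neither gains nor loses on average
```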
Quote: scotty81
No, we are not agreed. Smaller than what?
My comment, "So you agreed that there is a 50% chance of picking the smaller envelope" was made to MathExtremist, not you. I think the conversations are getting confused.
Quote: scotty81
Here's another way to look at it.
I agree, that is the correct way to look at it. I mentioned this correct way, in other words, in my Ask the Wizard answer.
What is hard to explain is how the expected gain on a percentage basis (relative to the first envelope) can be 25%, while the expected gain on an absolute basis is 0.
By the way, if we can get to 42 pages, this will become the longest thread in the forum.
Quote: Wizard
I agree, that is the correct way to look at it. I mentioned this correct way, in other words, in my Ask the Wizard answer.
What is hard to explain is how the expected gain on a percentage basis (relative to the first envelope) can be 25%, while the expected gain on an absolute basis is 0.
By the way, if we can get to 42 pages, this will become the longest thread in the forum.
It's not hard to explain at all. Look at what you have said. The expected gain, RELATIVE TO THE FIRST ENVELOPE, is 25%.
You are then assuming that the expected gain, RELATIVE TO THE SECOND ENVELOPE, is also 25%. You are assuming the same EV for both envelopes.
That simply isn't the case.
Look at it in these terms:
The odds you chose the smaller envelope (X) and the other envelope has 2X are 50%.
The odds you chose the smaller envelope (X) and the other envelope has .5X are 0%.
The EV in terms of the contents of the smaller envelope is +100% of X.
The odds you chose the larger envelope (2X) and the other envelope has X are 50%.
The odds you chose the larger envelope (2X) and the other envelope has 4X are 0%.
The EV in terms of the contents of the larger envelope is -50% of 2X.
You can't then average +100 and -50. They are apples and oranges.
However.....
The EV of the larger envelope in terms of the contents of the smaller envelope is -100%.
Now, you can average the two percentages. ((+100%) + (-100%)) / 2 = zero.
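This also answers the Wizard's percentage-versus-absolute question directly: the expected *ratio* of the other envelope to yours really is 1.25, but the expected *dollar* difference is zero, because the 2x ratio occurs when yours is small and the 0.5x ratio occurs when yours is large. A quick sketch (names are mine):

```python
import random

random.seed(3)
trials = 10**6
x = 1.0
ratio_sum = 0.0  # accumulates other / mine
diff_sum = 0.0   # accumulates other - mine

for _ in range(trials):
    mine, other = random.choice(((x, 2 * x), (2 * x, x)))
    ratio_sum += other / mine
    diff_sum += other - mine

print(ratio_sum / trials)  # ~ 1.25: the expected ratio, as in the 1.25 argument
print(diff_sum / trials)   # ~ 0.00: the expected dollar gain from switching
```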
Quote: Wizard
So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope are 50%.
- The odds the other envelope has twice the money of mine are 50%.
- The odds the other envelope has half the money of mine are 50%.
- The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
- The expected money in the other envelope is 1.25 times my envelope.
- I should switch, because the other envelope has 25% more money, on average.
I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.
"the money of mine" is effectively equivalent to opening the envelope and finding a value in it. It's a variable. Let's call it V, for value. Let's also be consistent and say the small value is X, since that's what I've been using so far.
P(L) is the probability of picking the large envelope, and P(S) is the probability of picking the small one. From step 4, these are both 50%.
But then you pick an envelope with value v. Now, when you compute the expectation in step 7, you're actually using P(L|V=v) and P(S|V=v). Those aren't the same conceptually as P(L) and P(S). They're also not 50% each:
Bayes says P(z|V=v) = P(V=v|z) * P(z) / P(V=v), where z is either S or L depending on whether you're talking about the small or large envelope. P(V=v|S), the chance of seeing value v if you know you hold the small envelope, is the same as P(X=v) -- I defined X to be the small value above. Similarly, the chance of seeing value v if you know you hold the large envelope, P(V=v|L), is the same as the chance that the small value X is v/2, namely P(X=v/2). So P(L|V=v) = P(X=v/2) * P(L) / P(V=v).
So P(S|V=v) = P(X=v) * P(S) / P(V=v). P(S) is 50% from above. But P(S|V=v) can only be equal to P(S) (and 50%) if P(X=v)/P(V=v) = 1, or in other words P(X=v) and P(V=v) are equal. Let's assume they are:
P(L|V=v) + P(S|V=v) = 1, which means
(P(X=v/2) * P(L) / P(V=v)) + (P(X=v) * P(S) / P(V=v)) = 1. Multiply through and get
P(X=v/2)*P(L) + P(X=v)*P(S) = P(V=v), and substituting:
P(X=v/2)*50% + P(X=v)*50% = P(V=v). Under our assumption, P(V=v) = P(X=v). That would mean:
P(X=v/2)*50% + P(X=v)*50% = P(X=v). In other words, P(X=v) has to be equal to P(X=v/2). In English, this basically means the probability of any value in the small envelope is equal to the probability of any other value. This can only happen with a uniform distribution over an infinite set of values, like we discussed before. And there's no such thing.
Therefore, P(V=v) cannot be equal to P(X=v). Therefore, P(S|V=v) does not equal 50%, and Steps 5/6 are false.
Q.E.D.
The real EV is computed as P(L|V=v)*v/2 + P(S|V=v)*2v, but you don't know what those posterior probabilities actually are so you don't have sufficient information to figure EV.
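To make this concrete, here is a small sketch applying the Bayes formula above to the toy distribution from earlier in the thread (X = 1 with probability 3/4, X = 2 with probability 1/4; the function name is mine):

```python
from fractions import Fraction as F

# Prior on the small value X, from the earlier example in the thread.
p_x = {1: F(3, 4), 2: F(1, 4)}

def posterior_small(v):
    """P(S | V=v): the chance that the held value v is the small one, via Bayes."""
    num = p_x.get(v, F(0)) * F(1, 2)              # we picked small and X = v
    den = num + p_x.get(F(v, 2), F(0)) * F(1, 2)  # ...or picked large and X = v/2
    return num / den

v = 2
p_s = posterior_small(v)
print(p_s)                                # 1/4 -- not the naive 50%
print(p_s * 2 * v + (1 - p_s) * F(v, 2))  # EV(switch) = 7/4, as computed earlier
```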
Quote: scotty81
Quote: Wizard
So you agreed that there is a 50% chance of picking the smaller envelope, and if you did then the other one must have two times as much. So why do you say my "proof" falls apart at step 5? Here are the steps again:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope are 50%.
- The odds the other envelope has twice the money of mine are 50%.
- The odds the other envelope has half the money of mine are 50%.
- The expected ratio of the other envelope to my envelope is 0.5*2 + 0.5*0.5 = 1.25.
- The expected money in the other envelope is 1.25 times my envelope.
- I should switch, because the other envelope has 25% more money, on average.
I don't disagree with anything you just wrote. However, I do disagree that the proof breaks down at step 5. If A implies B, and B implies C, then A implies C.
Here's another way to look at it.
The problem, as I see it, is the definition of "mine". If you have chosen the smaller envelope, then "mine" equals the smaller amount. If you have chosen the larger envelope, then "mine" equals the amount of the larger envelope.
You cannot then turn around and state the EV in terms of "mine" and then assign the same EV to both envelopes.
The correct logic is:
- Suppose there are two envelopes each with an unknown amount of money.
- The larger amount is twice the smaller amount. We call the smaller amount X and the larger amount 2X.
- I pick an envelope, but don't open it.
- The odds I chose the smaller envelope (X) are 50%.
- The odds I chose the larger envelope (2X) are 50%.
- The odds I chose the smaller envelope (X) and the other envelope has 2X are 50%.
- The odds I chose the larger envelope (2X) and the other envelope has X are 50%.
- The odds I chose the smaller envelope (X) and the other envelope has .5X are 0%.
- The odds I chose the larger envelope (2X) and the other envelope has 4X are 0%.
- The expected gain of the other envelope over my envelope is (.5 * (+1X)) + (.5 * (-1X)) + (0 * (-.5X)) + (0 * (+2X)) = 0.
- The expected money in the other envelope is, on average, the same as in my envelope.
- It then makes no difference if I switch, because the other envelope has 0% more money, on average.
At a more basic level, the distribution of values of "mine" isn't uniform, even if the underlying distribution of *small* values is. Consider a small set of envelopes: (1, 2), (2, 4), (3, 6), (4, 8), (5, 10), (6, 12). Small values are uniformly distributed, but the total values aren't. Only 1/4 of the total values you could pick are above the midpoint. That alone should tell you that the chances of picking a particular value, and then having another be higher/lower than it, aren't 50/50.
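Enumerating those twelve envelope values makes the point immediate (a quick sketch):

```python
pairs = [(1, 2), (2, 4), (3, 6), (4, 8), (5, 10), (6, 12)]
values = sorted(v for pair in pairs for v in pair)  # every value you might hold

midpoint = (values[0] + values[-1]) / 2             # 6.5
above = sum(v > midpoint for v in values)
print(values)                    # [1, 2, 2, 3, 4, 4, 5, 6, 6, 8, 10, 12]
print(above, "of", len(values))  # 3 of 12 values lie above the midpoint
```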
Maybe it's as simple as suggesting that, since X is the small value and you don't know what it is, 50% of the time you have the small one and you gain X by switching, while 50% of the time you have the large one so you lose X by switching, and overall it's a wash. That's basically what your #10 says, just with fewer lines. :)
The fallacy/paradox comes in when you suggest that 50% of the time you gain Y (where Y isn't the *small* value but the value in your envelope), but the other 50% of the time you only lose Y/2. If that's true, you *do* have the edge.
Quote: MathExtremist
Maybe it's as simple as suggesting that, since X is the small value and you don't know what it is, 50% of the time you have the small one and you gain X by switching, while 50% of the time you have the large one so you lose X by switching, and overall it's a wash. That's basically what your #10 says, just with fewer lines. :)
The fallacy/paradox comes in when you suggest that 50% of the time you gain Y (where Y isn't the *small* value but the value in your envelope), but the other 50% of the time you only lose Y/2. If that's true, you *do* have the edge.
Well said.
The gain/loss must be expressed in constant terms across both envelopes, not in terms of their individual contents.
Quote: MathExtremist
Bayes says ...[snip]
Let me put this into English, or at least try. The assertion in Wizard's step 4 is that, not knowing anything, picking the larger or the smaller of two envelopes is a 50/50 chance. The assertions in steps 5/6, on the other hand, say that once you've picked a value for an envelope, the chance that the other envelope is larger or smaller than your envelope is also 50/50. What the conditional probability calculations demonstrate is that the only way steps 5/6 *can* be a 50/50 chance is if the probability of every single possible value for the envelopes is equal. The only way *that* happens is if the distribution of those values is both equiprobable and infinite, which can't ever happen (infinitely many equal positive probabilities sum to infinity, not 1).
Since the distribution, whatever it is, can't be uniform over an infinite set, the chances of having the other envelope be higher or lower than your envelope cannot be 50/50 for every value. Therefore, steps 5/6 are wrong. However, 50/50 is what was used in the EV equation of step 7, so that's wrong too.
It sucks that this didn't occur to me (to put it in terms of conditional probability).
I guess that's why my Dad was the Math professor of the family (taught college level math for 30 years), and I was just the jerk off son.
The real key here is getting past the coin-flip analogy and the intuition that goes with it. Obviously before you flip a coin, the chances of heads/tails are 50/50. But then after you flip it, if the side facing up is an unknown X, the other side is still 50/50 heads/tails. That's not true with the envelopes. Before you pick, the chances of small or large are 50/50. But once you pick an envelope, the chances that the *other* envelope is smaller/larger *than that envelope* are *not* 50/50. That's the assumption that leads to the paradox. The reason it's different is that the range of values for coins is bounded at heads or tails, while the range of values for the envelopes is anything.
Suppose you considered heads = 1 and tails = 2. Then the coin has the same property as the envelopes. However, you know what the coin's range is, so if you get a 2 you keep it and if you get a 1 you switch. For the envelopes the range is unknown to you, so assuming the envelopes behave like the coin (when they don't) gets you to the improper 50/50 figures and the paradox.
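To see how much that range knowledge is worth, here is a sketch comparing blind always-switching with the informed rule for the heads = 1, tails = 2 "coin envelopes" (names are mine):

```python
import random

random.seed(4)
trials = 10**6
always_switch = 0.0
informed = 0.0

for _ in range(trials):
    mine, other = random.choice(((1, 2), (2, 1)))
    always_switch += other                    # blind switching
    informed += other if mine == 1 else mine  # known range: switch only on a 1

print(always_switch / trials)  # ~ 1.5, no better than standing pat
print(informed / trials)       # 2.0: knowing the range guarantees the maximum
```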
Quote: Wizard
By the way, if we can get to 42 pages, this will become the longest thread in the forum.
I love it!
Quote: Wizard
I agree, that is the correct way to look at it. I mentioned this correct way, in other words, in my Ask the Wizard answer.
What is hard to explain is how the expected gain on a percentage basis (relative to the first envelope) can be 25%, while the expected gain on an absolute basis is 0.
By the way, if we can get to 42 pages, this will become the longest thread in the forum.
That's because this is a simple, as in, simplistic problem that is obfuscated by mathematics, when mathematics actually has no place in the discussion. The attempt to bring it in produces a swirling effect where abstractions are batted about, to no purposeful conclusion.
This whole thread seems like a bunch of scientists standing in a circle on a sidewalk, holding a raw egg. They are engaging in a series of discussions about terminal velocity, thickness of the shell, Mohs hardness scale of the concrete, approximate mass/acceleration at 1g of the egg, wind direction and velocity, and the proper height from which to drop the egg. This has consumed hours, when a small child pedals up on a bicycle and inquires what the scientists are doing. "We're trying to figure out what happens when you drop an egg on the sidewalk," one of them replies. The child grabs the egg and drops it on the sidewalk. It breaks. The child shrugs and pedals away. The scientists go back to the university to write research papers.
Chosen envelope: x

Unknown envelope, State 1 (other envelope is the smaller):
- Probability of state: 0.5
- Value in other envelope: x/2
- Weighted value of other envelope: 0.5 * x/2
- Value in all envelopes: 1.5x
- Equivalent magnitude of value once established: 3x / 1.5x = 2

Unknown envelope, State 2 (other envelope is the larger):
- Probability of state: 0.5
- Value in other envelope: 2x
- Weighted value of other envelope: 0.5 * 2x
- Value in all envelopes: 3x
- Equivalent magnitude of value once established: 1.5x / 3x = 0.5
The value of choosing the other envelope should equal x, and both possibilities should contribute.
The value of each possibility should be the weighted value * the equivalent magnitude.
If x is the high envelope (State 1), then 0.5 * x/2 * 2 = 0.5x.
If x is the low envelope (State 2), then 0.5 * x*2 * 0.5 = 0.5x.
The sum of all values should then be x.
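A symbolic check of that accounting (a sketch using sympy, assuming it is available; the labels mirror the table above):

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Weighted value of the other envelope times the "equivalent magnitude".
state1 = sp.Rational(1, 2) * (x / 2) * 2                  # State 1: 0.5x
state2 = sp.Rational(1, 2) * (2 * x) * sp.Rational(1, 2)  # State 2: 0.5x
print(sp.simplify(state1 + state2))                       # x, as claimed
```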
Quote: mkl654321
That's because this is a simple, as in, simplistic problem that is obfuscated by mathematics, when mathematics actually has no place in the discussion. The attempt to bring it in produces a swirling effect where abstractions are batted about, to no purposeful conclusion.
This whole thread seems like a bunch of scientists standing in a circle on a sidewalk, holding a raw egg. They are engaging in a series of discussions about terminal velocity, thickness of the shell, Mohs hardness scale of the concrete, approximate mass/acceleration at 1g of the egg, wind direction and velocity, and the proper height from which to drop the egg. This has consumed hours, when a small child pedals up on a bicycle and inquires what the scientists are doing. "We're trying to figure out what happens when you drop an egg on the sidewalk," one of them replies. The child grabs the egg and drops it on the sidewalk. It breaks. The child shrugs and pedals away. The scientists go back to the university to write research papers.
An amusing story to be sure, but people had been throwing dice for hundreds of years before Pascal and Fermat asked "wait a minute, why do I win making one wager but lose making the other?" It is precisely that sort of question that mathematics answers (and indeed, is often invented to answer). The entirety of the gambling industry owes its very existence to mathematics entering the discussion.
Many times the interesting question isn't what happens but why.
Quote: MathExtremist
An amusing story to be sure, but people had been throwing dice for hundreds of years before Pascal and Fermat asked "wait a minute, why do I win making one wager but lose making the other?" It is precisely that sort of question that mathematics answers (and indeed, is often invented to answer). The entirety of the gambling industry owes its very existence to mathematics entering the discussion.
Many times the interesting question isn't what happens but why.
I don't disagree, but my point was that the "paradox" being discussed in this thread is due to the failure of mathematical terminology. The child knows that the egg will break. Likewise, the child knows that switching envelopes back and forth can't possibly increase the amount in either envelope.
In other words, there are times when mathematics fails to solve a problem, but words suffice easily.
Quote: mkl654321
I don't disagree, but my point was that the "paradox" being discussed in this thread is due to the failure of mathematical terminology. The child knows that the egg will break. Likewise, the child knows that switching envelopes back and forth can't possibly increase the amount in either envelope.
In other words, there are times when mathematics fails to solve a problem, but words suffice easily.
Lol. Have none of youze "geniuses" asked, "What if the math (in this case) is indeed correct?" What if it were possible to switch?
None of us is able, even with "free will", to end up doing something otherwise than that we eventually (or hastily) do. Can't say, "I'm going down this path, but to be tricky in the real reality, down that one." Perhaps that new-found (quantum-relativity?) physical ability would be worth something like an EV of 1.25 against inferior opponents.
Certainly, the original problem states nothing about changing the (overall) values of the envelopes in any way or manner throughout.
Quote: Garnabby
Lol. Have none of youze "geniuses" asked, "What if the math (in this case) is indeed correct?" What if it were possible to switch?
None of us is able, even with "free will", to end up doing something otherwise than that we eventually (or hastily) do. Can't say, "I'm going down this path, but to be tricky in the real reality, down that one." Perhaps that new-found (quantum-relativity?) physical ability would be worth something like an EV of 1.25 against inferior opponents.
Certainly, the original problem states nothing about changing the (overall) values of the envelopes in any way or manner throughout.
In most cases, problem-solving like this relies upon foundational premises or assumptions. For example, the original problem statement said that one envelope had twice as much as the other. That is taken as a given when addressing the problem. If the reality instead was that after one envelope was selected, the money in other envelope was always stolen by thieves, then it would clearly never be correct to switch. Alternately, if the money in the other envelope was always replaced with exactly half the amount in the opened envelope (a property which would still make the original relationship true), it would still never be correct to switch.
Quote: Garnabby
Lol. Have none of youze "geniuses" asked, "What if the math (in this case) is indeed correct?" What if it were possible to switch?
None of us is able, even with "free will", to end up doing something otherwise than that we eventually (or hastily) do. Can't say, "I'm going down this path, but to be tricky in the real reality, down that one." Perhaps that new-found (quantum-relativity?) physical ability would be worth something like an EV of 1.25 against inferior opponents.
Certainly, the original problem states nothing about changing the (overall) values of the envelopes in any way or manner throughout.
We don't have to engage in "what ifs" or other such indulgences in make-believe, "lol", because we assume that the situation being posed will be happening in the real world. In other words, if it WERE possible to increase value by switching, we would not be in this universe, but rather some other one, and in that universe, no one would have posed the question in the first place ("Why, of COURSE you can gain by switching! Everyone knows that!").
Quote: Dorothy
It would appear that by switching the man would have a 50% chance of doubling
his money should the initial envelope be the lesser amount and a 50% chance
of halving it if the initial envelope is the higher amount.
This statement is without foundation, as we don't know what distribution the dollar amounts were drawn from. However, we can make the problem more precise by specifying a distribution.
- For any dollar amount x, let p(x) denote the probability that the larger amount is x. Then, if we open an envelope and find x, we can immediately apply Bayes' Theorem to find that the probability that we drew the larger envelope is p(x)/[p(x) + p(2x)] and the probability we drew the smaller envelope is p(2x)/[p(x) + p(2x)]. Thus, given that we've seen x in one envelope, the expected value of the other envelope is (x/2)p(x)/[p(x) + p(2x)] + 2xp(2x)/[p(x) + p(2x)] = x[(1/2)p(x) + 2p(2x)]/[p(x) + p(2x)], hence we should switch if p(x) < 2p(2x). In general, we can't determine if this is true without knowing x.
- The initial "paradoxical" analysis assumes that, seeing x in one envelope, we are equally likely to see x/2 and 2x in the other envelope. For this to be true regardless of x, p would have to be a constant function. This is impossible, though, as the sum of p(t) over all t must equal 1.
- On the other hand, it is possible to choose a distribution p which gives a seemingly paradoxical result. Wikipedia gives the example p(2^n) = 2^(n-1)/3^n for every positive integer n, and p(x) = 0 otherwise. Here the expected value is infinite before opening any envelope, and switching envelopes doesn't change that. After opening an envelope, we can determine our expected gain by switching, and yes we should always switch, but is that really paradoxical?
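A sketch checking that Wikipedia distribution numerically (the helper name is mine): for every value you might see, the expected value of the other envelope exceeds what you hold, matching the always-switch conclusion above.

```python
from fractions import Fraction as F

def p(x):
    """P(larger amount = x) under the example: p(2^n) = 2^(n-1)/3^n, n >= 1."""
    n = x.bit_length() - 1
    if x < 2 or x != 2 ** n:
        return F(0)
    return F(2 ** (n - 1), 3 ** n)

for k in range(1, 6):
    v = 2 ** k                       # the amount we opened
    denom = p(v) + p(2 * v)
    ev_other = (F(v, 2) * p(v) + 2 * v * p(2 * v)) / denom
    print(v, ev_other, ev_other > v) # always True, since p(v) < 2p(2v) here
```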