“There are two envelopes in front of you, each containing a non-zero amount of money. You will receive an amount of money equal to what is in the final envelope you choose. You are informed one has twice as much money as the other. You are then allowed to select either envelope. After you select one, and before opening it, you are given the option to change your mind and switch to the other one. You think to yourself that if your envelope has x dollars there is a 50% chance the other one has x/2 dollars and a 50% chance it has 2x dollars. The expected return, you compute, is .5[.5x + 2x]=1.25x, which seems like a favorable gamble. Do you switch and why? Assume you are neither risk averse nor risk prone; in other words, you will take any good gamble and avoid any bad one.”
The solution presented is that there is no point in switching, which is correct. But the explanation why is rather convoluted, and doesn't really get to the heart of the issue, I think. It discusses extreme scenarios, where the entire wealth of the world is involved. But it doesn't matter how much money is involved to show why switching doesn't increase your expected return. Here's the best explanation I could come up with re why switching doesn't increase your expected return, and why it isn't appropriate to use the formula stated in the problem to determine expected return.
It is not appropriate to apply the “.5[.5x + 2x]=1.25x” formula to determine expected return in this problem, because fundamentally this formula covers 3 possibilities and 3 distinct returns: 1) You don’t switch and get $X; 2) you switch and maybe get $2X; or 3) you switch and maybe get $0.5X.
Whereas in the actual problem there are 4 possibilities but only two distinct returns: 1) You’re holding $X, don’t switch and get $X; 2) You’re holding $X, switch and get $2X; 3) You’re holding $2X, don’t switch and get $2X; or 4) You’re holding $2X, switch and get $X. So your expected return if you don’t switch (covering 1 and 3) is 0.5[x+2x]=1.5x, and your expected return if you switch (covering 2 and 4) is 0.5[2x+x]=1.5x. So no point in switching.
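Here's a quick Python sketch of that, if anyone wants to check it numerically - the $100 lower amount and the trial count are just arbitrary choices for illustration:

import random

def simulate(trials=100_000, low=100):
    # One envelope holds `low`, the other holds 2*low; you grab one at random.
    keep_total = 0
    switch_total = 0
    for _ in range(trials):
        envelopes = [low, 2 * low]
        random.shuffle(envelopes)
        chosen, other = envelopes
        keep_total += chosen      # strategy 1: never switch
        switch_total += other     # strategy 2: always switch
    return keep_total / trials, switch_total / trials

keep_avg, switch_avg = simulate()
print(keep_avg, switch_avg)   # both come out around 150, i.e. 1.5x

Both strategies average 1.5x; the 1.25x figure never shows up.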
Now, let’s say a different game is played. In this game, a man gives you an envelope and tells you there is $X inside. He then shows you two more envelopes, tells you that one holds $2X and the other holds $0.5X, and says you may switch your envelope for one of the other two. Do you switch?
Yes, because the formula “.5[.5x + 2x]=1.25x” applies in this case – i.e. you can keep what you have and claim $X, or switch and maybe claim $0.5X, or switch and maybe claim $2X. So your expected return if you switch is $1.25X. The key difference here is that there are 3 possible returns, not 2 as in the actual problem.
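A quick check of that variant, assuming the envelope you were handed holds $100:

X = 100                                      # assumed amount in the envelope you were handed
keep = X                                     # keep what you have
switch = 0.5 * (0.5 * X) + 0.5 * (2 * X)     # 50/50 between the $0.5X and $2X envelopes
print(keep, switch)                          # 100 125.0 - here switching really is worth 1.25X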
What bothers me about your explanation is that once an envelope is in your hand, the amount isn't random any longer. What is in there is what is in there.
Agreed, it isn't random. Once the envelope is in your hand, you either have $X or $2X. Those are the only two possibilities. And there are only two possibilities after switching - $X or $2X. There is no way to end up with all three of 0.5X, X, and 2X in this problem. You can only end up with X or 2X (or, 0.5X or X, depending on how you define X). Whereas, ending up with 0.5X, X, or 2X is possible when using the formula. To me, that's a very simple explanation for why it is flawed. The Cal Tech guy may have a more complicated explanation to come to the same conclusion, but that doesn't mean it's the only explanation.
Yes, X can be any arbitrary amount, but what you're holding isn't random - you possess specific information about what is in the two envelopes to start with, that can be applied to determining expected return. Which (in my thought process here) removes the concept of "randomness" from the problem.
If a man gave you an envelope with $X in it, and then offered you a 2nd envelope to switch with, and told you there was an equal probability that the envelope contained $0.5X or $2X, then using the .5[.5x + 2x]=1.25x formula to determine expected return is correct. But that is clearly a different problem from the stated one.
Quote: MathGentMaybe the fundamental flaw in using the .5[.5x + 2x]=1.25x formula is that it assigns a single variable "X" to represent (in this particular problem) 2 distinct starting points - let's call them Y and 2Y, so we don't confuse variables. It's clear (at least to me) that we have to consider each starting point separately in order to determine the expected return. Which is what I showed in the 5th paragraph of my original 7-paragraph post. Whereas this formula tries to assign one variable "X" to represent either of the two starting points. The flaw occurs because if X=Y, then the formula states that there is a 50% chance that the return is 0.5X = 0.5Y, which is impossible in the problem. Similarly, if X=2Y, then the formula states that there is a 50% chance that the return is 2X=4Y, which is also impossible in the problem.
Your logic makes perfect sense to me, and it has from the beginning.
So, if someone says, ".5(.5x + 2x)=1.25x", and therefore you should switch bets, and you switch bets and you end up with x, instead of the 2x that you originally had, is it okay to then say, "YOUR MATH F_ING SUCKS"? Just curious if that would be an appropriate response or not :)
Quote: CrystalMathYour logic makes perfect sense to me, and it has from the beginning.
Thanks!
Quote: JyBrd0403So, if someone says, ".5(.5x + 2x)=1.25x", and therefore you should switch bets, and you switch bets and you end up with x, instead of the 2x that you originally had, is it okay to then say, "YOUR MATH F_ING SUCKS"? Just curious if that would be an appropriate response or not :)
Not sure about appropriate, but understandable. Like it would be if you bet big on the flop in Hold'em, and then lost on runner Aces. :)
Quote: JyBrd0403"Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality" (Nikola Tesla)
I would skip all of the x/X's which are general multipliers of the amounts in the envelopes, and specific multipliers between the two amounts in the envelopes.
Figure envelopes A and B apart - without reference to a variable x for the amount in one of A, or B - to obtain the expected value of 1.50; figure envelopes A and B together - with reference to a constant X, from which the amounts derive - to obtain the expected value of 1.25.
Figure envelopes A and B apart as figure A and B together - with reference to a constant X for variable x - to lock in the "nutty" expected value of 1.25x/X; figure envelopes A and B together as figure A and B apart - with reference to a variable x for constant X - to lock in the "nutty" expected value of 1.50x/X.
The natural parts even out alongside the "nutty" corresponding parts' duality of referencing apart as together.
I wonder, might the "nutty" be a mechanism for the dual (quantum) "bootstrap" (to pull itself up out of the nothingness)? Not quite there yet.
Haven't really thought any of this through. Offhand, not sure about visualizing the 1.50X part.
There is no advantage in switching:
When you choose the first envelope you get a 50/50 chance of choosing either 1x or 2x.
If you switch you will still have the same 50/50 chance of opening either 1x or 2x, so there can be no advantage to switching.
There are only two possible outcomes no matter how many times you switch or shuffle envelopes.
Each possible outcome has an equal probability of occurring.
Quote: skrbornevryminThere is no advantage in switching:
When you choose the first envelope you get a 50/50 chance of choosing either 1x or 2x.
If you switch you will still have the same 50/50 chance of opening either 1x or 2x, so there can be no advantage to switching.
There are only two possible outcomes no matter how many times you switch or shuffle envelopes.
I agree. I think the person who thought of this problem was searching for some Monty Hall type of mathematical edge by switching. Here there is none.
Quote: MathGentMaybe the fundamental flaw in using the .5[.5x + 2x]=1.25x formula is that it assigns a single variable "X" to represent (in this particular problem) 2 distinct starting points
My original answer to the problem argued exactly that, and people with better math skills than me wrote in to say that I was wrong.
Let me ask the problem this way: let's say that it is Bill Gates stuffing the envelopes. You're told that the two envelopes contain x and 2x. You pick one and OPEN it. Let's say it is $100. Do you switch?
Quote: WizardQuote: MathGentMaybe the fundamental flaw in using the .5[.5x + 2x]=1.25x formula is that it assigns a single variable "X" to represent (in this particular problem) 2 distinct starting points
My original answer to the problem argued exactly that, and people with better math skills than me wrote in to say that I was wrong.
Let me ask the problem this way: let's say that it is Bill Gates stuffing the envelopes. You're told that the two envelopes contain x and 2x. You pick one and OPEN it. Let's say it is $100. Do you switch?
I say it doesn't matter really. Whether it's Bill Gates or not, the only result at this point is the other envelope has $200 or $50. Neither is Bill Gates-type money. Who stuffed it at this point doesn't matter.
Now if you asked me before opening one of two sets of envelopes, would I like to try it with Bill Gates envelopes or my local Jack in the Box cook, I'd say there'd be an advantage to taking Bill Gates.
ZCore13
Quote: WizardQuote: MathGentMaybe the fundamental flaw in using the .5[.5x + 2x]=1.25x formula is that it assigns a single variable "X" to represent (in this particular problem) 2 distinct starting points
My original answer to the problem argued exactly that, and people with better math skills than me wrote in to say that I was wrong.
Let me ask the problem this way: let's say that it is Bill Gates stuffing the envelopes. You're told that the two envelopes contain x and 2x. You pick one and OPEN it. Let's say it is $100. Do you switch?
I think I would try to use some deductive reasoning as to whether the other envelope held $50 or $200. For instance, if the $100 envelope was in small bills, maybe 5 $20's, then I would be more inclined to keep it; however, if it was a crisp $100 bill, it might indicate to me that he took it from a stack of $100s and the other envelope might have 2 of them.
If you have $100 in your envelope, mathematically speaking, the other envelope has $200 (2x) or $50 (.5x) in it. So, ($200 + $50) / 2 = $125; on average there's $125 in the second envelope.
The only problem is that in reality there's an actual, real amount, let's say $200, in the second envelope; that's it. If you do the experiment every time with $100 in one envelope and $200 in the other, you'll never get that $125 average. In order for that to happen you would have to do the experiment sometimes using $100 - $200, and other times using $100 - $50, just like the math does.
The reality is something different. There's always, 100% of the time, $200 in envelope B. You don't get $50 half the time, and $200 the other half of the time. So the math showing you have a better chance of winning more money by switching is not based in reality. It's based on math which says envelope 2 has $50 in it half the time.
If you actually do the experiment with $100 - $200 you can see it immediately, there is never $50. The math is taking into account $50 that doesn't exist.
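Here's a quick Python sketch of exactly that experiment - $100 in one envelope, $200 in the other (amounts assumed just for illustration) - showing what switching actually pays, conditioned on what you're holding:

import random

trials = 100_000
after_switch_from_100 = []   # what you end up with if you held $100 and switched
after_switch_from_200 = []   # what you end up with if you held $200 and switched

for _ in range(trials):
    envelopes = [100, 200]
    random.shuffle(envelopes)
    held, other = envelopes
    if held == 100:
        after_switch_from_100.append(other)
    else:
        after_switch_from_200.append(other)

print(sum(after_switch_from_100) / len(after_switch_from_100))   # always 200 - never the 125 the formula predicts
print(sum(after_switch_from_200) / len(after_switch_from_200))   # always 100 - never 250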
If you look at the problem from reality, the second envelope either has more money in it or it doesn't. That's 50/50. The end.
Quote: JyBrd0403If you look at the problem from reality, the second envelope either has more money in it or it doesn't. That's 50/50. The end.
So, you would switch then, because if it were 50/50, the other one would have, on average, 25% more money. Correct?
Quote: Zcore13I say it doesn't matter really. Whether it's Bill Gates or not, the only result at this point is the other envelope has $200 or $50. Neither is Bill Gates-type money. Who stuffed it at this point doesn't matter.
I think it does matter. If the Jack in the Box fry cook stuffed the envelopes, I'd take the $100, because I'm not confident he has more than $200 in wealth.
To make another point, let's say that I blindfold you, flip a coin, put it under a cup, and then remove your blindfold. I promise you that if you can guess the face-up side under the cup, then I'll give you $100, and if you're wrong, you give me $50. What is your advantage? Assume you have no way to randomize your prediction.
I would say "yes," but I think other mathematicians, especially frequentists, would object, saying the coin face is not random, thus you can't ascribe probabilities to it.
Quote: WizardSo, you would switch then, because if it were 50/50, the other one would have, on average, 25% more money. Correct?
No, it doesn't matter if you switch or not. On your first pick you had a 50% chance of getting x and a 50% chance of getting 2x. If you switch you have a 50% chance of getting x and a 50% chance of getting 2x.
The Tesla point is that there is NOT 25% more money on average in envelope 2. There's only x ($100) and 2x ($200); there is NO .5x ($50). .5x does not exist in reality. There's no envelope containing $50 at any time in the experiment, only in the math. So, the math is obviously flawed here.
The math should show the chances for envelope 2 as .5(x + 2x): choosing envelope 2, there's a 50% chance of it being $100 and a 50% chance of it being $200. The flawed math shows .5(.5x + 2x). The reason it's flawed is that .5x ($50) doesn't exist in the experiment. There is never an envelope with $50 in it. So .5($50+$200) is obviously flawed. Which becomes extremely clear when you stick $100 in envelope 1 and stick $200 in envelope 2 and do the experiment.
Quote: WizardLet me ask the problem this way: let's say that it is Bill Gates stuffing the envelopes. You're told that the two envelopes contain x and 2x. You pick one and OPEN it. Let's say it is $100. Do you switch?
Yes, I suppose I would. But not by applying the .5[.5x + 2x]=1.25x formula (given where I am now in my thought process of this problem). I'd do it by assigning a probability P to the envelope stuffer that he'd put $300 of his own money into the two envelopes rather than $150.
For Bill Gates, I'd assign P=0.49 (not 0.5, as in the formula, because it would be incorrect, I think, to assign equal probability to the greater-amount case no matter who stuffs them - Bill Gates just gets us closest to 0.5 :) ). So my expected return if I switched would be 0.49*200 + 0.51*50 = 123.5. So I'd switch.
Whereas for Jack in the Box Cook, I'd assign P=0.05. Then my expected take would be 0.05*200 + 0.95*50 = 57.5. So I wouldn't switch.
The break even point would be for P=1/3. If I was told that a random American stuffed the envelopes, I don't think I'd switch; I'd assume the mean value of P must be < 1/3, given how the wealth in America is distributed. :)
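A quick sketch of that break-even calculation, with the $100 revealed amount assumed:

revealed = 100   # amount found in the opened envelope

def ev_switch(p_double):
    # EV of switching if the other envelope holds 2*revealed with probability p_double, else revealed/2.
    return p_double * 2 * revealed + (1 - p_double) * 0.5 * revealed

for p in (0.49, 0.05, 1/3):
    print(round(p, 2), round(ev_switch(p), 1))
# 0.49 -> 123.5 (switch), 0.05 -> 57.5 (don't), 0.33 -> 100.0 (indifferent)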
I understand the point of your question though - it, and this whole discussion, is quite thought-provoking!
Quote: WizardTo make another point, let's say that I blindfold you, flip a coin, put it under a cup, and then remove your blindfold. I promise you that if you can guess the face-up side under the cup, then I'll give you $100, and if you're wrong, you give me $50. What is your advantage? Assume you have no way to randomize your prediction.
I would say "yes," but I think other mathematicians, especially frequentists, would object, saying the coin face is not random, thus you can't ascribe probabilities to it.
I'd say yes too, in a heartbeat, especially if I could play, say, 100 times. Do you happen to know any frequentists who aren't particularly attached to their money? :)
Even if the flipper didn't actually flip the coin, but was allowed to select which face showed up under the cup each time (basing his decision on my guessing pattern up to that point), I'd STILL play with that 2:1 ratio of payouts. If it were $51 vs. $50 though, I probably wouldn't.
I don't know anything about "frequentists", but this sentence in the Wikipedia article:
"Frequentist inference has been associated with the frequentist interpretation of probability, specifically that any given experiment can be considered as one of an infinite sequence of possible repetitions of the same experiment, each capable of producing statistically independent results."
... doesn't seem to preclude ascribing probabilities to this problem. Rather, it seems like this method is applied when it isn't obvious how to assign the probability of a particular experiment outcome.
Quote: WizardMy original answer to the problem argued exactly that, and people with better math skills than me wrote in to say that I was wrong.
So they said you were wrong about "why" switching doesn't increase your return, and then provided a more complex way to reach the same conclusion? That in and of itself doesn't disprove your reasoning, it only shows that there is a different way to reach the same conclusion, doesn't it?
I'd love to see an example where using your/our concepts fails to arrive at the proper conclusion, but using their method does (re a problem comparable to the one posed).
Quote: MathGentI'd do it by assigning a probability P to the envelope stuffer that he'd put $300 of his own money into the two envelopes rather than $150.
This got me thinking ... barring any other information, maybe the right thing to do is to assign P to be the ratio of the smaller of the two possible envelope sums (150 in this case) to the sum of the smaller and larger possible sums (450 in this case). That is, P:(1-P) = S1:S2, where S1 is the smaller of the two possible envelope sums and S2 is the larger of the two possible envelope sums.
So for this problem, P=150/450 = 1/3. So return = 1/3*200 + 2/3*50 = 100, so no point in switching.
If the problem was that the envelopes contained either x or 4x, and you opened one containing $100, then the return = 1/5*400 + 4/5*25 = 100, so no point in switching.
It makes sense to me to assign the ratio of the two probabilities = ratio of the two possible envelope sums, regardless of the actual amounts.
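A small sketch of that rule, with an assumed $100 in hand and a few multipliers tried - under this assignment of P, the expected return from switching always comes back to the amount you're holding:

revealed = 100                               # assumed amount in the opened envelope

for m in (2, 4, 10):                         # "the other envelope holds m*x or x/m"
    s1 = revealed + revealed / m             # smaller possible envelope sum
    s2 = revealed + revealed * m             # larger possible envelope sum
    p = s1 / (s1 + s2)                       # probability assigned to the larger-sum case
    ev_switch = p * revealed * m + (1 - p) * revealed / m
    print(m, round(p, 4), round(ev_switch, 2))   # EV of switching is 100.0 every time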
Quote: Gabes22Quote: WizardQuote: MathGentMaybe the fundamental flaw in using the .5[.5x + 2x]=1.25x formula is that it assigns a single variable "X" to represent (in this particular problem) 2 distinct starting points
My original answer to the problem argued exactly that, and people with better math skills than me wrote in to say that I was wrong.
Let me ask the problem this way: let's say that it is Bill Gates stuffing the envelopes. You're told that the two envelopes contain x and 2x. You pick one and OPEN it. Let's say it is $100. Do you switch?
I think I would try to use some deductive reasoning as to whether the other envelope held $50 or $200. For instance, if the $100 envelope was in small bills, maybe 5 $20's, then I would be more inclined to keep it; however, if it was a crisp $100 bill, it might indicate to me that he took it from a stack of $100s and the other envelope might have 2 of them.
I like your reasoning here.
Quote: MathGentSo my expected return if I switched would be 0.49*200 + 0.51*50 = 123.5. So I'd switch!
So, if you find out that the experiment only contained a $100 envelope and a $200 envelope, you still think you have a 50% chance of picking an envelope with $50 in it??? To look at it another way, there is never going to be a .5x: $100 is either 2 x $50 (2x), or it is 1/2 of 2x (x). In either case there is NO .5x; $100 is never .5 of x, only .5 of 2x.
Quote: JyBrd0403So, if you find out that the experiment only contained a $100 envelope and a $200 envelope, you still think you have a 50% chance of picking an envelope with $50 in it??? To look at it another way, there is never going to be a .5x: $100 is either 2 x $50 (2x), or it is 1/2 of 2x (x). In either case there is NO .5x; $100 is never .5 of x, only .5 of 2x.
Of course not, I still stand by what I originally said on that - same as you. Once you know there is only $100 and $200 envelopes, then the expected (average) return is $150 no matter what, so no point in switching.
But the re-posed question that Wizard asked was different, in my opinion, from the original problem. In the reposed problem, there could be $50 and $100 in the two envelopes, or $100 and $200. We don't know which. All we know is that Bill Gates put some of his money in the two envelopes. In that case, I'll take the chance that two envelopes might be $100 and $200.
Quote: MathGentOf course not, I still stand by what I originally said on that - same as you. Once you know there is only $100 and $200 envelopes, then the expected (average) return is $150 no matter what, so no point in switching.
But the re-posed question that Wizard asked was different, in my opinion, from the original problem. In the reposed problem, there could be $50 and $100 in the two envelopes, or $100 and $200. We don't know which. All we know is that Bill Gates put some of his money in the two envelopes. In that case, I'll take the chance that two envelopes might be $100 and $200.
You don't know the amounts. You only know that one is double the other. And Bill Gates was not in the picture. It could just as easily have been a transient that filled the envelopes. You don't even know the amounts that are possible until you pick the first one. At that point you know the other can only be half or double what you just chose.
I don't think there is a formula in the world that would tell you which way to go. It's 50/50 to start that you might get the higher value one, and it's 50/50 after you choose one whether the remaining one is higher or lower.
ZCore13
Quote: MathGentOf course not, I still stand by what I originally said on that - same as you. Once you know there is only $100 and $200 envelopes, then the expected (average) return is $150 no matter what, so no point in switching.
But the re-posed question that Wizard asked was different, in my opinion, from the original problem. In the reposed problem, there could be $50 and $100 in the two envelopes, or $100 and $200. We don't know which. All we know is that Bill Gates put some of his money in the two envelopes. In that case, I'll take the chance that two envelopes might be $100 and $200.
The re-posed question is the same. Envelope 1 either contains x or 2x, Envelope 2 contains either x or 2x. So, if you pick envelope 1 and there's $100 in it, then $100 is either 2x itself or it's 1/2 of 2x, which is x itself. Before you get started you know that one envelope contains x, and the other envelope contains 2x. There can't be an equation with .5x because there is no envelope that contains .5x. You don't need the $100 and $200, or $50 and $100, to be revealed to know that; you know from the beginning that there is no chance of picking .5x. It doesn't exist. So, you never have a 50% chance of picking an envelope with .5x in it.
x = amount in 1 envelope
2x = amount in 2nd envelope
.5x = Non reality (No envelope contains .5x)
The two possibilities are that you will get 2/1.25 as much as the expected value, or 0.5/1.25 as much as the expected value.
0.5(1.6Y + 0.4Y) = 0.5(2Y) = Y = Switching has the same expectation as your expected starting value.
x = money in envelope
y = envelope 1
z = envelope 2
y = .5(2x + x) = 1.5x
z = .5(2x + x) = 1.5x
y = 1.5x
z = 1.5x
Expected value of envelope 1 is 1.5x. If you pick Envelope 1 and decide to switch, your expected value for envelope 2 is 1.5x.
Quote: JyBrd0403The re-posed question is the same. Envelope 1 either contains x or 2x, Envelope 2 contains either x or 2x. So, if you pick envelope 1 and there's $100 in it, then $100 is either 2x itself or it's 1/2 of 2x, which is x itself. Before you get started you know that one envelope contains x, and the other envelope contains 2x. There can't be an equation with .5x because there is no envelope that contains .5x. You don't need the $100 and $200, or $50 and $100, to be revealed to know that; you know from the beginning that there is no chance of picking .5x. It doesn't exist. So, you never have a 50% chance of picking an envelope with .5x in it.
x = amount in 1 envelope
2x = amount in 2nd envelope
.5x = Non reality (No envelope contains .5x)
I agree that there is no 0.5x, as you've laid it out. But once you open an envelope and reveal $100, the other envelope either contains $50 or $200, right? You're not debating that point, are you?
Let's look at this a different way.
Suppose you know that a girl named Sally stuffed the envelopes with her entire net worth, which you know is < $300. Would you switch envelopes if the one you had in hand contained $100? Of course not, because there is 0% chance the other envelope contains $200.
Now suppose that Sally has 10 $5 bills, 10 $10 bills, and 10 $20 bills in her purse, and is told to put either the 10 fives and 10 tens into two separate envelopes, or the 10 tens and 10 twenties into two separate envelopes. And, she must flip a coin to decide which of those two things to do: tails=5s and 10s, heads=10s and 20s. So she flips the coin and follows the instructions. You then walk into the room, Sally hands you the two envelopes, tells you one contains x and the other 2x, you open one and it contains 10 tens. She then tells you exactly what I explained above re the circumstances of the envelope stuffing - i.e. how many and what denomination of bills she had, and how she had to flip a coin to decide how to fill the two envelopes. But, she doesn't tell you the result of the coin flip. Would you switch? Yes, because, applying the 0.5*$50 + 0.5*$200 = $125 formula to determine whether to switch or not is appropriate in this case, right?
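Here's a quick simulation of that Sally setup - the coin-flip stuffing and conditioning on opening $100 - just to check the $125 figure:

import random

trials = 200_000
other_when_100_seen = []

for _ in range(trials):
    if random.random() < 0.5:
        envelopes = [50, 100]     # tails: the ten $5 bills and ten $10 bills
    else:
        envelopes = [100, 200]    # heads: the ten $10 bills and ten $20 bills
    random.shuffle(envelopes)
    opened, other = envelopes
    if opened == 100:             # the case described: you open the envelope with ten $10 bills
        other_when_100_seen.append(other)

print(sum(other_when_100_seen) / len(other_when_100_seen))   # ~125, so switching pays in this setup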
The point I'm trying to make here is that if you know something about the circumstances re the envelope stuffing, it may be usable to determine the most rewarding course of action. In other words, the circumstances re the envelope stuffing is not necessarily irrelevant to whether or not to switch.
But suppose, in that second example, Sally tells you nothing about the circumstances re the envelope stuffing - all you know is that one envelope contains x and one 2x, and the one you opened has $100 in it. Do you switch then?
I certainly agree that, after choosing one envelope but before opening it, there is no point in switching. For exactly the reason you said (and what I said in my original post).
So the question is ... does the problem change once you open one envelope and reveal $100 inside?
I think you say "no", there's still no point in switching. And I agree. In this case, we know nothing about the circumstances re the envelope stuffing. So, lacking any other information, we should fall back on the reasoning we used before opening one envelope, re concluding there is no point in switching.
The point I was trying to make previously about contrasting Bill Gates as the stuffer to Jack in the Box guy as the stuffer (where they are both stuffing their own money to give away) is that just knowing that much gives us some additional information re the circumstances of the stuffing on which to make a decision re whether to switch or not, once you open an envelope and reveal $100 (there's no point in switching before you open an envelope, as we've already agreed). The question is, is that useful and usable information, or not, re deciding whether to switch?
I'm starting to think that my previous thoughts on that were wrong. Now I'm thinking that at a low revealed amount (say $2), there is no point in switching, regardless of Bill vs. Jack. But as the revealed amount increases, you should start thinking there is a good reason NOT to switch in Jack's case, before reaching the same conclusion in Bill's, due to the differences in the estimation of their net worth. Similar to the thought process in my first Sally example above.
Quote: MathGentI agree that there is no 0.5x, as you've laid it out. But once you open an envelope and reveal $100, the other envelope either contains $50 or $200, right? You're not debating that point, are you?
Let's look at this a different way.
Suppose you know that a girl named Sally stuffed the envelopes with her entire net worth, which you know is < $300. Would you switch envelopes if the one you had in hand contained $100? Of course not, because there is 0% chance the other envelope contains $200.
Now suppose that Sally has 10 $5 bills, 10 $10 bills, and 10 $20 bills in her purse, and is told to put either the 10 fives and 10 tens into two separate envelopes, or the 10 tens and 10 twenties into two separate envelopes. And, she must flip a coin to decide which of those two things to do: tails=5s and 10s, heads=10s and 20s. So she flips the coin and follows the instructions. You then walk into the room, Sally hands you the two envelopes, tells you one contains x and the other 2x, you open one and it contains 10 tens. She then tells you exactly what I explained above re the circumstances of the envelope stuffing - i.e. how many and what denomination of bills she had, and how she had to flip a coin to decide how to fill the two envelopes. But, she doesn't tell you the result of the coin flip. Would you switch? Yes, because, applying the 0.5*$50 + 0.5*$200 = $125 formula to determine whether to switch or not is appropriate in this case, right?
Right, I agree if Envelope 1 has $100 in it then Envelope 2 could have $50 or $200 as far as you know. What I'm saying is the equation .5(2x + .5x) is invalid to use here.
For your Sally question, yes I agree, it would be appropriate to switch and to use the .5(2x + .5x) formula there if you got $100 on envelope 1. Of course, you would always switch if envelope 1 is $50, and never switch if envelope 2 is $200, and the equation is once again invalid if you get $50 on the first pick or $200 on the first pick.
Edit. I made a mistake here - "and never switch if envelope 2 is $200" - it should read, "and never switch if envelope 1 is $200". So, 1st pick $50, always switch, 1st pick $200, never switch. Sorry about that.
Quote: JyBrd0403Right, I agree if Envelope 1 has $100 in it then Envelope 2 could have $50 or $200 as far as you know. What I'm saying is the equation .5(2x + .5x) is invalid to use here.
For your Sally question, yes I agree, it would be appropriate to switch and to use the .5(2x + .5x) formula there if you got $100 on envelope 1. Of course, you would always switch if envelope 1 is $50, and never switch if envelope 2 is $200, and the equation is once again invalid if you get $50 on the first pick or $200 on the first pick.
Edit. I made a mistake here - "and never switch if envelope 2 is $200" - it should read, "and never switch if envelope 1 is $200". So, 1st pick $50, always switch, 1st pick $200, never switch. Sorry about that.
Cool, I think we're in agreement on all of this now. And no problem re the typo - I would have understood what you meant.
I was thinking last night that there is another scenario where just opening the envelope would provide valuable info re whether to switch or not. And that is ... if the first envelope contains an odd amount of money (e.g. $100.37 if change is involved, or $101 if only bills are involved). Then you'd always switch, since the amount in the open envelope is not divisible by 2.
More evidence that opening the envelope can affect the decision whether to switch.
Quote: MathGentCool, I think we're in agreement on all of this now. And no problem re the typo - I would have understood what you meant.
I was thinking last night that there is another scenario where just opening the envelope would provide valuable info re whether to switch or not. And that is ... if the first envelope contains an odd amount of money (e.g. $100.37 if change is involved, or $101 if only bills are involved). Then you'd always switch, since the amount in the open envelope is not divisible by 2.
More evidence that opening the envelope can affect the decision whether to switch.
$100.37 is not evenly divisible by 2. $101 is, unless the rules state that there is no change involved.
ZCore13
Quote: Zcore13$100.37 is not evenly divisible by 2. $101 is, unless the rules state that there is no change involved.
I said "$101 if only bills are involved". I'm unaware of the existence of any 50 cent bills.
Quote: MathGentI said "$101 if only bills are involved". I'm unaware of the existence of any 50 cent bills.
But there are 1/2 cent coins. I got one as change a long time ago.
Quote: mipletBut there are 1/2 cent coins. I got one as change a long time ago.
Just when you think that you've figured it out.
Quote: mipletBut there are 1/2 cent coins. I got one as change a long time ago.
I had no idea, but I looked it up. It was last minted in 1857. There were also 2-cent, 3-cent, and 20-cent coins minted in the 1800s. Who knew?
I fully agree that x and 2x is the right way to look at the original problem. However, the problem at hand is what is the flaw in the 0.5*(0.5x + 2x)=1.25x argument before the envelope is opened? What rule of mathematics is this breaking?
Also, if anybody with at least a 4-year degree in mathematics wishes to write an answer to be directly posted to my MathProblems.info site, please have at it. I'm not looking to paraphrase something you posted here but a beginning to end document ready to use.
As far as a math or other degree, no one has the advantage when everyone is "in the dark". Especially, ie, when trying to figure how the "wrong" stuff can't be wrong in every way. When an extraneous or imaginary solution has no real or otherwise suitable counterpart, then the entire thought process to blame must be extraneous or imaginary. Hence, no way to understand such from the real side of affairs. Eg, Hawking played around with imaginary time to avoid the "big bang/crunch", with our universe's time as the imaginary part. The only matter of significance here, in my opinion.
Add-on, almost forgot. Bill Gates et al didn't get rich by giving it away/being easy to read.
Quote: MathGentThe solution presented is that there is no point in switching, which is correct. But the explanation why is rather convoluted, and doesn't really get to the heart of the issue, I think.
Well the solution is simple. The game is symmetric (as long as you have the option to switch every time), so each envelope is as good as the other one. There is really no convoluted explanation needed to explain the game.
Where do I miss the point in your question ?
Quote: MangoJWell the solution is simple. The game is symmetric (as long as you have the option to switch every time), so each envelope is as good as the other one. There is really no convoluted explanation needed to explain the game.
Where do I miss the point in your question ?
I think the point of the question is that there is an assertion that
Quote:The expected return, you compute, is .5[.5x + 2x]=1.25x
which itself is based on EV = average(EV1, EV2) where EV1 is the EV of one outcome and EV2 is the EV of the alternative outcome. That's at odds with the obvious, intuitive, and undeniable reality.
Quote: Wizard
I fully agree that x and 2x is the right way to look at the original problem. However, the problem at hand is what is the flaw in the 0.5*(0.5x + 2x)=1.25x argument before the envelope is opened? What rule of mathematics is this breaking?
That's the first time I've come across the 2 envelope paradox and I found it quite interesting.
I checked Wikipedia and other sources on the net that give an explanation of the flaw in the 1.25x solution, citing prior and posterior probabilities and infinite uniform distributions.
But the simple way I understand why it is flawed is the following.
Calculating the Ev using 50% for 2x and 50% for X/2 is wrong as no such probabilities exist.
It is wrong to say that there is 50:50 for 2x and x/2
Instead, after the choice, there are IF statements that say:
If the current envelope x is the lower amount, then the other envelope is 2x. ie if x is the low amount, then 100% the other envelope is 2x.
If the current envelope x is the higher amount, then the other envelope is x/2. ie if x is the high amount, then 100% the other envelope is x/2.
So the Ev calculation can only be made with respect to the prior probabilities, which give both choices an Ev of 1.5L (L being the lower amount):
If the current envelope x = L (50% for the IF statement), then 100% the other envelope is 2L.
If the current envelope x = 2L (50% for the IF statement), then 100% the other envelope is L.
Giving the Ev = 0.5(2L) + 0.5(L) = 1.5L.
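The same calculation in a few lines of Python, with L assumed to be $100:

L = 100   # assumed lower amount; the envelopes hold L and 2L

cases = [(0.5, 2 * L),   # current envelope = L (50%)  -> other envelope is 2L for certain
         (0.5, L)]       # current envelope = 2L (50%) -> other envelope is L for certain
ev_other = sum(p * amount for p, amount in cases)
ev_current = 0.5 * L + 0.5 * (2 * L)
print(ev_current, ev_other)   # 150.0 150.0 - both choices have Ev 1.5L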
Quote: WizardI'm not satisfied with my explanation to that problem either. However, there doesn't seem to be agreement in the general math community on where the flaw is in the .5[.5x + 2x]=1.25x argument. I've argued this for several hours, over many years, with a friend who has a Ph.D. from Cal Tech, and his position is that it is flawed essentially because whoever is stuffing the envelopes has a finite amount of money, which after pages of explanation can be shown to disprove the notion that the other envelope is 50/50 between the two amounts. I initially thought you didn't even need to resort to this argument, but my friend caused me to have doubts.
What bothers me about your explanation is that once an envelope is in your hand, the amount isn't random any longer. What is in there is what is in there. (bold text added to highlight this statement)
Hi Wizard,
I was wondering if this same condition (what was once unknown now being considered "fixed" since new info has been received) also applies to the "Two Dice Problem"?
Quote: Two Dice PuzzleYou have two 6-sided dice in a cup. You shake the dice, and slam the cup down onto the table, hiding the result. Your partner peeks under the cup, and tells you, truthfully, "At least one of the dice is a 2."
What is the probability that both dice are showing a 2?
Quote: AyecarumbaHi Wizard,
I was wondering if this same condition (what was once unknown now being considered "fixed" since new info has been received) also applies to the "Two Dice Problem"?
I don't see any connection.
Let me ask the forum this:
Suppose you flip a coin in the dark and by feel place a cup over the coin. What is the probability the coin landed on tails?
Quote: WizardI don't see any connection.
Mike, can we ban Ayecarumba for mental cruelty $;o)
Actually... I'm self suspending until the site management put the old style sheet back.
Hope to see ya'll soon.
Quote: WizardI don't see any connection.
Let me ask the forum this:
Suppose you flip a coin in the dark and by feel place a cup over the coin. What is the probability the coin landed on tails?
The connection I was hoping for is that the new information, "At least one of the dice is a two." makes one of the former variables "fixed", just as having an envelope in hand removes possible outcomes.
As to the coin question: 50%, but...
If the coin is tossed and caught, it has about a 51% chance of landing on the same face it was launched. (If it starts out as heads, there's a 51% chance it will end as heads).
If the coin is spun, rather than tossed, it can have a much-larger-than-50% chance of ending with the heavier side down. Spun coins can exhibit "huge bias" (some spun coins will fall tails-up 80% of the time).
If the coin is tossed and allowed to clatter to the floor, this probably adds randomness.
If the coin is tossed and allowed to clatter to the floor where it spins, as will sometimes happen, the above spinning bias probably comes into play.
A coin will land on its edge around 1 in 6000 throws, creating a flipistic singularity.
The same initial coin-flipping conditions produce the same coin flip result. That is, there's a certain amount of determinism to the coin flip.
A more robust coin toss (more revolutions) decreases the bias.
-- Dynamical Bias in the Coin Toss, Persi Diaconis, Susan Holmes, and Richard Montgomery
Quote: AyecarumbaAs to the coin question: 50%, but...
Quote: AceTwoCalculating the Ev using 50% for 2x and 50% for X/2 is wrong as no such probabilities exist.
It is wrong to say that there is 50:50 for 2x and x/2
Never mind the physics of the flip. I'm trying to make a point.
Above, the point was made that once an envelope is chosen, it is no longer random whether it is the low or high one. However, if this is true, how can the flip be 50/50 when it already has been flipped?
Quote: WizardNever mind the physics of the flip. I'm trying to make a point.
Above, the point was made that once an envelope is chosen, it is no longer random whether it is the low or high one. However, if this is true, how can the flip be 50/50 when it already has been flipped?
Hmm.... If the flip was an event with two possible outcomes, each equally likely, the unknown outcome of that event still carries the possibility of either outcome, and therefore the probability of every possible outcome.
The cat is alive and dead until you open the box.
Quote: AyecarumbaHmm.... If the flip was an event with two possible outcomes, each equally likely, the unknown outcome of that event still carries the possibility of either outcome, and therefore the probability of every possible outcome.
The cat is alive and dead until you open the box.
I'm inclined to agree with this. A playing card or tile can be in your hand or in a deck, but no matter where it physically is, it has to be treated as unknown, whatever the context, until it's turned over and it can be identified.
When I first looked at this problem, it looked as though it was related to, or an iteration of, the Monty Hall problem, but there are big differences.
In the MH problem, what's behind the door is revealed before you're given the option to switch and there are three doors in total. Here there are only two, and you're given the option to switch before you can open your first-choice envelope.
Again, no matter whose hands that unopened envelope is in, it still has to be treated as unknown; in the MH problem, the cat's out of the box, so to speak, as soon as the host opens your first-choice door....
Quote: docbrockA playing card or tile can be in your hand or in a deck, but no matter where it physically is, it has to be treated as unknown, whatever the context, until it's turned over and it can be identified.
Exactly. A deck of cards is randomly shuffled and a card placed face down on the table. Nobody has a hard time saying it has a 50/50 chance of being red or black, even though it has already been chosen.
That said, you choose an envelope but don't open it yet. Can or can't we say it has a 50/50 chance of containing the small or large amount?
An unknown amount of money, X, has been placed into two envelopes. One envelope has 1/3X and the other has 2/3X.
The first unknown is: How big is X?
The 2nd unknown is : Which envelope has the larger amount of money?
Your desire is to maximize the amount of money that you get.
The 1st unknown is very important. X could be 15 cents, or 30cents or $300 or $9,000,000. The problem does not state whether all amounts are equally probable, or whether we assume that a real person in a world of limited denominations and resources has stuffed the envelopes. In a mathematical world, one envelope could have 1.5 cents and the other envelope could have 0.75 cents, but that is not possible in the real world.
The 2nd unknown is initially less important. There is no reason to prefer one envelope over the other, but there is only a factor of two at stake.
Upon opening the 1st envelope and finding, say, $100:
1. you have gained a lot of information about the first unknown. You now know that the initial amount of money was either $150 or $300. In a mathematical world, those have equal probability. In the real world of limited denominations and resources one might judge those to have slightly different probabilities.
2. You have gained no knowledge (in a mathematical world) or a little knowledge (in a real world of limited denominations and resources) about which envelope has the larger amount of money.
There are now two possible prospects or 'states.' There was $300 placed in the envelopes AND the other envelope has more money. OR there was $150 placed in the envelopes AND the envelope in your hand has the larger amount of money. If these two prospects or states are judged to be equally probable, then you should take the second envelope to maximize your expected value.
At the outset you have two choices: open one envelope or open both envelopes (and receive the money from the 2nd envelope.) Opening the first envelope will always provide information that creates an (EV-based) incentive to open the 2nd envelope. There is a path-dependence effect.
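Here's a numerical sketch of that path-dependence point, under an assumed prior (the smaller envelope amount drawn uniformly from $1 to $300 - nothing in the original problem pins this down). Conditioned on what the opened envelope shows, the other envelope isn't always a 50/50 proposition, which is where the finite-money argument comes in:

import random
from collections import defaultdict

random.seed(1)
trials = 500_000
others_by_seen = defaultdict(list)    # observed amount -> amounts found in the other envelope

for _ in range(trials):
    s = random.randint(1, 300)        # assumed prior: smaller amount uniform on $1..$300
    envelopes = [s, 2 * s]
    random.shuffle(envelopes)
    opened, other = envelopes
    others_by_seen[opened].append(other)

for seen in (100, 150, 400, 550):     # a few illustrative observed amounts
    others = others_by_seen[seen]
    print(seen, sum(others) / len(others))
# Seeing $100 or $150, switching averages about 1.25 times the amount seen;
# seeing anything above $300, you must be holding the larger envelope, so switching costs half.

Averaged over everything the prior produces, though, switching and keeping come out the same.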