The Conundrum

A man is presented with two envelopes full of money. One of the envelopes contains twice the amount of the other. Once the man has chosen his envelope, opened it, and counted the money, he is given the option of exchanging it for the other envelope. The question is: is there any gain to the man in switching envelopes?

It would appear that by switching, the man would have a 50% chance of doubling his money (should the initial envelope hold the lesser amount) and a 50% chance of halving it (should it hold the greater amount). Thus, let x be the amount contained in the initial envelope and y be the expected value of switching:

y = 0.5*(x/2) + 0.5*(2x)

Let’s say that the initial envelope contained $100 (so that x = $100). Then there should be a 50% chance that the other envelope contains 2 * $100 = $200 (2x) and a 50% chance that it contains (1/2) * $100 = $50 (x/2). In such a case, the value of switching is:

$125 = 0.5*($100/2) + 0.5*(2*$100)

This gain can be shown by simplifying the equation:

y = (0.5)*(x/2) + (0.5)*(2x) = (5/4)x

This implies that the man would, on average, increase his wealth by 25% simply by switching envelopes! How can this be?

--Dorothy
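As a quick sanity check on Dorothy's argument (a simulation sketch, assuming the simplest reading of the setup: one fixed pair of amounts (x, 2x), shuffled between the envelopes), always keeping and always switching average out the same:

```python
import random

def simulate(trials=100_000, x=100):
    """One fixed pair (x, 2x), shuffled; compare always-keep vs always-switch."""
    keep_total = switch_total = 0
    for _ in range(trials):
        envelopes = [x, 2 * x]
        random.shuffle(envelopes)
        chosen, other = envelopes
        keep_total += chosen      # strategy 1: keep the first envelope
        switch_total += other     # strategy 2: always switch
    return keep_total / trials, switch_total / trials

keep_ev, switch_ev = simulate()
print(keep_ev, switch_ev)  # both near 150, i.e. 1.5x for either strategy
```

Both strategies come out near 1.5x, so the claimed 25% edge never materializes; the simulation never lets the "x" in 2x and the "x" in x/2 refer to two different underlying amounts, which is one candidate for where the formula goes wrong.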

I'll keep it brief for now, just to get the ball rolling. First, if the person offering the envelopes is going to be uncomfortable or unable to put $200 in one of them, then don't switch. However, if I may, let's assume the person offering the envelopes could easily afford to part with 2x the amount of your envelope. Do you still switch?

I've had people much more educated in math and science disagree with me on this, at least on the reasoning, but I say that with zero information about the wealth or behavior of the envelope stuffer, your odds are the same whether you switch or not.

If you argue that you should switch, should that argument hold BEFORE you open your envelope? Assume your envelope has $x. Wouldn't the expected value of the other envelope be .5*(x/2) + .5*2x=1.25x? If it were, and the host let you switch back and forth as much as you wanted without opening the envelopes, then indeed, the expected value argument would lead you to keep switching infinitely.

I contend that you can't formulate an expected value without knowing the possible outcomes and probabilities for the other envelope. If your envelope has $100, I contend that does not mean there is a 50/50 chance the other one has $50 or $200. It is not a random variable: there is either a 100% chance it has $50 or a 100% chance it has $200.

Another way to think about it is to call the difference in the envelopes $y. By switching you will either gain or lose $y with 50/50 chance. Thus switching is a breakeven bet.

That is enough for now. I'll await more comments before going further.

Edit:

Dagnabit! Once again, the Wizard posted while I was typing! I need a utility that will tell me who else is in the process of posting.

I don't need to refute the math, though I'm certain it is refutable; the effort would be pointless. It's like the ancient Greek "paradox" where a runner ran half the distance to the finish line, then half of the remainder, and so forth, so that he never actually got there--he just kept halving the remaining distance. I remember reading about this when I was a very little kid--my reaction was "that's ridiculous", as obviously at some point the runner would complete ALL of the remaining distance, not just half of it. However, I could not refute the math until I learned about summing an infinite series, much later, when I was a not-so-little kid.

Quote:Wizard

Another way to think about it is to call the difference in the envelopes $y. By switching you will either gain or lose $y with 50/50 chance. Thus switching is a breakeven bet.

Because I'm an English major and not a math major, I'll try to deal with this problem with words rather than numbers...

It seems to me that whether you

pick and open,

pick, switch, and open,

pick, think about switching and don't, then open the original one,

pick, switch, then switch back, standing there in indecision so long that the sun heats up one envelope and sets fire to it, thus forcing you to choose the remaining envelope,

or the guy offering you the envelopes kicks you in the balls and snarls, "Pick one already or I'll kill you" and you hastily grab one,

the ultimate choosing of the envelope is random, no matter how much fiddling and waffling goes on before one of the frickin' envelopes is finally opened. Therefore, the expected value of switching is +0.

Quote:mkl654321

It's like the ancient Greek "paradox" where a runner ran half the distance to the finish line, then half of the remainder, and so forth, so that he never actually got there--....

I remember hearing a variation of this used to describe the difference between a mathematician and an engineer. They are each presented with the same problem: You are 10 feet away from a wall. Against the wall and facing you is a beautiful, naked, willing, young lady. Every one minute, you are allowed to advance one-half the distance toward her. Will you ever get there in a finite amount of time?

The mathematician's response, supposedly, is that with the ever-decreasing steps, one can only approach and never actually reach the objective in a finite amount of time.

The engineer's response, supposedly, is, "Yes, I will get there, for all practical purposes."

Yes, that's sexist.

In any case, if you know the distribution, you should be able to tell the odds of gaining or losing by switching after looking at the original amount, and most of the time they won't be 50/50. And if the distribution is unknown, there is nothing to be gained by knowing the first amount.

Quote:mkl654321

...standing there in indecision so long that the sun heats up one envelope and sets fire to it...

...the guy offering you the envelopes kicks you in the balls and snarls, "Pick one already...

LOL! I could not have said it better myself.

---

I agree that it's a 50/50 situation.

NOT 50% that the other envelope will be double or higher or whatever, etc., but 50% that you picked the higher valued envelope to begin with.

Quote:DJTeddyBear

NOT 50% that the other envelope will be double or higher or whatever, etc., but 50% that you picked the higher valued envelope to begin with.

Two-part confused-by-the-conundrum posting---

Part #1: I do not yet see why "50% that you picked the higher valued envelope to begin with" does not lead to "50% that the other envelope will be the lower valued one." I think it is more likely that the probabilities are being used improperly to calculate an expected value.

Part #2: The only twist I have come up with is that the amount of money that can be in an envelope is not a continuous function; money comes in discrete units. If you examine one envelope and find that the money in it could not be evenly divided into half (eg., $247.59), then you must have gotten the smaller amount and should swap. I'm not sure how to follow through on this notion in the event that your envelope contains a nice even amount.
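Doc's discreteness twist can be made concrete with a tiny check (a sketch; the helper name is my own invention): an amount with an odd number of cents cannot be a doubled amount, so it has to be the smaller envelope.

```python
def must_be_smaller(amount_dollars: float) -> bool:
    """True if the opened amount cannot possibly be the doubled envelope.

    Doubling a whole number of cents always yields an even number of
    cents, so an odd number of cents marks the smaller envelope.
    """
    cents = round(amount_dollars * 100)
    return cents % 2 == 1

print(must_be_smaller(247.59))  # True: 24759 cents is odd, so swap
print(must_be_smaller(100.00))  # False: an even amount is inconclusive
```

As Doc notes, the test only ever cuts one way: an even number of cents tells you nothing.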

By the way, this is problem #6 on my math problems site (http://mathproblems.info/). Still, I'm not entirely happy with my explanation there either, nor with any others I've seen.

Quote:

Part #1: I do not yet see why "50% that you picked the higher valued envelope to begin with" does not lead to "50% that the other envelope will be the lower valued one." I think it is more likely that the probabilities are being used improperly to calculate an expected value.

It is not 50% that you picked the higher valued envelope. In general, it is just an unknown value.

It is a well-known fallacy to treat any unknown binary outcome as 50% each. Like: what is the probability that you see a dinosaur on the street tonight? 50% - either you see it, or you do not :)

Quote:Doc

Part #2: The only twist I have come up with is that the amount of money that can be in an envelope is not a continuous function; money comes in discrete units. If you examine one envelope and find that the money in it could not be evenly divided into half (eg., $247.59), then you must have gotten the smaller amount and should swap. I'm not sure how to follow through on this notion in the event that your envelope contains a nice even amount.

We can agree to round to the nearest cent when we divide. Or better yet, to the nearest two cents, so that you'll never see odd amounts to begin with. :)

The actual twist is that the probabilities of different amounts are different, they cannot all be equal.

For example, suppose you know that the person offering you the envelopes picked a uniformly distributed value between 0 and 1, doubled or halved it with 50/50 probability, and then multiplied the result by a fixed dollar amount known to you (say, $1000).

Obviously, you won't switch if you see more than $1000 in your envelope. In fact, in this case you should not switch as long as your value is $500 or more, and should switch if it is less than $500.

The key is knowing the distribution - then you should always be able to calculate when to switch.
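weaselman's example can be simulated, though the setup is ambiguous; the sketch below assumes one particular reading (my assumption, not stated in the post): the base amount, uniform on $0-$1000, sits in the other envelope, and the envelope in hand holds that amount doubled or halved with 50/50 probability. Under that reading, switching only below $500 beats both blind strategies:

```python
import random

def play(strategy, trials=200_000):
    """Average winnings under one reading of weaselman's setup (an
    assumption): base ~ uniform(0, $1000) goes in the other envelope;
    yours holds the base doubled or halved with 50/50 probability."""
    total = 0.0
    for _ in range(trials):
        base = random.random() * 1000              # amount in the other envelope
        mine = base * 2 if random.random() < 0.5 else base / 2
        total += base if strategy(mine) else mine  # strategy(mine) True -> switch
    return total / trials

always_keep   = play(lambda a: False)    # keep no matter what
always_switch = play(lambda a: True)     # switch no matter what
threshold     = play(lambda a: a < 500)  # switch only below $500
print(always_keep, always_switch, threshold)
```

Under this reading the threshold strategy comes out on top, illustrating the point: once the distribution is known, the observed amount really does tell you when to switch.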

Quote:weaselman

It is a well-known fallacy to treat any unknown binary outcome as 50% each. Like: what is the probability that you see a dinosaur on the street tonight? 50% - either you see it, or you do not

I agree that "Something either happens or it doesn't happen" does not mean that it has a 50% chance of happening. But I disagree that this applies to the present problem.

Suppose that you have a very biased coin. It does not have anywhere close to a 50% probability of coming up heads (or tails). You flip the coin and ask me to guess the result. Even not knowing which direction the bias is in, I still have a 50% chance of getting it right. Similarly, I have a 50% chance of initially selecting the envelope with the larger amount of money.

Edit: Perhaps I misinterpreted your post and owe you an apology. On second reading, it occurs to me that your statement "fallacy to treat any unknown binary value as 50% of each" might have been a comment on the improper way that the EV was calculated in the initial post. If that is what you meant, then I agree with you -- at least in concept, although I don't know the proper way to calculate the EV, if there is one.

There is a point where, for each person, the amount in that envelope is "enough." If that's the case, it isn't worth the risk of swapping to get half of that amount because there isn't much marginal benefit associated with the potential to double the amount. Obviously this is subjective, and it's also situational. I think it's the exception to the Wiz's "thou shalt not hedge thy bets" commandment (life-changing amounts of money).

Say I'm planning a trip to Vegas in October, and airfare is pricing out at $550. We play this game and you give me 1 envelope. I open it and discover $600. I could either keep the $600 and have guaranteed free airfare, or trade it in for a theoretically +EV chance at $1200. Personally, I'd take the $600. That is enough to pay for my immediate and large expense.

We did a similar exercise with the lottery, with a 10%/1000% split for the second envelope. If you won $500,000 in the first envelope, would you trade it in for a chance at $50k/$5MM?

Quote:Doc

Suppose that you have a very biased coin. It does not have anywhere close to a 50% probability of coming up heads (or tails). You flip the coin and ask me to guess the result. Even not knowing which direction the bias is in, I still have a 50% chance of getting it right. Similarly, I have a 50% chance of initially selecting the envelope with the larger amount of money.

This is not true. At least, not always true. It will depend on your strategy, and the actual distribution. For example, in the case of a biased coin, if you always say "tails", your chance of being right will be less or more than 50% depending on which way the bias is. If the coin always shows "heads", you will never be right at all.

With the envelopes, it is a little different. Because they are identical, your chance of picking the one with the larger amount is 50% before the envelope is opened; but once one of the amounts becomes known, the probabilities change. It is no longer 50% that your envelope has the larger amount - the actual value depends on the distribution.

Quote:weaselmanThis is not true. At least, not always true. It will depend on your strategy, and the actual distribution. For example, in the case of a biased coin, if you always say "tails", your chance of being right will be less or more than 50% depending on which way the bias is. If the coin always shows "heads", you will never be right at all.

It depends on how or why Doc called tails. If he chose tails randomly with a 50% chance, then he would be right, and his probability of winning the toss on the biased coin would be 50%. Doc would also be right if the probability that the coin was biased in favor of heads and tails were equal.

However, if Doc always chose tails, and the coin were a store bought biased coin, then I think Doc's probability of winning would be more than 50%. This is because most people call heads. So I could envision somebody making a biased coin in favor of tails, as a cheating device.

Quote:weaselmanWith the envelopes, it is a little different. Before the envelope is open, your chance of picking one with the larger amount is 50%, but when one of the amounts becomes known, the probabilities change, it is no longer 50% that your envelope has the larger amount, the actual value depends on the distribution.

I agree, but it is hard to explain why.

Quote:rdw4potusWe used this problem in one of my classes. The point was that the decision doesn't depend on the math so much as it depends on the amount of money in the first envelope...

That is a valid point. However, I think getting into the utility of money is confusing the issue and getting off point. Let's assume the person opening the $100 envelope is a millionaire, so the utility of whatever he wins will be proportional to the amount.

Quote:WizardIt depends on how or why Doc called tails. If he chose tails randomly with a 50% chance, then he would be right, and his probability of winning the toss on the biased coin would be 50%.

Yes, 50% is special, because p+(1-p)=1.

Quote:weaselman

Yes, 50% is special, because p+(1-p)=1.

Well, I guess that went right over my head. Your equation is true for any value of p, so I don't see how it makes 50% so special.

As for what I intended in initially proposing the biased coin analogy, I was thinking of it as a one-time coin flip, with me making a random guess heads or tails -- sort of like getting one chance to make an initial choice of envelopes. If we repeated the coin toss many times and I stuck with a guess of tails while the coin kept coming up heads, I think I would eventually catch on that the coin was biased.

Quote:DocWell, I guess that went right over my head. Your equation is true for any value of p, so I don't see how it makes 50% so special.

As for what I intended in initially proposing the biased coin analogy, I was thinking of it as a one-time coin flip, with me making a random guess heads or tails -- sort of like getting one chance to make an initial choice of envelopes. If we repeated the coin toss many times and I stuck with a guess of tails while the coin kept coming up heads, I think I would eventually catch on that the coin was biased.

Doc, you and I are approximately the same height or else you are much taller, because it went right over my head too.

YOU might eventually figure out the bias, but some of us wouldn't. As an example, I keep going back to the casino, even though I believe there is a very small chance they MIGHT be cheating.

With cash in hand, and a cost to play the second round, the original parameters do not apply.

Quote:DocWell, I guess that went right over my head. Your equation is true for any value of p, so I don't see how it makes 50% so special.

If you choose heads or tails with a 50% probability, then you have 50% of winning with any value of p because 0.5*p + 0.5*(1-p) = 0.5

If you use any other strategy (like always picking tails, or always picking heads or anything other that 50/50), then you will only have a 50% chance of winning when the coin is unbiased.
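weaselman's algebra is easy to check numerically (a quick simulation sketch): a guesser who calls heads or tails with 50/50 probability wins about half the time against a coin of any bias, while an always-tails guesser's win rate just tracks the bias.

```python
import random

def win_rate(p_heads, guess_heads_prob, trials=200_000):
    """Fraction of flips guessed correctly: the coin shows heads with
    probability p_heads; the guesser calls heads with guess_heads_prob."""
    wins = 0
    for _ in range(trials):
        flip  = random.random() < p_heads       # True = heads
        guess = random.random() < guess_heads_prob
        wins += (flip == guess)
    return wins / trials

print(win_rate(0.9, 0.5))  # ~0.50 even against a heavily biased coin
print(win_rate(0.9, 0.0))  # ~0.10: always calling tails vs. a heads-biased coin
```

This is Doc's one-shot scenario in repeated form: the 50/50 random call is the only strategy whose win rate is immune to the coin's bias.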

Quote:Doc

Actually, I usually flip a coin to decide which way to call a coin toss. :-)

That works ... as long as your coin is unbiased :)

Suppose the problem were slightly reworded. There are two envelopes containing cash. Without either envelope being opened, we are told that one of them contains $100. We are also told that the other envelope contains either $50 or $200, but we are not given any information about the probability of those two possibilities. Can we calculate an expected value of the money in the 50/200 envelope? I don’t think so; what would be the basis?

If I then open an envelope and find $100 (bringing us back to the original problem), it does not give me any additional reason at all to think that there is $200 (or $50) in the other envelope -- it's still that there could be either $50 or $200 with unknown probability. In either case, I had a 50% probability of selecting the $100 envelope, but that does not mean that there is a 50% probability of $50 vs. $200 in the other envelope. I still cannot calculate the expected value of the money in the unopened envelope.

This was perhaps weaselman’s point, which I probably initially misinterpreted.

Quote:weaselman

It is not 50% that you picked the higher valued envelope. In general, it is just an unknown value.

It is a well-known fallacy to treat any unknown binary value as 50% of each.

If he meant that we can't assume a 50% probability of initially choosing the envelope with more money (which is what his first sentence above sounds like), then I disagree. If he meant we can't assume that there is a 50% probability that the 50/200 envelope contains $200 (which is more the way his last sentence sounds), then I agree. I think the EV cannot be calculated with the info we have.

Conclusion for the original conundrum: Pick an envelope and run. You don’t even need to open it until later.

Quote:DocSuppose the problem were slightly reworded. There are two envelopes containing cash. Without either envelope being opened, we are told that one of them contains $100. We are also told that the other envelope contains either $50 or $200, but we are not given any information about the probability of those two possibilities. Can we calculate an expected value of the money in the 50/200 envelope? I don’t think so; what would be the basis?

This is a vastly different problem than the original problem presented. In this case, we have additional information that makes the choice much easier.

In the original problem, we didn't know the amounts in advance, so we had no way of knowing if we had just opened the fixed amount envelope, or the +100%/-50% envelope. In this case, we do. We know for sure that one envelope contains $100, and the other envelope contains either $200 or $50.

If we open an envelope and it contains $200, we take it.

If we open an envelope and it contains $50, we switch. The other envelope contains $100.

If we open the envelope with $100, we should switch, because the expected value of the $200/$50 envelope is $125 (($200 + $50) / 2). So we are trading a sure $100 for an expected value of $125.

In the original problem, the expected value of one envelope is X, and the expected value of the second envelope is 125% of X. IF we knew which envelope we were opening (either X or X*1.25), then we could make a decision. As it stands, there is not enough information to make an informed decision and I can see no advantage to switching based upon the original wording.

The classic Monty Hall problem also has this same kind of additional information injected into it. The additional information in that problem is that the door exposed after the initial choice was not chosen at random; rather, it was known in advance that that door did not contain the prize.

Quote:weaselmanYes, 50% is special, because p+(1-p)=1.

Here is what I think he means. Let's say Doc has a 50% chance of calling tails, and the probability the coin lands on tails is t. Let x be the probability Doc wins

x=.5*t + .5*(1-t)

Next multiply both sides by 2

2x = t + (1 - t)

We see from the weaselman equation that p + (1-p) = 1, so it must be true for t as well...

2x = 1

x=1/2.

Then again, we just could get the same answer by canceling the .5t and -.5t terms, leaving x=0.5.

Quote:scotty81

This is a vastly different problem than the original problem presented. In this case, we have additional information that makes the choice much easier.

In the original problem, we didn't know the amounts in advance, so we had no way of knowing if we had just opened the fixed amount envelope, or the +100%/-50% envelope. In this case, we do. We know for sure that one envelope contains $100, and the other envelope contains either $200 or $50.

If we open an envelope and it contains $200, we take it.

If we open an envelope and it contains $50, we switch. The other envelope contains $100.

To this point I agree with you -- I was not clear enough in my statement. Restate it this way: In the original problem, you do not originally know the amounts in either envelope. Once you open one envelope and find $100, then you know that the other envelope contains either $50 or $200, but you have no basis for knowing the probability of either of those two possibilities. I contend that you are then in exactly the same situation as in my modified problem, where you are told the amounts before the envelopes are opened.

Quote:scotty81

If we open the envelope with $100, we should switch, because the expected value of the $200/$50 envelope is $125 (($200 + $50) / 2). So we are trading a sure $100 for an expected value of $125.

This is where I disagree with you completely. You divided by 2. What basis do you have for believing that there is a 50/50 chance of $200 and $50? As weaselman pointed out, the fact that there are only two possibilities does not suggest that they are equally likely. I contend that you have been given no basis whatsoever for dividing by 2. It could as well be that there is a 25% chance of $200 and a 75% chance of $50. In reality, it is not even a probabilistic event -- it is deterministic and was determined by the person who put the money in the envelopes. I don't think we have any information as to which they would have chosen, or what the probabilities are for how they would make their choice.

Quote:scotty81

In the original problem, the expected value of one envelope is X, and the expected value of the second envelope is 125% of X. IF we knew which envelope we were opening (either X or X*1.25), then we could make a decision. As it stands, there is not enough information to make an informed decision and I can see no advantage to switching based upon the original wording.

I agree with your conclusion. I'm just not sure I follow your line of thought in this paragraph. I don't follow what you mean by "the expected value of one envelope is X", and I contend that there is no basis for the 1.25 factor. But you draw your conclusion without regard to what this factor is -- it could as well be 3.5, and you would still realize that you have no justification to switch. So we come to the same final conclusion.

Quote:Wizard

I agree, but it is hard to explain why.

Why is it hard? It is not unusual that the probability changes after the outcome is partially revealed. In particular, if you knew that only one envelope contained money to begin with, the probability of finding it in one of the envelopes would be 50% before you start the experiment, and it would change to either 0 or 1 after an envelope is open.

The same exact thing happens here - while both envelopes are sealed, each has 50% probability to contain more money than the other, but once you open it, the probability changes, just like in the other case. The only difference is that the new value of probability is unknown, and we are psychologically inclined to assume that it is 50% because we don't know what it is.

Doc: You are right about the 50% initial probability. All I meant when I said that 50% was special was that if you pick heads or tails with a 50% probability, you will always have a 50/50 chance of guessing right, no matter how biased the coin is; and if you choose any other rule, then your chance of winning will no longer be 50% unless the coin is unbiased. It's a minor point, and not really related very much to the problem at hand. I regret that I ever brought it up ...

Quote:weaselmanwhile both envelopes are sealed, each has 50% probability to contain more money than the other, but once you open it, the probability changes, just like in the other case. The only difference is that the new value of probability is unknown, and we are psychologically inclined to assume that it is 50% because we don't know what it is.

I disagree that the probability of the second envelope containing more than the opened one changed when the first envelope was opened. As the problem was laid out, one envelope contains more than the other. That fact has not changed with the unsealing of the first envelope. What you still don't know is whether you hold the higher or lower amount. The 1.25 EV is connected to the fact that half the time you will win $200 and half the time you will win $50: ($200+$50)/2 = $125. However, it costs $100 to play.

What if the opening of the envelopes were reversed, and the other envelope was opened to reveal $100. Is the value of trading an unknown envelope for one with a known amount different? Would you trade your sealed envelope (which you now know contains $50 or $200), for the sure $100?

Quote:AyecarumbaI disagree that the probability of second envelope containing more than the opened one changed when the first envelope was opened.

Do you agree it changes in the other case, when you know that only one envelope contains money? Before you open an envelope, the probability of finding $100 in either of them is 50/50, right? Once one envelope is opened, the probability changes to 0/100.

This demonstrates that the event of opening an envelope changes the probabilities. The only difference is that in one case you know the new values, and in the other case they become unknown. That does not mean they have not changed. In some cases they will remain 50/50 (as in the example I showed before with a uniform distribution, when the amount you found is below the middle of the interval), and in others they won't (when the amount is above the middle). They may change or they may not, depending on the amount you found and on the distribution; since the latter is not known, it is impossible to tell what the new probabilities are, and the 50/50 guess is no better than any other number.

Quote:WizardWhat bothers me about this problem too is that I have not had that "a-ha" moment either. At the core of this I think it is an abuse of the expected value formula to come up with an EV of the other envelope of $125. Still, I can't point to a specific reason why. It has something to do with the 50% and 100% being applied to two different amounts, not the same one.

This mathematical enigma is beautiful yet annoying. I have to agree with the Wizard here despite the popular evidence that suggests a 50/50 probability that creates a higher expected value. I think the EV formula is being abused.

1.) Once you choose the envelope, you have a finite event. Great you have $100. You can choose to lock this in or walk away.

2.) The second event is that you are presented with a choice. Choose the other envelope or keep the one you have.

----There's a real cost to this, which I believe the EV formula just doesn't factor in. Suppose the 2nd envelope was really an IOU in which it says you either get an extra $100 or you are required to pay $50. It's roughly the same logic and same final payoff for both scenarios, yet there is a negative expected value to this equation if you apply the math. EV = .5*100 + .5*(-50) = $25

3.) So far, most widely debated answers seem to suggest that once a finite event 1 occurs, you still have a 50/50 chance for event 2 to occur. I disagree with this interpretation, because these are not real random events. This to me is like abusing order of operation or independent vs. dependent variables.

Consider the following scenario:

A.) Suppose you have 3 envelopes with $50, $100, and $200. You choose one envelope and, lo and behold, it contains $100. Now you are presented with a second event: I take away your envelope and allow you to pick between the two left over. With two envelopes left, I can get $50 or $200. That is, for the second event you have a true path-dependent variable with an EV of 50% of $50 plus 50% of $200 = $125. It should make some sense that the EV changes. Prior to the first event, the EV is 33% of $50, 33% of $100, and 33% of $200 = $116.67 (before the first choice). Once you take away the envelope, you have a true 50/50 probability. But if you have a choice to keep the original envelope or exchange it, then your EV has not truly changed, in my opinion. Unfortunately, even this is hotly contested.
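Scenario A can be simulated directly (a sketch, conditioning on having drawn the $100 envelope first, as in the setup above): the re-pick between the two remaining envelopes averages $125.

```python
import random

def three_envelope_switch(trials=200_000):
    """Condition on first drawing the $100 envelope from {50, 100, 200},
    then pick one of the two remaining envelopes at random."""
    total = 0
    for _ in range(trials):
        remaining = [50, 200]        # the $100 envelope was drawn first
        total += random.choice(remaining)
    return total / trials

print(three_envelope_switch())  # ~125, i.e. (50 + 200) / 2
```

Note the simulation only establishes the $125 figure for the re-pick itself; whether that number also applies to a voluntary exchange is exactly the point being contested.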

Perhaps this is all semantics as one other member described it.

Quote:Asswhoopermcdaddy...Once you take away the envelope, then you have a true 50/50 probability. But if you have a choice to keep the original envelope or exchange it, then your EV has not truly changed in my opinion. Unfortunately, even this is hotly contested.

I don't think I'm following the part quoted. Are you saying that if I have the choice of keeping my $100 envelope or choosing one of the other two, that my EV of choosing is not $125?

If there is in fact a 50/50 chance that switching will either double or halve his money, then it can be replaced by a coin. Taking out the aspect that the person received the money for free, you get the following example.

A person has $100 and is approached by a stranger. The stranger asks the man if he is willing to wager half the money he has on a coin flip: if the man calls correctly, he doubles his money; if he is incorrect, he halves it. There is no difference whether he chooses heads or tails. If the stranger flips the coin, covers it with his palm, and asks the man heads or tails, the man could switch his call 1000 times between heads and tails, and the probability wouldn't change.

I believe the error comes in with the percentages. In the equation y = 0.5*(x/2) + 0.5*(2x), wouldn't the plus sign indicate a logical OR, while the wording indicates a logical AND? I believe the correct equation would be y = 0.5*(x/2 * 2x), which would indicate a 50% chance of halving AND a 50% chance of doubling.

Also I would like to note that by opening the envelope, no new information is added. You know you have an amount of money, however you still have no clue if you chose the higher or lower amount.

In the Monty Hall problem, the host removes one of the losers. You now have a 50/50 chance of getting it right, and taking into consideration that you had a greater chance of getting it wrong before, you switch.

Quote:WizardWhat bothers me about this problem too is that I have not had that "a-ha" moment either.

See what happens when all you mathy types try to explain a simple concept with equations? I am truly boggled by all the (metaphorically speaking) running around in confusion here--several dozen posts that seem to me to be getting more and more bewildered. Since this is actually a "DUH!" problem, where is mathematics failing here?

Perhaps it's because you can't express the following sentence as an equation:

No matter what envelope our hero eventually does open, that envelope will STILL be randomly chosen, no matter what extent or manner of paroxysms lead up to that event--therefore there will be no expected gain (or loss) by switching.

If I were indeed mathy (rather than wordy), I might try something that sums all the possible gains and losses from the different strategies, which might wind up being the sum of an infinite series, or something with that bizarre symbol that looks like a Z, or something involving taking the derivative of something else, or maybe phlogiston, or...

So much simpler to just default to common sense, which says that switching CANNOT increase or decrease the expected value, and the relative differential between the amounts in the two envelopes is just a red herring.
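For what it's worth, that common-sense claim can be checked without any equations at all. In this sketch (the uniform prior on the smaller amount is an arbitrary assumption of mine, since the original problem specifies no prior), blindly keeping and blindly switching average out the same:

```python
import random

random.seed(3)
n = 100_000
keep_total = switch_total = 0.0
for _ in range(n):
    small = random.uniform(1, 100)   # assumed prior for the smaller amount
    pair = [small, 2 * small]
    random.shuffle(pair)             # the player's pick is random
    keep_total += pair[0]            # strategy 1: keep the first envelope
    switch_total += pair[1]          # strategy 2: always switch
print(keep_total / n, switch_total / n)  # both near 75.75 = 1.5 * E[small]
```

Both strategies converge on 1.5 times the mean of the smaller amount; switching a randomly chosen envelope for the other randomly chosen envelope gains nothing.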

Quote:mkl654321

So much simpler to just default to common sense, which says that switching CANNOT increase or decrease the expected value, and the relative differential between the amounts in the two envelopes is just a red herring.

The question at hand is not whether you should switch, but what is the flaw in the expected-value argument that says the value of the other envelope is 0.5*(50+200) = 125? It is a worthy question for debate, and a famous paradox in mathematics.

Quote:TriplellI believe the correct equation would be y = 0.5*(x/2 * 2x)

Well, that would suggest that y (the expected value) is equal to half of x squared. If x is in dollars, is y in dollars squared? I think you need to reconsider your math; there are some errors in it.

I'm starting to enjoy this thread, now that I have begun to believe that I understand the problem. I would like to offer two lines of discussion:

#1. For those people who firmly believe that the fact the mystery envelope must contain either $50 or $200 means it has an expected value of $125, I offer you this opportunity. I have an envelope that I absolutely guarantee you contains either exactly $50 or exactly $200. What do you think is the expected value of the contents of the envelope? What would you offer for the contents of the envelope? Would you offer me $100? If you bargain real hard, I might be willing to part with it for $80, or even $75. How about it?

No, I don't think this forum is likely to have many suckers who would fall for that. I presented that a bit too blatantly, and most people who started off thinking they believed the $125 figure would have retrenched. After they rethink it, I expect the folks who believe in the $125 figure for the original problem would probably claim that it has something to do with the 50% chance of selecting on the first try the envelope with $100 vs. the mystery envelope. For those folk, I offer a different line of thinking.

#2. Suppose this game is going to be played numerous times, similar to the way Monte Hall played his game. Let's assume the game is offered on TV by a host named Bob. Bob cannot always offer the same two amounts in the envelopes, or people would get wise and know whether to swap envelopes or not. So Bob needs some plan for deciding how much to put in the envelopes for the next game. Let's assume he has a rule that he always follows but which he doesn't reveal to the audience.

You get to be a participant in Bob's game. You choose an envelope, open it, and find $XX. This could be $100 as in the original problem, but just on a whim, I'm going to suggest that your envelope contains $120. You realize that your mystery envelope contains either $60 or $240. You calculate the expected value of this mystery envelope as 0.5*($60+$240)=$150. (This is, of course, 1.25 times the amount you found in the envelope you opened.) Do you switch envelopes? Do you feel that it is obvious that, on average, the mystery envelope must be worth more than the $120 in your first envelope?

Before Bob asks, "Is that your final answer?", perhaps you should consider this. Suppose Bob's secret rule for deciding how much money to place in the envelopes is this: "The lesser-valued envelope should contain a random amount of money between $40 and $79, and the other envelope should contain twice as much." In that case, if we actually knew about this secret rule, we would always swap if our first envelope contained $79 or less and never swap if it contained $80 or more.
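Bob's hypothetical rule is easy to simulate. In this sketch (whole-dollar amounts and the `bobs_game` name are my own assumptions; the post only says "between $40 and $79"), blindly keeping and blindly switching average the same, while the $79-threshold strategy does strictly better:

```python
import random

random.seed(4)

def bobs_game():
    # Bob's secret rule: the lesser envelope holds a whole-dollar amount
    # from $40 to $79; the other envelope holds twice as much.
    low = random.randint(40, 79)
    pair = [low, 2 * low]
    random.shuffle(pair)
    return pair

n = 100_000
keep = always_switch = threshold = 0
for _ in range(n):
    first, other = bobs_game()
    keep += first
    always_switch += other
    threshold += other if first <= 79 else first  # swap only on $79 or less
print(keep / n, always_switch / n, threshold / n)
```

Keeping and always switching both average about $89.25 (1.5 times the mean of the lesser amount), while the threshold strategy averages about $119, because against this rule it always ends up holding the larger envelope.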

Now we don't know whether that is really Bob's rule, but it could well be. This rule, or most any such rule, establishes conditions where it is obvious that 0.5*(2X)+0.5*(X/2) is not a good method for estimating the value of the mystery envelope. It shouldn't take too much thought to realize that we don't have any basis for the assumption that we multiply both 2X and X/2 by 0.5 -- yes, there are only two possibilities for what might be in the mystery envelope (just as there are only two possibilities for the contents of the mystery envelope in the offer I presented in #1 above), but there is no reason to think they are equally likely. (Are you sure you don't want to offer $80 for my envelope?)

The truth is, we have no basis for knowing whether the factors should be 0.5 and 0.5, 0.9 and 0.1, 1.0 and 0.0, or anything else. If we don't know Bob's rule, we have no basis for deciding whether to swap or not. That is why I earlier said, "Take an envelope and run. You don't even have to open it until later."

How many people, if any, found #1 and #2 to offer a convincing presentation? I'm trying to work on my atrophied instructional skills.

Quote:DocI'm starting to enjoy this thread, now that I have begun to believe that I understand the problem. I would like to offer two lines of discussion:

This problem is not related to the Monty Hall problem. Instead, it is most closely related to "Deal or No Deal," where the player is given the option of switching when only the last two suitcases remain. He knows there are two values left, so should he switch? Also, in "Deal or No Deal" the banker offers less than the expected value until the last two suitcases, then he always offers *more* than the expected value for the final offer. What's with that???

--Dorothy

Quote:DorothyGaleInstead, it is most closely related to the "Deal or No Deal" where the player is given the option of switching on the very last two suitcases. He knows there are two values left, so should he switch?

But there he also knows which values are left, and the probabilities of finding each in each of the suitcases (50%), if I am not mistaken about the rules of that game. So, mathematically, the decision is a "no-brainer": he knows the expected value, and should switch if it is more than what he already has.

At this point, psychology takes over from math. If I have $1,000, I will probably give it up for a 50% shot at a million, but if it is, say, $300,000 ... I know I should still go for it mathematically, but I probably would not anyway.

Quote:WizardI don't think I'm following the part quoted. Are you saying that if I have the choice of keeping my $100 envelope or choosing one of the other two, that my EV of choosing is not $125?

I think Doc might have better explained my line of reasoning. Without knowing the true basis behind what's in those other envelopes, a 50/50 probability may represent two paths, but not the true EV. In which case, wouldn't the true EV revert to $116.67, the straight average of $50, $100, and $200? The fact that you are allowed to keep a known value can be inferred as an alternative probability.

What is the probability of your choosing the $100 or the second envelope, which contains either $50 or $200? Would it be subsequently fair, under the same logic, to say EV = 0.5*100 + 0.5*(0.5*50 + 0.5*200) = 112.5? The math seems to work out, but we'd be confusing the definition of expected value with the most probable value. Paths may be binary, but the weights are not.

Gotta love the Monty Hall problem. There is a variation on Wikipedia that may be helpful:

http://en.wikipedia.org/wiki/Monty_Hall_problem

I'd like to get past the 50% ambiguity. The Wizard is the only one who can make this decision.

Wizard: Can we all assume from this point on that the original problem assumes a probability of 50% for the contents of the second envelope?

As near as I can tell, the problem remains the same, and the Wizard's paradox is still intact.

Quote:scotty81I'd like to get past the 50% ambiguity. The Wizard is the only one who can make this decision.

Wizard: Can we all assume from this point on that the original problem assumes a probability of 50% for the contents of the second envelope?

As near as I can tell, the problem remains the same, and the Wizard's paradox is still intact.

If you know the probability to be 50%, then the correct strategy is to switch, and there is no paradox. The paradox is rooted in assuming it is 50% when in fact it isn't.

Quote:scotty81I'd like to get past the 50% ambiguity. The Wizard is the only one who can make this decision.

Wizard: Can we all assume from this point on that the original problem assumes a probability of 50% for the contents of the second envelope?

As near as I can tell, the problem remains the same, and the Wizard's paradox is still intact.

I'm not sure I understand what you're asking. Personally, I don't think the key to this paradox is in analyzing whether the other envelope has a 0%, 50%, 100%, or x% chance of having the smaller/larger amount. Rather, I think the flaw is in looking at the ratio of the two envelopes rather than the difference. I've been thinking all morning about how to put it in words; I still have not reached the "a-ha" moment, but I think I'm getting closer.