scotty81
Joined: Feb 4, 2010
  • Threads: 8
  • Posts: 185
August 14th, 2010 at 11:05:19 AM permalink
A lot of the discussion has focused on the possibility that the other envelope does not have an equal chance of containing the smaller vs. the larger amount. I just want to clarify that the spirit of the problem assumes an equal chance for the smaller or larger amount, as opposed to, say, a 90% chance for the smaller amount and a 10% chance for the larger amount.
Prediction is very difficult, especially about the future. - Niels Bohr
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
August 14th, 2010 at 11:16:33 AM permalink
Quote: Wizard

Personally, I don't think the key to this paradox is in analyzing whether the other envelope has a 0%, 50%, 100%, x% chance of having the smaller/larger amount. Rather, I think the flaw is in looking at the ratio of the two envelopes, rather than the difference.


What do you mean? If I have X, and the other envelope has 2X, the difference is +X. If the other envelope has X/2, the difference is -X/2.
If the distribution is known, and the probability of finding 2X is p, then the expectation of the outcome of the swap is p*X - (1-p)*X/2 or pX*3/2 - X/2 or (3p-1)*X/2, so as long as p>1/3, the expectation is positive, and one should switch the envelopes.

No ratios anywhere, just the differences :)
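A minimal sketch of that calculation (plain Python; the probability p of the other envelope holding 2X is assumed known here, and the function name is mine, purely for illustration):

    # Expected gain from swapping when you hold X and the other envelope
    # holds 2X with probability p (and X/2 with probability 1 - p).
    def expected_swap_gain(x, p):
        return p * x + (1 - p) * (-x / 2)   # simplifies to (3p - 1) * x / 2

    for p in (0.2, 1/3, 0.5, 0.9):
        print(p, expected_swap_gain(100, p))
    # Negative below p = 1/3, zero at p = 1/3, positive above it.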
"When two people always agree one of them is unnecessary"
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
August 14th, 2010 at 11:21:50 AM permalink
Quote: scotty81

I just want to clarify that the spirit of the problem assumes that there is an equal chance for the smaller or larger amount, as opposed to the possibility that there may be a 90% chance for the smaller amount and a 10% chance for the larger amount.


Right, but that's exactly where the logical flaw leading to the paradox is hidden. It is not possible to create a distribution satisfying that property for every amount in the original envelope, because you cannot pick a random value from an infinite range such that all values are equally likely to appear.
Try picking any rule you like for distributing the money between the envelopes, and you won't be able to satisfy the requirement that, for any amount found in the first envelope, finding double or half that amount in the other one is equally likely. For any given rule, there will be situations when it is better to switch, and others when it is not.
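A small simulation of that claim, under one concrete (and entirely made-up) rule for filling the envelopes: the smaller amount is 2**k dollars with k drawn uniformly from 0..9. For amounts in the middle of the range, the chance that what you see is the smaller amount comes out near 50%, but at the edges it is forced to 0% or 100%, so the "always 50/50" requirement fails:

    import random
    from collections import defaultdict

    random.seed(1)
    counts = defaultdict(lambda: [0, 0])   # amount -> [times it was the smaller, times seen]
    for _ in range(200_000):
        small = 2 ** random.randrange(10)            # assumed rule, for illustration only
        observed = random.choice([small, 2 * small])  # you open one envelope at random
        counts[observed][0] += (observed == small)
        counts[observed][1] += 1

    for amount in sorted(counts):
        was_small, seen = counts[amount]
        print(amount, round(was_small / seen, 3))
    # $1 is always the smaller amount (switch!), $1024 never is (keep!).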
"When two people always agree one of them is unnecessary"
mkl654321
Joined: Aug 8, 2010
  • Threads: 65
  • Posts: 3412
August 14th, 2010 at 1:07:38 PM permalink
Quote: DorothyGale

This problem is not related to the "Monty Hall" problem. Instead, it is most closely related to "Deal or No Deal", where the player is given the option of switching on the very last two suitcases. He knows there are two values left, so should he switch? Also, in "Deal or No Deal" the banker offers less than the expected value until the last two suitcases, then he always offers *more* than the expected value for the final offer. What's with that???

--Dorothy



The banker has two often conflicting interests: to award the player as little as possible, and to extend the game as long as possible. The earlier offers (often horrible in terms of EV) are calculated to fulfill the latter objective (since the banker cannot directly influence the former--if bad offers induce the player to continue rather than accepting an offer, then the player is acting properly, and the banker thereby loses EV; thus, in the early stages, drawing out the game must be more valuable). But when there are two cases left, the final segment of the game will take a little longer if the player refuses the offer, since Howie will then introduce the silliness of offering the player the choice of switching cases. Thus, in a situation where there are multiple games played on one show, the banker might drag it out (by offering a -EV amount); conversely, in a single-player game (like the multi-$1,000,000 games played a couple of seasons ago), the banker may wish to end things (which would allow a bit more time for blather, commercials, etc.).

I have also gotten the impression that the slightly +EV final offer was crafted as a "reward" for the player having gutted it out this far. I've noticed that when there is a large disparity between the final two amounts, the offer tends to be more neutral EV. It also seems to matter what happened immediately before--if the last three cases were BIG, BIG, SMALL, and the contestant just opened one of the big cases (thereby losing his "safety net"), the final offer won't be as good--applied psychology. This happens earlier in the game, too: when a player opens a large-amount suitcase, not only is the next offer lousy, it is lousier than it would have been otherwise.
The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one. The happiness of credulity is a cheap and dangerous quality.---George Bernard Shaw
Doc
Joined: Feb 27, 2010
  • Threads: 45
  • Posts: 7086
August 14th, 2010 at 1:54:32 PM permalink
This morning I attempted to present a persuasive argument for my view, outlining cases #1 and #2. From the ensuing comments, it appears that I was stupendously ineffective in convincing anyone. Rather than giving up, I will follow the lead of Don Quixote and sally forth again with #3. This time I will even use the 50%/50% figures that seem to be popular and try to use them in the appropriate manner for calculating expected values. Here goes:

#3. The two envelopes contain amounts X and 2X. In your initial selection of an envelope, assuming you don't have any inside information about the envelope contents, you have a 50% chance of being "Lucky" and finding 2X in your envelope and a 50% chance of being "Unlucky" and only finding X. Of course, you will not know whether it is X or 2X. Let's consider these lucky and unlucky scenarios and then (not knowing whether we were lucky or unlucky) calculate the expected value of swapping or not swapping envelopes.

"Unlucky 1st Guess" "Lucky 1st Guess"
Probability 50% 50%
Initial value X 2X
Net effect of swapping +X -X
Net effect of not swapping +0 -0


Expected value of swapping = 50%*(+X) + 50%*(-X) = 0

Expected value of not swapping = 50%*(+0) + 50%*(-0) = 0

Here I believe I have properly calculated expected values using the 50% factors as they should be used. The calculations show that for any value of X (and 2X), players of the game cannot reasonably expect to gain or lose on average by either swapping or not swapping, although if you DO swap, you will either gain or lose X.
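A quick simulation of case #3 (the only assumptions are the 50/50 first pick and an arbitrary X = 100 used for illustration):

    import random

    random.seed(0)
    X, trials = 100, 1_000_000
    gain_swap = 0
    for _ in range(trials):
        first = random.choice([X, 2 * X])   # 50% "unlucky", 50% "lucky" first pick
        other = 2 * X if first == X else X
        gain_swap += other - first          # +X or -X; not swapping always adds 0
    print(gain_swap / trials)               # hovers around 0, matching the table above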

Now, does that make any sense to anyone? Is this case #3 the least bit persuasive?
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
August 14th, 2010 at 2:02:51 PM permalink
Quote: Doc



Here I believe I have properly calculated expected values using the 50% factors as they should be used. The calculations show that for any value of X (and 2X), players of the game cannot reasonably expect to gain or lose on average by either swapping or not swapping, although if you DO swap, you will either gain or lose X.



This is the (yes, properly calculated) expected value BEFORE the envelope is opened. Once you know the amount in one of the envelopes, the expectations of swapping and not swapping change, but to find out the new values, we need to know the new probabilities.

I personally think your previous explanation was perfect (and way better than this one :)). #2 was right to the point.
"When two people always agree one of them is unnecessary"
mkl654321
Joined: Aug 8, 2010
  • Threads: 65
  • Posts: 3412
August 14th, 2010 at 2:41:36 PM permalink
Quote: Doc

Now, does that make any sense to anyone? Is this case #3 the least bit persuasive?



Good explanation---though I remain highly amused at the near-total inadequacy of mathematics to explain this pseudo-conundrum in coherent fashion. I say "pseudo-" because it is really a silly question, when you think about it--the envelope our player winds up with still contains one of two random amounts, one of which is twice the other. The fact that the player opens the first envelope he chooses actually has NOTHING WHATSOEVER to do with the random nature of the final envelope he chooses (whether he retains the one he opens, or switches). In other words, choosing Envelope A, looking, then switching to Envelope B, is EXACTLY EQUIVALENT to simply choosing Envelope B and taking whatever happens to be in there, just as choosing Envelope A, looking, and NOT switching is exactly equivalent to choosing Envelope A and never being offered, or never contemplating, switching.

At the risk of belaboring the mathy angle, I have a vague itchy feeling that the "conundrum", which arises from the illusion that your potential gain from switching exceeds your potential loss from doing so, is related to the mathy fraud of the reports from the Federal Department of Misleading Statistics, which said recently that the stock market had fallen 50% in 2008, but it regained 50% in 2009, so everybody's okay now.
The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one. The happiness of credulity is a cheap and dangerous quality.---George Bernard Shaw
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1383
  • Posts: 23117
August 14th, 2010 at 3:14:02 PM permalink
I like Doc's way of looking at it. However, speaking as a former math tutor, it is always easier to show the right way to solve a problem than to try to find the flaw in an incorrect solution. The harder question at hand is why doesn't the expected value formula work?

I'm still trying to find the best way to put this in words. The host says that one envelope has twice the other. Let's call the envelopes L and H, for lower and higher. We know that L=0.5*H and H=2*L. Note how the 0.5 and 2 factors are applied to different envelopes.

So let's say you pick an envelope and it has $100. You can't say the other one has $50 or $200, because you're applying the 0.5 and 2 factors to the same amount. That is just incorrect; they should be applied to different envelopes.

Here is a similar error. Suppose your share of MGM stock is worth $1. On Monday it goes up 100% and on Tuesday it goes down 50%. The value after both days is not $1 (original value) + $1 (gain on Monday) - $0.50 (loss on Tuesday) = $1.50. The correct value is $1*2*0.5 = $1. You have to apply multipliers off the correct amounts.
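A two-line check of that arithmetic, just to make the multiplier point concrete:

    value = 1.00
    wrong = value + value * 1.00 - value * 0.50   # both factors applied to the original $1
    right = value * 2 * 0.5                       # up 100%, then down 50% of the new value
    print(wrong, right)                           # 1.5 vs 1.0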

A-ha?
It's not whether you win or lose; it's whether or not you had a good bet.
Doc
Joined: Feb 27, 2010
  • Threads: 45
  • Posts: 7086
August 14th, 2010 at 3:51:46 PM permalink
Quote: Wizard

The harder question at hand is why doesn't the expected value formula work?

My contention is that the expected value formula works -- as long as you use it properly. The way it is used in the initial problem statement (post #1 of this thread) is trickery. It comes out nice and smooth, but it is incorrect and leads people to incorrect analysis from then on. In that initial writeup, it is presented this way:

Quote:

There should be a 50% chance that the other envelope contains either 2 * $100 = $200 (2x) or a 50% chance that the other envelope contains (1/2) * $100 = $50 (x/2). In such a case, the value of the envelope is:

$125 = 0.5*($100/2) + 0.5*(2*$100)


Well, the statement that "There should be a 50% chance that..." is just plain wrong. There is no reason at all to believe that. There are two possibilities, but nothing to indicate that they are equally likely. But it is presented so smoothly that a con man would be impressed, and it moves quickly to an equation that looks familiar to everyone who has ever calculated an expected value. But it's a sham.

Compare this to, "...Now each of the three men has paid $9 for a total of $27 and the bellhop has $2 so there is a grand total of $29..." That's another smooth sham, but it is not a true representation of that problem.

Both problems are worded to mislead people for the purpose of luring them into what appears to be a paradox. In the two-envelope problem, the error is that the 50% figures are not appropriate for the place they are used. They can indeed be used in the manner I described in version #3.
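A small numeric illustration of that point (the 90/10 prior below is an assumption, chosen only to echo scotty81's example): once some rule for stuffing the envelopes is fixed, the probabilities that belong in the formula are conditional on the $100 you actually saw, and they need not be 50/50.

    # Assumed prior, for illustration only: the pair is ($50, $100) with
    # probability 0.9 and ($100, $200) with probability 0.1.
    priors = {(50, 100): 0.9, (100, 200): 0.1}
    observed = 100

    # Weight of each pair given that a randomly chosen envelope showed $100.
    weights = {pair: p * 0.5 for pair, p in priors.items() if observed in pair}
    total = sum(weights.values())
    ev_other = sum(w / total * (sum(pair) - observed) for pair, w in weights.items())

    naive = 0.5 * (observed / 2) + 0.5 * (2 * observed)
    print(ev_other, naive)   # about 65 vs the "smooth" 125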
Garnabby
Joined: Aug 14, 2010
  • Threads: 4
  • Posts: 197
August 14th, 2010 at 4:09:56 PM permalink
Quote: mkl654321

... the near-total inadequacy of mathematics to explain this pseudo-conundrum in coherent fashion.




Because, as many of the great thinkers of our time have come to conclude, mathematics inherently mimics physics... sometimes we are left with "extraneous" solutions to describe what the physics has yet to reveal to us.

The answers to the (unresolvable) true and pseudo-paradoxes, in this case perhaps of the operations of simple addition and multiplication with respect to probability theory, lie much deeper, where these and other fields of study intertwine. (I mean, e.g., beginning with questions like, "Is probability more about addition than multiplication; and if not, what are addition and multiplication (on the physical level) in order for those to come together at all?")

This thread reminds me of the observation that a gambler who always bets half his stake will always lose over any series of bets in which the wins and losses even out in number, whatever their order.
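A quick check of that observation (even-money bets assumed, purely for illustration): each win multiplies the stake by 1.5 and each loss by 0.5, so any equal count of wins and losses shrinks it, regardless of the order.

    stake = 100.0
    for outcome in [1.5, 0.5, 0.5, 1.5, 1.5, 0.5]:   # 3 wins, 3 losses in some order
        stake *= outcome
    print(stake)   # 100 * (1.5 * 0.5)**3 = 42.1875, whatever the order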
Why bet at all, if you can be sure? Anyway, what constitutes a "good bet"? - The best slots-game in town; a sucker's edge; or some gray-area blackjack-stunts? (P.S. God doesn't even have to exist to be God.)
