Kelmo
Kelmo
Joined: Aug 15, 2010
  • Threads: 6
  • Posts: 85
August 19th, 2010 at 6:40:00 PM permalink
Try this as a computer simulation:

Generate a random outcome between 0 and 1 to represent a value placed in one of the envelopes (x).
Based on that random outcome, create another value that is twice that amount to represent the other envelope (2x).
Generate a second random outcome between 0 and 1 representing the initial choice of the player, where if that outcome is >0.5 the first envelope is chosen and if it is <0.5, the second envelope is chosen.
Subtract the unchosen envelope from the chosen one for the change in wealth.

Loop that a substantial number of times, counting each decision, and divide the accumulated change in wealth by the number of decisions.

Should trend to 0.

Seems to me that this would simply be 0.5(x-2x)+0.5(2x-x) = 0.
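
For anyone who wants to run it, here is a quick Python sketch of that loop (Python itself and the trial count are my own choices, not part of the description above):

import random

trials = 1_000_000
total_change = 0.0

for _ in range(trials):
    x = random.random()          # value placed in one envelope
    envelopes = (x, 2 * x)       # the other envelope holds twice as much

    # second random outcome decides which envelope the player takes
    if random.random() > 0.5:
        chosen, unchosen = envelopes
    else:
        unchosen, chosen = envelopes

    # chosen minus unchosen = change in wealth for that decision
    total_change += chosen - unchosen

print(total_change / trials)     # trends toward 0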
jfalk
jfalk
Joined: Sep 2, 2010
  • Threads: 2
  • Posts: 29
September 2nd, 2010 at 11:06:07 AM permalink
This problem has indeed stumped a lot of people, but the answer is fairly simple. The reason you can't do an expected value calculation here is that there is no proper (i.e., integrates to 1) distribution with the feature that the probability of 2x equals the probability of x, except for the somewhat silly distribution that is zero with probability 1. Since the expected value is the sum (or integral) of r times prob(r), it is only defined when the distribution is defined.

A simple argument shows why this is true. The distribution can't be bounded, since anything near the top of the distribution has some probability, while 2 times that amount lies above the bound and so must have probability zero. But there is no unbounded distribution with constant positive density, because it wouldn't sum to 1. Of course, the constant density with prob(0)=1 is an exception, but it is meaningless in this context. Hope this helps.

Note, by the way, that if the distribution is bounded, the math works out fine, since you now have values in the envelope for which the probability of 2 times that value is zero (any value greater than half the upper limit of the distribution). I leave it as an exercise for the reader to show that in this case switching envelopes is a wash.
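
For the simulation-minded, here is a quick numerical check of that point (my own sketch in Python; the uniform-on-[0,1) prior for the smaller amount is just one example of a bounded distribution): always switching and never switching converge to the same average.

import random

trials = 1_000_000
keep_total = 0.0
switch_total = 0.0

for _ in range(trials):
    x = random.random()                 # smaller amount, bounded above by 1
    pair = [x, 2 * x]
    random.shuffle(pair)                # player grabs one envelope at random
    chosen, other = pair

    keep_total += chosen                # strategy 1: never switch
    switch_total += other               # strategy 2: always switch

print(keep_total / trials, switch_total / trials)   # both approach 1.5 * E[x] = 0.75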
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1248
  • Posts: 20591
September 2nd, 2010 at 11:45:13 AM permalink
Quote: jfalk

...The reason you can't do an expected value calculation here is that there is no proper (ie integrates to 1) distribution with the feature that the probability of 2x is equal to the probability of x ...



I've read that argument before. You've stated it well, and I agree that is a flaw in the EV=1.25x argument. However, is an understanding of integral calculus necessary to see the light? My gut tells me that there should be a way to debunk the EV argument with just simple algebra.

p.s. Welcome to the forum, I hope you'll stick around (I know this guy, he is very smart).
It's not whether you win or lose; it's whether or not you had a good bet.
mkl654321
mkl654321
Joined: Aug 8, 2010
  • Threads: 65
  • Posts: 3412
September 2nd, 2010 at 11:57:51 AM permalink
Quote: Wizard

I've read that argument before. You've stated it well, and I agree that is a flaw in the EV=1.25x argument. However, is an understanding of integral calculus necessary to see the light? My gut tells me that there should be a way to debunk the EV argument with just simple algebra.

p.s. Welcome to the forum, I hope you'll stick around (I know this guy, he is very smart).



You can also debunk it with third grade arithmetic, or simple common sense, in that if it were possible to add EV by switching, it would also be possible to increase the value of each envelope infinitely by infinitely switching. I would imagine that this intuitive conclusion is mirrored by the GIGO effect that happens when you plug "zero" into one of those complex calculations.

I do have an interesting question (which I am posting on a new thread) that is probably best solved by complex analysis, because the obvious answer seems counterintuitive.
The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one. The happiness of credulity is a cheap and dangerous quality.---George Bernard Shaw
jfalk
jfalk
Joined: Sep 2, 2010
  • Threads: 2
  • Posts: 29
September 2nd, 2010 at 12:07:51 PM permalink
Thanks for the compliment, Wiz. I think the answer to your question is: "No." Although if you want to be technical about it, you don't actually need integral calculus, since the same logic applies to a simple Riemann-style sum, e.g., only allowing amounts to the penny, with rounding up to the nearest penny (which only trivially affects the EV calculation).

I was going to mention this before, but the real comparison here is not with Monty Hall, but with the St. Petersburg paradox ( http://en.wikipedia.org/wiki/St._Petersburg_paradox ), which only yields an infinite answer (abstracting from the marginal utility of money) when you allow Ponzi schemes to exist. In that case, the distribution is proper, but the distribution of rewards is "improper," or at least implausible, since there's no mathematical limit on returns.

As a colleague of mine put it, the EV proof is equivalent to a proof in which you divide by zero in a subtle way somewhere along the way. If you're allowed to divide by zero, of course, you can prove any mathematical theorem you want, like 1=2. The only thing this EV calculation requires is that there be a distribution for which the density (or probability) of x equals the density (or probability) of 2x everywhere in positive space, since that's the only thing you've specified about the problem. There is no such thing, so when you sum the probabilities times the returns, you've (in essence) divided by zero. You can't fix x before you make the EV calculation, since x represents a point along the distribution: you have to integrate across all x to solve the problem. If there were a distribution for which density(x) = density(2x) for all x, then that step would be unneeded, so you could leave it out... and that's the "divide by zero" part.
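
To put a number on the "you can't fix x first" point, here is a little Python sketch (the exponential prior on the smaller amount is purely my own choice for illustration): once you commit to any proper distribution, the chance that the other envelope holds double depends on the amount you are looking at, and it is not 1/2 across the board.

import random
from collections import defaultdict

trials = 2_000_000
hits = defaultdict(int)     # times the other envelope held double, per bin
counts = defaultdict(int)   # observations per bin

for _ in range(trials):
    x = random.expovariate(1.0)        # smaller amount, exponential with mean 1
    if random.random() < 0.5:
        observed, other = x, 2 * x     # we happened to grab the smaller envelope
    else:
        observed, other = 2 * x, x

    b = min(int(observed), 4)          # bins: [0,1), [1,2), [2,3), [3,4), 4+
    counts[b] += 1
    hits[b] += (other == 2 * observed)

for b in range(5):
    label = f"{b}-{b+1}" if b < 4 else "4+"
    # well above 1/2 for small observed amounts, falling toward 0 for large ones
    print(label, hits[b] / counts[b])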
DorothyGale
DorothyGale
Joined: Nov 23, 2009
  • Threads: 40
  • Posts: 639
September 2nd, 2010 at 12:40:39 PM permalink
Quote: mkl654321

Yes, it does. It means, "X is coming! RUN!!!"


STOP!!!...

--Dorothy
"Who would have thought a good little girl like you could destroy my beautiful wickedness!"
Wizard
Administrator
Wizard
Joined: Oct 14, 2009
  • Threads: 1248
  • Posts: 20591
September 2nd, 2010 at 12:54:27 PM permalink
Quote: mkl654321

You can also debunk it with third grade arithmetic, or simple common sense, in that if it were possible to add EV by switching, it would also be possible to increase the value of each envelope infinitely by infinitely switching. I would imagine that this intuitive conclusion is mirrored by the GIGO effect that happens when you plug "zero" into one of those complex calculations.

I do have an interesting question (which I am posting on a new thread) that is probably best solved by complex analysis, because the obvious answer seems counterintuitive.



Yes, that makes perfect sense, but it is the easy way out. The question is: where is the flaw in the EV = 0.5*(2x + 0.5x) = 1.25x argument?

I look forward to sinking my teeth into your complex analysis problem.
It's not whether you win or lose; it's whether or not you had a good bet.
jonesq
jonesq
Joined: Sep 2, 2010
  • Threads: 0
  • Posts: 1
September 2nd, 2010 at 4:07:51 PM permalink
I have never fully understood this problem. Would it change the analysis if you knew that the possible amounts were based on an integer picked uniformly at random from 1 to 100 (bingo style, with no replacement), and the first envelope has $50 in it? Then the other envelope must have $100 or $25 in it with equal probability, meaning you should switch. Right?
Doc
Doc
Joined: Feb 27, 2010
  • Threads: 45
  • Posts: 6901
September 2nd, 2010 at 4:31:08 PM permalink
If you know that is the distribution, then you switch if the first envelope has $50 or less and don't switch if it has $51 or more. But you don't know that is the distribution. You cannot justify the bogus EV argument that you should always switch.
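
Out of curiosity, here is a Python sketch of that rule (the assumptions are mine: I read the 1-to-100 range as applying to the amounts in the envelopes, so the smaller amount is a uniform integer from $1 to $50). The switch-at-$50-or-less rule beats both always keeping and always switching, which each average 1.5 times the smaller amount:

import random

trials = 1_000_000
keep = switch = threshold = 0.0

for _ in range(trials):
    x = random.randint(1, 50)          # smaller amount, so envelopes hold $1-$100
    pair = [x, 2 * x]
    random.shuffle(pair)
    chosen, other = pair

    keep += chosen                                     # never switch
    switch += other                                    # always switch
    threshold += other if chosen <= 50 else chosen     # switch only at $50 or less

print(keep / trials, switch / trials, threshold / trials)
# the first two land near 1.5 * 25.5 = 38.25; the threshold rule lands higher (around 47.75)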
MrPogle
MrPogle
Joined: Nov 11, 2009
  • Threads: 1
  • Posts: 3
September 5th, 2010 at 3:47:17 AM permalink
I think I read the following once but I might have made it up, so apologies if it is nonsense. Try the following strategy when playing the game.

Before you open the envelope, pick a random target value you would like to win. If the envelope contains at least that amount, keep it; if it doesn't, swap for the second one. There are three scenarios.

1) Both envelopes contain more than your target. You never switch, but this does not affect your overall return.

2) Both envelopes contain less than your target. You always switch, but this does not affect your overall return.

3) One envelope has more than your target and one less. In this case you will (without knowing) keep the 2x envelope if you pick it and swap the x envelope if you pick it (winning 2x every time).

Because scenario 3) will occur some of the time, your overall return on the game should be fractionally better than the 3/2 x you would average by never (or always) switching.

Shouldn't it?
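
One way to check is to simulate it. Here is a rough Python sketch (the uniform $1-$100 smaller amount and the uniform $0-$200 target are arbitrary choices of mine, just to have concrete numbers):

import random

trials = 1_000_000
never = always = with_target = 0.0

for _ in range(trials):
    x = random.uniform(1, 100)           # smaller amount
    pair = [x, 2 * x]
    random.shuffle(pair)
    chosen, other = pair

    target = random.uniform(0, 200)      # pick a target before opening
    never += chosen                      # never switch
    always += other                      # always switch
    with_target += chosen if chosen >= target else other   # keep only if it meets the target

print(never / trials, always / trials, with_target / trials)
# never/always both land near 1.5 * 50.5 = 75.75; the target strategy comes out higher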
