Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1383
  • Posts: 23117
August 13th, 2010 at 6:46:51 AM permalink
What bothers me about this problem too is that I have not had that "a-ha" moment either. At the core of it, I think it is an abuse of the expected value formula to come up with an EV of $125 for the other envelope. Still, I can't point to a specific reason why. It has something to do with the 50% and 100% being applied to two different amounts, not the same one. Kind of like how your stock could go up 100% today and down 50% tomorrow, and you would be right back where you started.
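To see the arithmetic being questioned (a quick sketch of the naive calculation, not a resolution): if your envelope holds $100, the standard argument treats the other envelope as $50 or $200 with equal probability, giving an EV of $125, and the stock analogy shows how averaging percentage moves taken on different bases can mislead in the same way.

```python
# Naive two-envelope EV: yours holds $100; assume the other is $50 or $200
# with probability 1/2 each (this 50/50 assumption is the suspect step).
naive_ev_other = 0.5 * 50 + 0.5 * 200
print(naive_ev_other)          # 125.0 -> the suspicious "+25%" edge

# Stock analogy: +100% one day, -50% the next.
price = 100.0
price *= 2.0                   # up 100%
price *= 0.5                   # down 50%
print(price)                   # 100.0 -> right back where you started,
                               # even though the "average" move looks like +25%
```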

By the way, this is problem #6 on my math problems site (http://mathproblems.info/). Still, I'm not entirely happy with my explanation there either, nor any others I've seen.
It's not whether you win or lose; it's whether or not you had a good bet.
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
August 13th, 2010 at 6:51:15 AM permalink
Quote:


Part #1: I do not yet see why "50% that you picked the higher valued envelope to begin with" does not lead to "50% that the other envelope will be the lower valued one." I think it is more likely that the probabilities are being used improperly to calculate an expected value.



It is not 50% that you picked the higher-valued envelope. In general, it is just an unknown value.

It is a well-known fallacy to treat any unknown binary value as 50/50. Like, what is the probability that you see a dinosaur on the street tonight? 50% - either you see it or you do not :)

Quote: Doc



Part #2: The only twist I have come up with is that the amount of money that can be in an envelope is not a continuous function; money comes in discrete units. If you examine one envelope and find that the money in it could not be evenly divided in half (e.g., $247.59), then you must have gotten the smaller amount and should swap. I'm not sure how to follow through on this notion in the event that your envelope contains a nice even amount.



We can agree to round to the nearest cent when we divide. Or better yet, to the nearest two cents, so that you'll never see odd amounts to begin with. :)

The actual twist is that the probabilities of different amounts differ; they cannot all be equal.

For example, suppose you know that the person offering you the envelopes picked a uniformly distributed value between 0 and 1, doubled or halved it with 50/50 probability, and then multiplied the result by a fixed dollar amount that is known to you (say, $1000).
Obviously, you won't switch if you see more than $1000 in your envelope. In fact, in this case, you should not switch as long as your value is $500 or more, and should switch if it is less than $500.
The key is knowing the distribution - then you should always be able to calculate when to switch.
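As a sanity check of that claim, here is a minimal simulation sketch. It assumes a simpler prior than the setup above, purely for illustration: the smaller amount X is uniform on $0-$1000 and the envelopes hold X and 2X. Under that assumption, an observed amount below $1000 is twice as likely to have come from the smaller envelope, so switching below $1000 is profitable and switching above it never is:

```python
import random

# Illustrative sketch only: assume the smaller amount X ~ Uniform(0, 1000)
# and the two envelopes hold X and 2X.  (This is an assumed prior, not the
# exact construction described above.)
def average_winnings(threshold, trials=200_000):
    """Average result when you switch only if the opened amount is below threshold."""
    total = 0.0
    for _ in range(trials):
        x = random.uniform(0, 1000)                  # smaller amount
        mine, other = (x, 2 * x) if random.random() < 0.5 else (2 * x, x)
        total += other if mine < threshold else mine
    return total / trials

for t in (0, 500, 1000, 2000):   # 0 = never switch, 2000 = always switch
    print(t, round(average_winnings(t), 1))
# Roughly: never switch ~ $750, threshold $500 ~ $797, threshold $1000 ~ $938,
# always switch ~ $750.  Knowing the distribution turns "should I switch?"
# into a straightforward calculation.
```

Note that "always switch" and "never switch" come out equal on average, which is itself a hint that the naive 1.25x argument cannot be right.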
"When two people always agree one of them is unnecessary"
Doc
Joined: Feb 27, 2010
  • Threads: 45
  • Posts: 7086
August 13th, 2010 at 9:26:41 AM permalink
Quote: weaselman

It is a well-known fallacy to treat any unknown binary value as 50/50. Like, what is the probability that you see a dinosaur on the street tonight? 50% - either you see it or you do not

I agree that "Something either happens or it doesn't happen" does not mean that it has a 50% chance of happening. But I disagree that this applies to the present problem.

Suppose that you have a very biased coin. It does not have anywhere close to a 50% probability of coming up heads (or tails). You flip the coin and ask me to guess the result. Even not knowing which direction the bias is in, I still have a 50% chance of getting it right. Similarly, I have a 50% chance of initially selecting the envelope with the larger amount of money.


Edit: Perhaps I misinterpreted your post and owe you an apology. On second reading, it occurs to me that your statement "fallacy to treat any unknown binary value as 50% of each" might have been a comment on the improper way that the EV was calculated in the initial post. If that is what you meant, then I agree with you -- at least in concept, although I don't know the proper way to calculate the EV, if there is one.
rdw4potus
Joined: Mar 11, 2010
  • Threads: 80
  • Posts: 7236
August 13th, 2010 at 9:53:46 AM permalink
We used this problem in one of my classes. The point was that the decision doesn't depend on the math so much as it depends on the amount of money in the first envelope.

There is a point where, for each person, the amount in that envelope is "enough." Past that point, it isn't worth the risk of swapping and ending up with half that amount, because there isn't much marginal benefit in the chance to double it. Obviously this is subjective, and it's also situational. I think it's the exception to the Wiz's "thou shalt not hedge thy bets" commandment (life-changing amounts of money).

Say I'm planning a trip to Vegas in October, and airfare is pricing out at $550. We play this game and you give me 1 envelope. I open it and discover $600. I could either keep the $600 and have guaranteed free airfare, or trade it in for a theoretically +EV chance at $1200. Personally, I'd take the $600. That is enough to pay for my immediate and large expense.

We did a similar exercise with the lottery, with a 10%/1000% split for the second envelope. If you won $500,000 in the first envelope, would you trade it in for a chance at $50k/$5MM?
"So as the clock ticked and the day passed, opportunity met preparation, and luck happened." - Maurice Clarett
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
August 13th, 2010 at 10:29:35 AM permalink
Quote: Doc


Suppose that you have a very biased coin. It does not have anywhere close to a 50% probability of coming up heads (or tails). You flip the coin and ask me to guess the result. Even not knowing which direction the bias is in, I still have a 50% chance of getting it right. Similarly, I have a 50% chance of initially selecting the envelope with the larger amount of money.



This is not true. At least, not always true. It will depend on your strategy, and the actual distribution. For example, in the case of a biased coin, if you always say "tails", your chance of being right will be less or more than 50% depending on which way the bias is. If the coin always shows "heads", you will never be right at all.
With the envelopes, it is a little different. Because they are identical, your chance of picking the one with the larger amount is 50% before the envelope is opened. But once one of the amounts becomes known, the probabilities change: it is no longer 50% that your envelope has the larger amount; the actual value depends on the distribution.
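To put a number on "the probabilities change", here is a small illustration under the same assumed prior as the earlier sketch (smaller amount uniform on $0-$1000, envelopes holding X and 2X); any other prior would give different numbers, which is the point:

```python
# Posterior probability that you are holding the smaller envelope, given that
# you opened it and saw the amount v (assumed prior: X ~ Uniform(0, 1000)).
def p_holding_smaller(v):
    f_small = 1 / 1000 if 0 < v < 1000 else 0.0     # density of X at v
    f_large = 1 / 2000 if 0 < v < 2000 else 0.0     # density of 2X at v
    return f_small / (f_small + f_large)            # Bayes' rule, 50/50 pick

print(p_holding_smaller(100))    # 0.666... -> no longer 50% once v is known
print(p_holding_smaller(1500))   # 0.0      -> above $1000 you surely hold 2X
```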
"When two people always agree one of them is unnecessary"
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1383
  • Posts: 23117
August 13th, 2010 at 10:42:19 AM permalink
Quote: weaselman

This is not true. At least, not always true. It will depend on your strategy, and the actual distribution. For example, in the case of a biased coin, if you always say "tails", your chance of being right will be less or more than 50% depending on which way the bias is. If the coin always shows "heads", you will never be right at all.



It depends on how or why Doc called tails. If he chose tails randomly with a 50% chance, then he would be right, and his probability of winning the toss on the biased coin would be 50%. Doc would also be right if the coin were equally likely to be biased toward heads as toward tails.

However, if Doc always chose tails, and the coin were a store-bought biased coin, then I think Doc's probability of winning would be more than 50%. This is because most people call heads, so I could envision somebody making a coin biased in favor of tails as a cheating device.



Quote: weaselman

With the envelopes, it is a little different. Your chance of picking the one with the larger amount is 50% before the envelope is opened. But once one of the amounts becomes known, the probabilities change: it is no longer 50% that your envelope has the larger amount; the actual value depends on the distribution.



I agree, but it is hard to explain why.
It's not whether you win or lose; it's whether or not you had a good bet.
Wizard
Administrator
Joined: Oct 14, 2009
  • Threads: 1383
  • Posts: 23117
August 13th, 2010 at 10:48:19 AM permalink
Quote: rdw4potus

We used this problem in one of my classes. The point was that the decision doesn't depend on the math so much as it depends on the amount of money in the first envelope...



That is a valid point. However, I think getting into the utility of money is confusing the issue and getting off point. Let's assume the person opening the $100 envelope is a millionaire, so the utility of whatever he wins will be proportional to the amount.
It's not whether you win or lose; it's whether or not you had a good bet.
weaselman
Joined: Jul 11, 2010
  • Threads: 20
  • Posts: 2349
August 13th, 2010 at 11:01:12 AM permalink
Quote: Wizard

It depends on how or why Doc called tails. If he chose tails randomly with a 50% chance, then he would be right, and his probability of winning the toss on the biased coin would be 50%.


Yes, 50% is special, because p+(1-p)=1.
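A possible unpacking of that remark (my gloss, assuming the intended point is a random 50/50 call against a coin with unknown tails probability p): the chance of a correct call is q*p + (1-q)*(1-p) when you call tails with probability q, and q = 1/2 is the only choice that makes the bias p drop out.

```python
# Editorial illustration: P(correct call) when the coin shows tails with
# probability p and you call tails with probability q.
def p_correct(p, q):
    return q * p + (1 - q) * (1 - p)

for p in (0.1, 0.5, 0.9):        # any bias at all
    print(p, p_correct(p, q=0.5), p_correct(p, q=1.0))
# With q = 0.5 the result is exactly 0.5 for every p -- that is what makes
# 50% special.  With any other calling strategy (q = 1.0 here, "always call
# tails"), the chance of being right depends on the unknown bias p.
```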
"When two people always agree one of them is unnecessary"
Doc
Joined: Feb 27, 2010
  • Threads: 45
  • Posts: 7086
August 13th, 2010 at 11:42:36 AM permalink
Quote: weaselman

Yes, 50% is special, because p+(1-p)=1.

Well, I guess that went right over my head. Your equation is true for any value of p, so I don't see how it makes 50% so special.

As for what I intended in initially proposing the biased coin analogy, I was thinking of it as a one-time coin flip, with me making a random guess heads or tails -- sort of like getting one chance to make an initial choice of envelopes. If we repeated the coin toss many times and I stuck with a guess of tails while the coin kept coming up heads, I think I would eventually catch on that the coin was biased.
Headlock
Joined: Feb 9, 2010
  • Threads: 22
  • Posts: 315
August 13th, 2010 at 11:50:41 AM permalink
Quote: Doc

Well, I guess that went right over my head. Your equation is true for any value of p, so I don't see how it makes 50% so special.

As for what I intended in initially proposing the biased coin analogy, I was thinking of it as a one-time coin flip, with me making a random guess heads or tails -- sort of like getting one chance to make an initial choice of envelopes. If we repeated the coin toss many times and I stuck with a guess of tails while the coin kept coming up heads, I think I would eventually catch on that the coin was biased.



Doc, you and I are approximately the same height or else you are much taller, because it went right over my head too.

YOU might eventually figure out the bias, but some of us wouldn't. As an example, I keep going back to the casino, even though I believe there is a very small chance they MIGHT be cheating.
