Quote:WizardPersonally, I don't think the key to this paradox is in analyzing whether the other envelope has a 0%, 50%, 100%, x% chance of having the smaller/larger amount. Rather, I think the flaw is in looking at the ratio of the two envelopes, rather than the difference.

What do you mean? If I have X, and the other envelope has 2X, the difference is +X. If the other envelope has X/2, the difference is -X/2.

If the distribution is known, and the probability of finding 2X is p, then the expectation of the outcome of the swap is p*X - (1-p)*X/2 or pX*3/2 - X/2 or (3p-1)*X/2, so as long as p>1/3, the expectation is positive, and one should switch the envelopes.

No ratios anywhere, just the differences :)
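A quick Monte Carlo check of that difference-based expectation (my own sketch, not from the thread): if the other envelope holds 2X with probability p and X/2 otherwise, the simulated gain from swapping should match (3p-1)*X/2 and turn positive right at p = 1/3.

```python
import random

random.seed(1)

def swap_ev(p, x, trials=200_000):
    """Monte Carlo estimate of the expected gain from swapping when your
    envelope holds x and the other holds 2x with probability p, else x/2."""
    total = 0.0
    for _ in range(trials):
        total += x if random.random() < p else -x / 2
    return total / trials

# Closed form from the post: (3p - 1) * x / 2, positive whenever p > 1/3.
for p in (0.2, 1 / 3, 0.5, 0.8):
    print(f"p={p:.3f}: simulated {swap_ev(p, 100):+.2f}, formula {(3 * p - 1) * 100 / 2:+.2f}")
```

Note that the whole calculation runs on differences (+X versus -X/2), exactly as the post argues.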

Quote:scotty81I just want to clarify that the spirit of the problem assumes that there is an equal chance for the smaller or larger amount, as opposed to the possibility that there may be a 90% chance for the smaller amount and a 10% chance for the larger amount.

Right, but that's exactly where the logical flaw leading to the paradox is hidden. It is not possible to create a distribution satisfying such a property for every amount in the original envelope, because you cannot pick a random value from an infinite range such that all values are equally likely to appear.

Try picking any rule you like of distributing the money between the envelopes, and you won't be able to satisfy the requirement that for any amount found in the first envelope, finding double and half that in the other one is equally likely. For any given rule, there will be situations when it is better to switch, and those when it is not.

Quote:DorothyGaleThis problem is not related to the "Monty Hall" problem. Instead, it is most closely related to "Deal or No Deal", where the player is given the option of switching on the very last two suitcases. He knows there are two values left, so should he switch? Also, in "Deal or No Deal" the banker offers less than the expected value until the last two suitcases, then he always offers *more* than the expected value for the final offer. What's with that???

--Dorothy

The banker has two often conflicting interests: to award the player as little as possible, and to extend the game as long as possible. The earlier offers (often horrible in terms of EV) are calculated to fulfill the latter objective (since the banker cannot directly influence the former--if bad offers induce the player to continue rather than accepting an offer, then the player is acting properly, and the banker thereby loses EV; thus, in the early stages, drawing out the game must be more valuable). But when there are two cases left, the final segment of the game will take a little longer if the player refuses the offer, since Howie will then introduce the silliness of offering the player the choice of switching cases. Thus, in a situation where there are multiple games played on one show, the banker might drag it out (by offering a -EV amount); conversely, in a single-player game (like the multi-$1,000,000 games played a couple of seasons ago), the banker may wish to end things (which would allow a bit more time for blather, commercials, etc.).

I have also gotten the impression that the slightly +EV final offer was crafted as a "reward" for the player having gutted it out this far. I've noticed that when there is a large disparity between the final two amounts, the offer tends to be more neutral EV. It also seems to matter what happened immediately before--if the last three cases were BIG, BIG, SMALL, and the contestant just opened one of the big cases (thereby losing his "safety net"), the final offer won't be as good--applied psychology. This happens earlier in the game, too: when a player opens a large-amount suitcase, not only is the next offer lousy, it is lousier than it would have been otherwise.

#3. The two envelopes contain amounts X and 2X. In your initial selection of an envelope, assuming you don't have any inside information about the envelope contents, you have a 50% chance of being "Lucky" and finding 2X in your envelope and a 50% chance of being "Unlucky" and only finding X. Of course, you will not know whether it is X or 2X. Let's consider these lucky and unlucky scenarios and then (not knowing whether we were lucky or unlucky) calculate the expected value of swapping or not swapping envelopes.

| | "Unlucky 1st Guess" | "Lucky 1st Guess" |
|---|---|---|
| Probability | 50% | 50% |
| Initial value | X | 2X |
| Net effect of swapping | +X | -X |
| Net effect of not swapping | +0 | -0 |

Expected value of swapping = 50%*(+X) + 50%*(-X) = 0

Expected value of not swapping = 50%*(+0) + 50%*(-0) = 0

Here I believe I have properly calculated expected values using the 50% factors as they should be used. The calculations show that for any value of X (and 2X), players of the game cannot reasonably expect to gain or lose on average by either swapping or not swapping, although if you DO swap, you will either gain or lose X.
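Doc's table transcribes directly into a few lines (a trivial sketch of my own; X is any positive stake):

```python
# Doc's two equally likely scenarios, evaluated before anything is opened:
# the envelopes hold X and 2X, and the first pick is Unlucky (X) or Lucky (2X).
X = 1  # any positive amount; every figure below scales with X

scenarios = [
    # (probability, net effect of swapping, net effect of not swapping)
    (0.5, +X, 0),  # "Unlucky 1st Guess": swapping gains X
    (0.5, -X, 0),  # "Lucky 1st Guess": swapping loses X
]

ev_swap = sum(p * swap for p, swap, _ in scenarios)
ev_stay = sum(p * stay for p, _, stay in scenarios)
print(ev_swap, ev_stay)  # 0.0 0
```

Both expectations come out to zero for any X, matching the calculation above.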

Now, does that make any sense to anyone? Is this case #3 the least bit persuasive?

Quote:Doc

Here I believe I have properly calculated expected values using the 50% factors as they should be used. The calculations show that for any value of X (and 2X), players of the game cannot reasonably expect to gain or lose on average by either swapping or not swapping, although if you DO swap, you will either gain or lose X.

This is the (yes, properly calculated) expected value BEFORE the envelope is opened. Once you know the amount in one of the envelopes, the expectations of swapping and not swapping change, but to find out the new values, we need to know the new probabilities.

I personally think your previous explanation was perfect (and way better than this one :)). #2 was right to the point.

Quote:DocNow, does that make any sense to anyone? Is this case #3 the least bit persuasive?

Good explanation---though I remain highly amused at the near-total inadequacy of mathematics to explain this pseudo-conundrum in coherent fashion. I say "pseudo-" because it is really a silly question, when you think about it--the envelope our player winds up with still contains one of two random amounts, one of which is twice the other. The fact that the player opens the first envelope he chooses actually has NOTHING WHATSOEVER to do with the random nature of the final envelope he chooses (whether he retains the one he opens, or switches). In other words, choosing envelope A, looking, then switching to Envelope B, is EXACTLY EQUIVALENT to simply choosing Envelope B and taking whatever happens to be in there, just as choosing Envelope A, looking, and NOT switching is exactly equivalent to choosing Envelope A and never being offered, or never contemplating, switching.

At the risk of belaboring the mathy angle, I have a vague itchy feeling that the "conundrum", which arises from the illusion that your potential gain from switching exceeds your potential loss from doing so, is related to the mathy fraud of the reports from the Federal Department of Misleading Statistics, which said recently that the stock market had fallen 50% in 2008, but it regained 50% in 2009, so everybody's okay now.

I'm still trying to find the best way to put this in words. The host says that one envelope has twice the other. Let's call the envelopes L and H, for lower and higher. We know that L=0.5*H and H=2*L. Note how the 0.5 and 2 factors are applied to different envelopes.

So let's say you pick an envelope and it has $100. You can't say the other one has $50 or $200, because you're applying the 0.5 and 2 factors off of the same amount. That is just incorrect; they should be applied to different envelopes.

Here is a similar error. Suppose your share of MGM stock is worth $1. On Monday it goes up 100% and on Tuesday it goes down 50%. The value after both days is not $1 (original value) + $1 (gain on Monday) - $0.50 (loss on Tuesday) = $1.50. The correct value is $1*2*0.5 = $1. You have to apply multipliers off the correct amounts.
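The same bookkeeping error in two lines (my own sketch):

```python
# MGM share worth $1: +100% on Monday, then -50% on Tuesday.
start = 1.00
wrong = start + start * 1.00 - start * 0.50  # gains/losses both figured off the original $1
right = start * 2 * 0.5                      # each multiplier applied to the running value
print(wrong, right)  # 1.5 1.0
```

The "wrong" line is exactly the two-envelope mistake: applying both percentage moves to the same base amount.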

A-ha?

My contention is that the expected value formula works -- as long as you use it properly. The way it is used in the initial problem statement (post #1 of this thread) is trickery. It comes out nice and smooth, but it is incorrect and leads people to incorrect analysis from then on. In that initial writeup, it is presented this way:Quote:WizardThe harder question at hand is why doesn't the expected value formula work?

Quote:There should be a 50% chance that the other envelope contains either 2 * $100 = $200 (2x) or a 50% chance that the other envelope contains (1/2) * $100 = $50 (x/2). In such a case, the value of the envelope is:

$125 = 0.5*($100/2) + 0.5*(2*$100)

Well, the statement that "There should be a 50% chance that..." is just plain wrong. There is no reason at all to believe that. There are two possibilities, but nothing to indicate that they are equally likely. But it is presented so smoothly, a con man should be impressed, and it moves quickly to an equation that looks familiar to everyone who has ever calculated an expected value. But it's a sham.

Compare this to, "...Now each of the three men has paid $9 for a total of $27 and the bellhop has $2 so there is a grand total of $29..." That's another smooth sham, but it is not a true representation of that problem.

Both problems are worded to mislead people for the purpose of luring them into what appears to be a paradox. In the two-envelope problem, the error is that the 50% figures are not appropriate for the place they are used. They can indeed be used in the manner I described in version #3.

Quote:mkl654321... the near-total inadequacy of mathematics to explain this pseudo-conundrum in coherent fashion.

Because, as many of the great thinkers of our time have come to conclude, mathematics inherently mimics physics... sometimes we are left with "extraneous" solutions to describe what the physics has yet to reveal to us.

The answers to the (unresolvable) true and pseudo-paradoxes, in this case perhaps of the operations of simple addition and multiplication with respect to probability theory, lie much deeper, where these and other fields of study intertwine. (I mean, e.g., beginning with questions like, "Is probability more about addition than multiplication; and if not, what are addition and multiplication (on the physical level) in order for those to come together at all?")

This thread reminds me of the observation that a gambler who always bets half his stake will always lose any series of bets which are predetermined to even out in number, but not order.
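That observation is easy to check (a sketch of my own): each even-money win multiplies the half-betting gambler's stake by 1.5 and each loss by 0.5, so any series with equally many wins and losses multiplies the stake by 0.75 per win/loss pair, no matter the order.

```python
import random

random.seed(7)
stake = 100.0
multipliers = [1.5] * 10 + [0.5] * 10  # ten even-money wins, ten losses
random.shuffle(multipliers)            # the order is irrelevant to the product
for m in multipliers:
    stake *= m
print(round(stake, 4))  # 100 * 0.75**10, about 5.6314 -- a guaranteed loss
```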

Here is the "solution" to this fun little conundrum. Keep in mind that when given your choice, you know that you will always end up with a number above zero as your final result. This is important to note because if your options were "double or nothing" then it would be much easier to make your decision. I'll use a few scenarios to show why this is so. I'm also going to begin the first 2 scenarios by having you hold an envelope (the one you first choose) that is worth $100. We are dealing with finite numbers here (regardless of what some may suggest) and $100 is a nice, round number. One more point that is crucial here is that your benefactor can afford to part with any amount of money. To say that the amount of the first envelope will give some indication of what may be in the second envelope is not so... at least according to the facts introduced in Dorothy's original post. (Sorry, I know it's long...)

SCENARIO #1 (which assumes that your options are "double or nothing". I know that this is not the case in this particular problem but follow me and it should help you see the actual problem in the right light.)

- When you choose your first envelope and BEFORE you look inside, your benefactor tells you that it is worth $100 to you. He goes on to say that neither envelope contains any money but that each one has a slip of paper. Upon one piece of paper is the figure "0%" and on the other, "200%". You now have two options; take the $100 or switch envelopes. It is easy to see that if you were to switch an even amount of times (whether you do it twice or one million times) your EV is precisely 100%. You also know that if you change envelopes that there is PRECISELY a 50% chance of the envelope being the "0%" option and PRECISELY a 50% chance of it being the "200%" option. If you select the "0%" you end up with nothing and if you select the "200%" you end up with $200. Do this an even amount of times and your EV remains at 100% of your $100. Therefore it makes no sense to change envelopes any amount of times as you already have made $100 in the bargain. IF this were the problem at hand it is easy to see why you should not change your envelope UNLESS you are enough of a gambler to take the chance of doubling your money at the risk of losing everything. (Sounds like a good ol' coin flip to me with a double or nothing outcome... an even money proposition - no more, no less.) The point here is that switching envelopes does not have a +EV therefore you should not switch... the best you could do in the long run is to break even and the worst you could do in one trial is to lose everything.

SCENARIO #2 (Now we get closer to the real problem at hand using the facts introduced by Dorothy in her original post; A switch will either double your amount or cost you 50% of that amount)

- When you choose your first envelope and BEFORE you look inside, your benefactor tells you that it is worth $100 to you. He goes on to say that neither envelope contains any money but that each one has a slip of paper. Upon one piece of paper is the figure "50%" and on the other, "200%". You now have two options; take the $100 or switch envelopes. It is easy to see that if you were to switch an even amount of times (whether you do it twice or one million times) your EV is precisely 125%. THIS IS BECAUSE ALTHOUGH YOUR ODDS OF CHOOSING THE LARGER AMOUNT ARE THE SAME AS CHOOSING THE SMALLER AMOUNT (precisely 50%), THE LARGER AMOUNT INCREASES YOUR TOTAL BY 200% WHERE THE SMALLER AMOUNT REDUCES YOUR TOTAL BY ONLY 50%.

***** Let's look at this by putting it into practice. If I am offered this opportunity once and I change envelopes then I will either end up with $200 or $50. The average amount is $125. Therefore if I were to be offered this opportunity twice and I changed both times there's a good chance that I would select the smaller amount once and the larger amount once. I'd end up with $50 one time and $200 the next for a total of $250. This would give me an average gain of $125 between the two trials. Over an infinite number of trials I'd simply get closer to this +EV of 125%. If I had kept the $100 both times I'd end up with $200 for an average of a $100 gain between the two trials. You can see that my EV for switching is 125% and for staying my EV is 100%. (Akin to the casino taking commission on a winning Pai Gow bet... when you lose they take 100% of your money and when you win they take 5% of your money. In this scenario when I choose the small amount I lose only 50% of my money but when I choose the larger amount I win 100% of my money) Now THERE'S a coin flip I'd take every time! *****

Do this any amount of times and your EV remains at 125% of your $100. Therefore it makes sense to change envelopes ONCE as you have a +EV. As I said earlier, we are dealing with a finite amount and to suggest that swapping envelopes more than once would give you a +EV for every switch is a fallacy. IF YOU SWITCHED, AFTER YOU'VE SWITCHED THE FIRST TIME, SWITCHING AGAIN CAN ONLY PRODUCE THE OPPOSITE RESULT AS THERE ARE ONLY 2 OPTIONS. The point here is that switching envelopes has a +EV therefore you should switch - but only once as switching again would put you back where you began. The odds of choosing the greater or lesser amount will always be 50/50 but half of the time you will choose the larger amount and INCREASE YOUR TOTAL BY TWICE AS MUCH as you'd lose if you chose the lesser amount.

SCENARIO #3 (And now we use Dorothy's actual problem with the facts as she presented them. We will use real amounts and find that again we are dealing with finite figures and options.)

- You are offered two envelopes. You are told that one envelope contains an amount that is precisely twice what is in the other. (It therefore stands to reason that one envelope contains an amount that is precisely 50% of what is in the other.) When you choose your first envelope and look inside, you see a $100 bill. Your benefactor tells you that you may keep the $100 or change envelopes. It is easy to see (by looking at SCENARIO #2) that if you were to switch an even amount of times (whether you do it twice or one million times) your EV is precisely 125%. You also know that if you change envelopes that there is PRECISELY a 50% chance of the new envelope having $50 and PRECISELY a 50% chance of it containing $200. At this point you realize that your benefactor is willing to part with up to $100 OR up to $200.

***** Let's look at this by putting it into practice. There are only two possibilities; the two envelopes contain either $50 and $100 or they contain $100 and $200... there are no other options. If I'm holding an envelope that contains $100 then I know I have the larger or the smaller amount and there is an EQUAL CHANCE of either probability. I know that by changing I will either lose $50 or gain $100. In SCENARIO #2 we have already shown that I should change envelopes as my EV is 125%. It has also been shown that changing more than once is not of ANY benefit to me as I'd simply be back to where I began, with my original envelope with $100. As we know what's in the original envelope we chose it's rather silly to think that by changing back to this original envelope after having changed envelopes once the amount will somehow have miraculously increased... nope, you just got the same original envelope with the same $100 you originally had. *****

This is more a logic puzzle than a math problem. The math tells you that you have a +EV by changing envelopes only one time. Once you realize that there are finite options then the math becomes moot and logic has to dictate your actions. The "flaw in the logic" is that it is easy to look at the odds (50/50) and the EV (125%) and wonder how they can be different... and then spend hours trying to figure out why they are. It's obvious that they are different because they define two wholly separate facts. One (the odds) defines the chances of choosing a higher or lower amount and the other (the EV) defines what the result will be when you've made your choice. I suggest that changing an EVEN AMOUNT of times will give you a +EV but the fact is that you will always have a +EV. I simply wished to let you work out the numbers for yourself by using 2 (or any even amount) of trials as a simple proof. Dorothy's original formula holds true but as we are dealing with finite numbers and a finite number of possibilities then the EV is also finite at 125%.

THE END.

Quote:TheNightflyHello all - Here is the "solution" to this fun little conundrum... [infinitely long explanation ... let's just say it's the first ordinal larger than countably infinite ] ... THE END.

Zounds!

Surely someone can help me scrape my exploded brain off the wall ...

--Dorothy

Quote:TheNightflyHere is the "solution" to this fun little conundrum...

That is a fancy explanation about why the EV=$125 argument is allegedly correct. However, it doesn't make common sense that it is. Suppose Bill Gates is hosting the contest and the two envelopes contain $50 and $100. He then offers the contest to thousands of people individually. By your argument everybody would switch. However, would switching benefit the group as a whole? The half that picked $50 would switch to $100, and the half that picked $100 would end up with $50. Overall there would be an average increase in wealth of 25% per person (ave(100%,-50%)), but the total amount of money won would stay the same. Sorry, but you fell into the trap.
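A sketch of this group version (my own illustration): fix the envelopes at $50 and $100 and let contestants pick at random. Those who keep their pick and those who switch average the same $75, so switching moves no money.

```python
import random

random.seed(42)
n = 100_000  # contestants, each facing the same $50/$100 pair
kept = switched = 0
for _ in range(n):
    envelopes = [50, 100]
    random.shuffle(envelopes)
    first, other = envelopes  # the pick, and the envelope a switcher ends with
    kept += first
    switched += other
print(kept / n, switched / n)  # both near 75.0
```

Each trial pays out $150 in total no matter what, which is the whole point: the "EV=$125 from $100" reasoning has nowhere to get the extra money from.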

Quote:WizardThat is a fancy explanation about why the EV=$125 argument is allegedly correct. However, it doesn't make common sense that it is. Suppose Bill Gates is hosting the contest and the two envelopes contain $50 and $100. He then offers the contest to thousands of people individually. By your argument everybody would switch. However, would switching benefit the group as a whole? The half that picked $50 would switch to $100, and the half that picked $100 would end up with $50. Overall there would be an average increase in wealth of 25% per person (ave(100%,-50%)), but the total amount of money won would stay the same. Sorry, but you fell into the trap.

Well, I think this is a trap that most people fall into. After all, we keep stuffing politicians into office whose ideological shibboleth is that taking money from some people and giving it to other people ACTUALLY INCREASES THE TOTAL AMOUNT OF SAID MONEY. Taxation, like alchemy, creates something out of nothing! (I beg your pardon--not "taxation", "revenue creation". My bad.) But since we keep sending the same clowns back over and over, and in REALLY delusional times, such as recently, we send a whole gang of them, I think that is manifest proof that we as a society have no idea how the monetary system/the economy works.

P.S. I just put a dollar bill in one desk drawer and two dollar bills in another. I then proceeded to alternately open and close each drawer. To my total surprise, after only fifteen minutes of doing that, the top desk drawer contained a billion dollars! Maybe you're wrong after all, Wizard.

P.P.S. After I discovered the billion dollars, I got greedy, reasoning that the bottom drawer must now contain 1.25 billion dollars. So I closed the top drawer and opened the bottom--but that act caused the amount in the top drawer to increase by 312.5 million. The increased weight was too much: the desk crashed through the floor, and I fell into the resulting hole. I woke up in the emergency room, and when I got back to the wreckage of my house, I found that someone had taken all the money. Live and learn!

weaselman: I missed this post of yours last night. Sorry. Didn't mean to ignore it. I think you and I are pretty much in agreement on this problem. I do disagree with some aspects of your comments on my post.Quote:weaselmanQuote:Doc

Here I believe I have properly calculated expected values using the 50% factors as they should be used. The calculations show that for any value of X (and 2X), players of the game cannot reasonably expect to gain or lose on average by either swapping or not swapping, although if you DO swap, you will either gain or lose X.

This is the (yes, properly calculated) expected value BEFORE the envelope is opened. Once you know the amount in one of the envelopes, the expectations of swapping and not swapping change, but to find out the new values, we need to know the new probabilities.

I personally think your previous explanation was perfect (and way better than this one :)). #2 was right to the point.

Things I did do:

(1) State a 50% probability of being lucky or unlucky in your first random guess of an envelope.

(2) Calculate expected values of the net effects of swapping and not swapping envelopes.

Things I did not do:

(1) State any probability for any particular amount of money being in the mystery envelope.

(2) Calculate an expected value of the contents of the mystery envelope.

I contend that the probabilities I stated are correct both before and after you open the envelope. After you open it, either you were or you were not lucky (you still don't know which), but the probability didn't change.

I also contend that the expected values that I calculated (effect of swapping or not swapping) do not change after you open the envelope. They both stay at zero. The amount of money that you might gain or lose by swapping changes once you have a dollar value to work with, but the expected value of swapping is still zero. So I guess we disagree on that point.

We both agree (I think) that the probabilities of the amounts of money in the mystery envelope are unknown and unknowable without some inside information. Perhaps those probabilities do change when you open an envelope, but I made no claim about them.

As for which of my explanations (#1, #2, or #3) is best, I think that depends upon the initial viewpoint of the person I am explaining it to. That is why I tried to show it different ways.

Now, I am off to Mississippi for a week of practical experiments on a subject of interest to most of us. Hope you folks have a lot of fun on this forum while I am gone.

Quote:mkl654321After all, we keep stuffing politicians into office...

Your rant on politics is so significantly off topic that it should have been made into a separate thread. Please copy and paste it into a separate thread if you wish this to remain on the board.

Quote:WizardSuppose Bill Gates is hosting the contest and the two envelopes contain $50 and $100. He then offers the contest to thousands of people individually. By your argument everybody would switch. However, would switching benefit the group as a whole? The half that picked $50 would switch to $100, and the half that picked $100 would end up with $50. Overall there would be an average increase in wealth of 25% per person (ave(100%,-50%)), but the total amount of money won would stay the same. Sorry, but you fell into the trap.

I agree with the Wizard here. Suppose I were Satan and offered you those three envelopes, but instead called the prizes: Hell, Purgatory, and Heaven. Then you pick the envelope for Purgatory. Aww shucks. I offer you the opportunity to choose one of the other envelopes. Do you really risk it because the expected value is slightly positive? After all, the expected value is a concept that goes hand in hand with the Central Limit Theorem, as you run through multiple iterations to approach an "average" value. In this instance, you only get 1 shot at heaven or hell.

Now what if I were really sneaky. What if I made this offer with only 2 envelopes. You choose 1 = Purgatory, and I tell you the second envelope contains either Hell or Heaven. You would assume your scenario is purely binary with a 50% probability. But I'm Satan, and I've made this offer many times to many souls. And I know that for every 1 envelope filled with Heaven, 3 envelopes are filled with Hell. Your 50% probability isn't looking too hot right now. Bahahaha.

Quote:Doc

I contend that the probabilities I stated are correct both before and after you open the envelope. After you open it, either you were or you were not lucky (you still don't know which), but the probability didn't change.

I also contend that the expected values that I calculated (effect of swapping or not swapping) do not change after you open the envelope. They both stay at zero.

This is where we disagree. Suppose the person stuffing the envelopes follows this rule: pick a uniformly distributed number between 0 and 1, multiply it by $1000, and put that amount into the first envelope; then either double the amount or divide it by two with a 50/50 chance, and put the result into the second envelope.

Suppose the first envelope you open has $1026 in it. What is the probability that the other envelope has more money? It is 0. What if you see $950? The probability of the other one having more is 1/3 now. What if it's $400? Now it's 50%.

This illustrates how the probabilities and expectations change depending on the first amount unveiled. I did pick an arbitrary distribution, but that is not important - the same kind of mechanics happens for any distribution. It's much like the blackjack player's strategy changing depending on the dealer's up card.
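Those numbers can be checked by brute enumeration. This sketch of mine assumes a discrete version of the stuffing rule (whole-dollar amounts from $1 to $1000 in the first envelope, all equally likely) and that "the envelope you open" is a random pick of the two:

```python
from collections import defaultdict
from fractions import Fraction

seen = defaultdict(int)          # observed amount -> equally likely configurations
other_bigger = defaultdict(int)  # observed amount -> configurations where the other has more

for x in range(1, 1001):                          # first envelope: $1 .. $1000
    for factor in (Fraction(2), Fraction(1, 2)):  # coin flip for the second envelope
        env1, env2 = Fraction(x), Fraction(x) * factor
        for mine, other in ((env1, env2), (env2, env1)):  # which envelope you open
            seen[mine] += 1
            other_bigger[mine] += other > mine

def p_other_bigger(amount):
    """P(the other envelope has more | you observed `amount`)."""
    a = Fraction(amount)
    return Fraction(other_bigger[a], seen[a])

print(p_other_bigger(1026), p_other_bigger(950), p_other_bigger(400))  # 0 1/3 1/2
```

Under this discrete reading, $1026 can only be the doubled envelope (0), $950 leaves one winning configuration out of three (1/3), and $400 splits evenly (1/2), matching the figures above.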

Quote:AsswhoopermcdaddyDo you really risk it because the expected value is slightly positive?

That depends on the monetary values you chose to assign to your soul going to each of those places. I suspect, for most people, the expected value will not be positive. If it is positive, however, then yes, mathematics says that you should do it; all the other, non-mathematical reasons some might state for not doing it are irrelevant (not because they are unimportant, but because they are unique to each individual, and cannot be generalized or formalized). I personally think that if you decide not to swap even though the expectation is positive, it simply means that the numerical values you have assigned to each of the outcomes are incorrect in your case (provided, of course, that you understand well what exactly you are doing, and all the math behind it, and are not simply acting on an impulse).

Quote:

Now what if I were really sneaky. What if I made this offer with only 2 envelopes. You choose 1 = Purgatory, and I tell you the second envelope contains either Hell or Heaven. You would assume your scenario is purely binary with a 50% probability.

Just like seeing a dinosaur on the street :) Either you see it, or you do not. 50% :)

Quote:weaselmanpick a uniformly distributed number between 0 and 1 ...

Off topic, but this is not possible. There is no pseudo-random generation algorithm for picking a "random number" from any continuum. Indeed, except for a very small number of values between 0 and 1, there is no way of even describing most of these numbers algorithmically. Moreover, it is impossible to pick an integer at random. Why? Because, with countably infinitely many exceptions, individual integers are too large to describe in any language.

Densely yours in NP world,

--Dorothy

Quote:DorothyGaleOff topic, but this is not possible. There is no pseudo-random generation algorithm for picking a "random number" from any continuum.

Who said "continuum"? Or pseudo-random for that matter? :)

Quote:

Indeed, except for a very small number of values between 0 and 1, there is no way of even describing most of these numbers algorithmically.

This is true

Quote:because, with countably infinitely many exceptions, individual integers are too large to describe in any language.

This is not true. A set of all possible finite combinations of words is isomorphic to the set of all integers (or rationals).

Quote:AsswhoopermcdaddySuppose I were Satan and offered you those three envelopes, but instead called the prizes: Hell, Purgatory, and Heaven...

I think I would prefer purgatory to heaven anyway, so I'd stick with that. Along the lines of gambling on eternity, I predict it is just a matter of time before somebody brings up Pascal's Wager -- hopefully in a separate thread.

Quote:weaselmanA set of all possible finite combinations of words is isomorphic to the set of all integers (or rationals).

Yes, every rational can be decomposed as a product of p_i^e_i for some p's and e's (e's can be negative for rationals). Treat the primes as the atomic words. So what? That's just the fundamental theorem of arithmetic. But even then you can't describe them. The p's and e's themselves get too large to describe. And using your logic, the e's would also need to be similarly decomposed. Because of the problem describing the e's, the description you mention is "circular" in its logic.

But hey, you're pretty smart 8-)

--Dorothy

Quote:DorothyGaleThe p's and e's themselves get too large to describe.

No, you can describe any integer in a finite number of words.

Quote:But hey, you're pretty smart 8-)

Yeah, I know, but thanks anyway :)

Quote:weaselmanNo, you can describe any integer in a finite number of words.

Now, does this help in any way in picking an integer at random?

--Dorothy

Quote:DorothyGaleNow, does this help in any way in picking an integer at random?

--Dorothy

Nope. You said yourself, it was off topic :)

Quote:WizardI think I would prefer purgatory to heaven anyway...

Excellent choice.

First, I would say that the obvious solution is to grab both envelopes and run like hell.

Assuming that's not a possibility, I'm afraid that I don't understand why there is a problem in the expected value of switching being 125%. I'm going to toss out the concept of infinitely switching envelopes, since the problem does not seem to indicate that this is a possibility. You get one and only one switch. Instead, I guess I look at it as either a game show concept, or maybe from the point of view of a casino. Let's run the game 100 times with 100 different people, and give them all the same choice, and assume a 50% probability of the envelope they switch to being twice as much and a 50% probability that it is half as much. Again, assume that their initial pick is an envelope of $100.

If all the people switch, then 50% of them will end up with $200, and 50% of them will end up with $50. The casino running the game will have to pay out $12,500 in total, or an average of $125 per person. I might assume that half of the people, not understanding the mathematics behind the game, would choose to keep the initial envelope. In other words, 50 people end up with $100, 25 people end up with $200, and 25 people end up with $50. All told, the casino would have to pay out $11,250, or an average of $112.50 per person.

I think, mathematically, everyone agrees that the expected value is 125% and thus, mathematically, the switch should be made. The only other problem I can think of that people have is that it doesn't seem logical to switch the envelopes. However, I contend that it is logical. Mainly because the concept of doubling your money greatly outweighs the concept of halving your money. The reason for this, at least in my mind, is that doubling your money is a lot harder to achieve than halving your money.

Let's say you took $100 and put it into some sort of compound interest bearing account earning 5% per unit of time. It would take 15 such units of time to reach $200. On the other hand, if it lost 5% per unit of time, compounded, you would be down to under $50 in 14 such units of time. Take out compounding, and you reach $200 in 20 units ($5 gained per unit), or you are down to $50 in 10 unit ($5 lost per unit).
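A quick check of those period counts (a Python sketch; the 5% rate per period is the one assumed in the post):

```python
def periods_until(start, target, rate):
    """Count compounding periods until `start` crosses `target` at `rate`."""
    n, value = 0, float(start)
    while (value < target) if rate > 0 else (value > target):
        value *= 1 + rate
        n += 1
    return n

up = periods_until(100, 200, 0.05)    # periods for $100 to reach $200 at +5%
down = periods_until(100, 50, -0.05)  # periods for $100 to fall below $50 at -5%
```

Running this gives 15 periods to double and 14 to fall below half, matching the figures above.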

Think of gambling, where you start with $100 and your stop goals are either $50 or $200, whichever comes first. You're going to play $5 a hand blackjack. Obviously, it's a lot easier to lose $50 than it is to win $100. But let's say you take a completely fair game, like flipping a fair, non-biased coin. Heads you win, Tails you lose. Wouldn't math show that the odds of getting 10 Tails in a row far outweigh getting 20 Heads in a row? (And yes, I know that there may be other up and down movements in there, but the point remains that it's a lot easier to go down by 1/2 than it is to go up to twice as much.)
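The coin-flip version can be simulated. In gambler's-ruin terms, a fair game starting at $100 with stops at $50 and $200 reaches the high stop only (100 - 50)/(200 - 50) = 1/3 of the time. A sketch, with the stakes and stops taken from the post:

```python
import random

def flip_session(bankroll=100, low=50, high=200, bet=5, rng=random):
    """Fair coin flips at $5 a hand until the bankroll hits $50 or $200."""
    while low < bankroll < high:
        bankroll += bet if rng.random() < 0.5 else -bet
    return bankroll >= high  # True if we doubled before halving

trials = 20_000
doubled = sum(flip_session() for _ in range(trials))
# Roughly 1/3 of sessions reach $200, matching gambler's-ruin theory.
```

So even in a perfectly fair game, doubling really is twice as hard as halving, exactly as the post argues.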

Thus, from MY logical point of view, the chance that the envelope you switch to could contain twice as much money as the envelope you just opened, seems like a great choice, given how difficult it would be to take the money in your hand and double it.

P.S. And through the first few pages of reading, I was thinking of Pascal's Wager, but since the Wizard has forbidden it in terms of this problem, I left it out of my ramblings. :)

Quote:WizardYour rant on politics is so significantly off topic that it should have been made into a separate thread. Please copy and paste it into a separate thread if you wish this to remain on the board.

Relax, Wizard. Blow it away yourself if you want. I thought it was appropriate because the "problem" that is the subject of this thread isn't a problem at all, and the fact that it seems to be a conundrum is merely an artifact of our general inability to understand some of the basic dynamics of monetary interchange. This, in turn, leads us to believe in nonsensical promises re the economy made by our beloved leaders.

And allow me to clarify two definitions:

"Statement": A verbal or written expression that a given person agrees with or feels neutral about

"Rant": A verbal or written expression that a given person disagrees with

Quote:DorothyGaleNow, does this help in any way in picking an integer at random?

--Dorothy

For the fun of it, here is an algorithm to pick any integer at random.

1. Toss a coin. If heads, write down 0, else write down 1.

2. Toss a coin again. If heads, go to #1 else go to #3

3. Take the sequence of zeroes and ones you generated, treat it as an integer in binary notation, and convert to decimal.

4. Toss a coin again. If heads, stick a minus sign in front of the number

This will generate a fairly steep distribution with a peak around zero, but you can play with #2 to make it as wide as you like (e.g., keep tossing the coin until you see ten heads in a row before stopping).
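The recipe translates almost line for line into code. A sketch (names are mine); note that the result has a geometric tail around zero rather than being uniform, which is exactly the earlier point that no uniform pick over all integers exists:

```python
import random

def random_integer(rng=random):
    """Coin-flip integer generator: a geometric number of random bits,
    read as a binary number, with a random sign attached."""
    bits = []
    while True:
        bits.append("0" if rng.random() < 0.5 else "1")  # step 1: write a digit
        if rng.random() < 0.5:                           # step 2: heads, add another
            continue
        break
    n = int("".join(bits), 2)                            # step 3: binary -> decimal
    if rng.random() < 0.5:                               # step 4: random sign
        n = -n
    return n
```

Every call terminates with probability 1, but small magnitudes are far more likely than large ones, so this picks *an* integer at random, not a uniformly random integer.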

Quote:konceptum...I think, mathematically, everyone agrees that the expected value is 125% and thus, mathematically, the switch should be made...

In this forum, so far, you and TheNightfly are the only ones to firmly take a stand that the 125% argument is in fact correct. I think everyone else can see that if you know you are going to switch to the other envelope, then you may as well pick the "other one" to begin with. The question is where the fallacy in the logic of the 125% argument lies. See also my response to TheNightfly's post.

Also, I don't forbid talking about Pascal's Wager, just make a new thread for it.

For those who may have read through my novelette from a couple of pages ago, I profoundly apologize. The Great and Powerful Wiz was right (of course) and yes, I fell into the trap. As a matter of fact, I pushed myself into it by beginning my analysis with SCENARIO #1, in which I imagined an example where the first envelope has no monetary amount and the two envelopes hold slips of paper that say "50%" and "200%" respectively. If that were the case (which it is not), then yes, there would be a positive EV. However, since that is not the case, I simply painted myself into a corner by believing my own flawed logic.

As there are only 2 envelopes, there is a 50/50 chance of selecting the larger one and a 50/50 chance of selecting the smaller one. If the two envelopes contain $50 and $100 respectively, then once you've chosen the $100 envelope there is no possibility of it being the smaller of the two. Even though you still don't know which one you've chosen, it doesn't change the fact that you've chosen the larger of the two. You may think that you have a chance to increase your total by trading envelopes, but this is simply not the case. Once you've chosen the larger of the two, a trade will only reduce your total by 50%. For the times when you select the smaller of the two, a trade will increase your total by 100%.

The flawed logic presumes that once you've chosen an envelope that the other envelope could be a higher OR lower value than the one you hold. This cannot be the case as it must only be one or the other... there is not the possibility or potential for it to be both.

The Bill Gates example is a good way to show this and it has brought me to my "A-HA" moment.

Assuming that the two envelopes do hold $50 and $100, the median amount is $75, yet there is no $75 envelope. Selecting the $50 envelope and switching means that you will end up with $100. Selecting the $100 envelope and switching means that you end up with $50. Regardless of what the percentage gain or loss may be in relation to the original amount, the actual monetary gain or loss will be the same in both cases: $50. Although you might believe that once you've looked in your envelope and found $100 you may have selected the smaller amount and have a chance of increasing your amount by $100, this is not the case. You can only go down and never up, even though you don't know this fact. Therefore, believing that you have a chance of increasing your amount by 100% doesn't make it so... you cannot, as the $200 envelope you are hoping to find does not exist.

Looking at the problem from the perspective of the person who is selecting the envelopes, you may convince yourself that whichever envelope you've chosen will always have a 50/50 chance of containing the smaller amount, but you would be wrong. Only 50% of the time will you have a chance of increasing the amount you hold, and that will ONLY happen when you have chosen the envelope containing the smaller amount. So, in the long run you will go up $50 half of the time and go down $50 half of the time. As $50 is 100% of $50, the increase or decrease will always be the same amount, and the average value of the two envelopes will be $75 (the total of $150 divided by two, the number of envelopes). In the long run, the person handing out the envelopes will pay out an average of $75 per envelope, though never exactly $75: only ever $50 or $100.

The fallacy in logic is suggesting that, since this $50 gain is 100% of $50 and the $50 loss is only 50% of $100, the combination of the two yields a percentage gain of 50% (which, divided by the two choices, gives an average increase of 25% per choice), or an EV of 125%. In actual fact the gain or loss will always be exactly $50 (which is 100% of $50), and in the long run you will end up with an average of $75 per selection, which is exactly $25 more than $50 and $25 less than $100. In other words, don't change the envelope: the first choice you made is just as likely to hold the larger amount as the smaller, and changing will not make it any more likely that you will then be holding the larger amount.

I know I've gone on much too long once again but I tend to use 10 words where 5 will do and this is the result.

Thanks to Dorothy for providing this puzzle and thanks Wiz for gently pointing out my error and leading me to the truth.

What if you think of it like this. There are two envelopes. Envelope A contains $x, and envelope B contains $2x.

Once you have picked an envelope, you are able to take that amount. The question is whether or not you should switch. So let's look at it from the point of view of what you gain or what you lose.

If you originally picked A, and don't switch, your gain is $0.

If you originally picked A, and do switch, your gain is $x.

If you originally picked B, and don't switch, your gain is $0.

If you originally picked B, and do switch, your gain is -$x, or a loss of $x.

If you consider that all four of those combinations are possible, then the expected gain is:

(1/4)*0 + (1/4)*x + (1/4)*0 + (1/4)*(-x) = 0.

If you say that the person should always switch envelopes, then the expected gain still comes out to $0. So, I guess what I'm saying is that the reason the problem doesn't seem to make sense is that we're thinking in terms of the amount that is originally found in the first envelope, when maybe all we should be looking at is what is the possible gain by switching.
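Those four equally likely combinations can be tabulated directly; a quick sketch, with gains measured in units of x:

```python
# The four equally likely outcomes (envelope A holds x, envelope B holds 2x);
# gains from each decision are expressed in units of x.
cases = [
    ("picked A, keep",    0),
    ("picked A, switch", +1),   # trade x for 2x
    ("picked B, keep",    0),
    ("picked B, switch", -1),   # trade 2x for x
]
expected_gain = sum(g for _, g in cases) / len(cases)    # no edge overall
always_switch_gain = (cases[1][1] + cases[3][1]) / 2     # no edge from switching
```

Both averages come out to zero, which is the point: measured as a gain relative to the pair of envelopes, switching is worth nothing.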

Quote:konceptum...maybe all we should be looking at is what is the possible gain by switching.

You're welcome on the help. As they say, the best arguments are the ones you know you lost, because at least you learned something.

Your argument in favor of no gain by switching is completely correct. Doc argued the same thing earlier. However, how would you explain where the logical flaw is in averaging a gain of 100% and a loss of 50%, and coming up with an expected profit of 25% by switching? THAT is the puzzle.

Quote:WizardYou're welcome on the help. As they say, the best arguments are the ones you know you lost, because at least you learned something.

Your argument in favor of no gain by switching is completely correct. Doc argued the same thing earlier. However, how would you explain where the logical flaw is in averaging a gain of 100% and a loss of 50%, and coming up with an expected profit of 25% by switching? THAT is the puzzle.

I may have what you're driving at:

The logical flaw is in attempting to average those two numbers at all. Percentages of variables are NOT integers--they are unknowns (or, more precisely, derivatives of unknowns). I think I was skirting the concept when I said that a drop of 50% in the stock market followed by a gain of 50% does NOT restore prices to their former levels--in other words, the percentages are not equivalent, even though numerically equal, because they refer to different quantities. In the envelope game, a gain of 100% over the SMALLER amount is exactly equivalent (as an absolute number) to a loss of 50% from the GREATER amount. To average the numeric values of the percentages is to create GIGO.

The total value placed in the envelopes is either 3x or 3x/2. Since we do not know which set we are in at this point, we should weight the expected values accordingly. The total value across both scenarios is 4.5x, of which 3x represents 2/3 and 3x/2 represents 1/3.

Now, there is a 50% chance that we are in the realm of 3x, where the gain is 2x - x, weighted by 2/3.

50% * (2x - x) / (2/3) = 0.75x

There is a 50% chance that we are in the realm of 3x/2, where the gain is x/2 - x, weighted by 1/3.

50% * (x/2 - x) / (1/3) = -0.75x

The fact that we do not know how much money is placed in the envelopes at the outset creates the disparity.

Not sure if this makes sense, but would appreciate the feedback.

K

Quote:WizardYou're welcome on the help. As they say, the best arguments are the ones you know you lost, because at least you learned something.

Your argument in favor of no gain by switching is completely correct. Doc argued the same thing earlier. However, how would you explain where the logical flaw is in averaging a gain of 100% and a loss of 50%, and coming up with an expected profit of 25% by switching? THAT is the puzzle.

I know that a major part of my problem in understanding the logic behind this puzzle is that I would always choose to switch envelopes. My reasoning for this is based loosely upon Pascal's Wager, and the fact that there is nothing to lose.

Without looking at the contents of the envelope, it would seem obvious that the envelope you choose makes no difference, and switching the envelopes makes no difference either. The expected value of the envelope you chose is (3/2)x. The expected value of the other envelope is also (3/2)x. This makes sense since (3/2)x + (3/2)x = 3x, which is the total value of the monies placed in the envelope. Since each envelope has the same expected value, then it doesn't matter which one you choose, and switching them doesn't make sense.

The logical flaw, I would think, comes about in the false concept that choosing the other envelope can result in "doubling" or "halving" your money. The fact remains that you do not yet have the money, until you decide whether to keep the envelope you have, or take the other envelope. And until you make the decision, the expected value of the envelope in hand is (3/2)x and the expected value of the envelope you could switch to is also (3/2)x.

There is an analogy to this in the stock market or housing market. People will say that they have lost a lot of money in their stocks, but until they actually sell those stocks, they haven't lost anything. The paper value of those stocks is meaningless, until it is actually turned into cash. The same is true with the envelopes. Until you turn one envelope in and cash it in, it has no value, other than an expected value of (3/2)x.

Even if you know the value of the envelope, it doesn't change. Let's say you buy a stock at $40 per share and the stock drops to $20 per share. You can't claim that you have lost money, because you haven't turned that stock in for cash yet. Thus, you haven't lost anything until you do so. Similarly for the envelope. If you open it and it contains $100, you can't say that you have a 50% chance of doubling your money or a 50% of halving your money, because it isn't yet your money. It won't be your money unless you decide to keep it, in which case you will neither double nor halve it. Further, if you switch envelopes, you will get whatever is in that envelope. Since you never had the $100 to begin with, you neither doubled nor halved by switching.
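The claim that both envelopes carry the same expected value of (3/2)x is easy to simulate. A Python sketch (the dollar amounts are assumed for illustration):

```python
import random

def play(switch, x=100, rng=random):
    """One round: envelopes hold x and 2x; pick one at random, maybe switch."""
    envelopes = [x, 2 * x]
    pick = rng.randrange(2)
    if switch:
        pick = 1 - pick
    return envelopes[pick]

n = 50_000
keep_avg = sum(play(switch=False) for _ in range(n)) / n
switch_avg = sum(play(switch=True) for _ in range(n)) / n
# Both averages land near 1.5 * x = 150: switching confers no edge.
```

Both strategies average out to the same (3/2)x, consistent with the argument that neither choice is better before any money is "realized."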

All that being said, I would still utilize Pascal's Wager and switch no matter what.

Quote:KelmoI wrote the quote in the initial thread ...

Welcome to the board Kelmo. Glad to have you. As you can tell, your question has raised a lot of very interesting discussion.

As for smart, well, at least I can sing ...

--Dorothy

Quote:DorothyGaleWelcome to the board Kelmo. Glad to have you. As you can tell, your question has raised a lot of very interesting discussion.

As for smart, well, at least I can sing ...

--Dorothy

Thanks for introducing me to this website. It's gold!

K

Using the numerals 1,1,1,1 (once each) and any mathematical symbols (any number of symbols any number of times), what's the largest number you can express?

Examples:

1+1+1+1=4

(1+1)*11=22

11*11=121

11^11 (eleven to the eleventh power)

Quote:Nareed

Using the numerals 1,1,1,1 (once each) and any mathematical symbols (any number of symbols any number of times), what's the largest number you can express?

How about 1/[1-(1/1)] = infinity?

Quote:WizardHow about 1/[1-(1/1)] = infinity?

I'm not sure. Are you dividing by zero? ASCII is no good for math.

The answer is infinity, though it's usually reached differently.

OK, using 1 four times:

A(111,1) --

--Dorothy

Quote:WizardHow about 1/[1-(1/1)] = infinity?

hmmmm. Looks more like a Divide By Zero Overflow error to me. ;-)

Quote:NareedHint: !

Factorial is, perhaps, non-standard. If you are thinking 1111! then let's go with A(111!,1) (A = Ackermann).

The problem is that the more math you learn, the more symbols become "standard."

--Dorothy

Quote:DorothyGaleFactorial is, perhaps, non-standard. If you are thinking 1111! then let's go with A(111!,1) (A = Ackermann).

Simpler: (((1111!)!).....!)=Infinity

The flaw is in assuming you have always chosen the envelope containing X. In fact, there is only a 50% chance you have chosen the envelope with X. We are ignoring what happens if you chose the envelope that has either 2*X or .5*X

In order not to get too abstract, let's look at the EVs from the standpoint that one envelope contains $100 and the other envelope has a 50% chance of containing either $50 or $200. Furthermore let's consider the EV from the standpoint of what is to be gained or lost with each combination.

Case 1: 50% chance you choose the envelope with $100

50% chance you will gain $100 by opening the other envelope

50% chance you will lose $50 by opening the other envelope

Total EV weight for Case 1:

25% chance you will gain $100 (50% * 50%)

25% chance you will lose $50

Case 2: 25% chance that you chose an envelope with $200

100% chance you will lose $100 by opening the other envelope

Total EV weight for Case 2:

25% chance you will lose $100 (25% * 100%)

Case 3: 25% chance that you will chose an envelope with $50

100% chance you will gain $50 by opening the other envelope

Total EV weight for Case 3:

25% chance you will gain $50 (25% * 100%)

The summary of the EVs for all possible cases of switching are:

25% chance you will gain $100

25% chance you will lose $50

25% chance you will lose $100

25% chance you will gain $50

Total EV (in terms of expected gain) for switching envelopes: $0

The flaw is in the assumption that the middle envelope is always chosen.
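The case weights above sum out as claimed; a quick sketch:

```python
# The four weighted switching outcomes from the case analysis above
# (one envelope holds $100; the other is $50 or $200 with equal chance).
outcomes = [
    (0.25, +100),  # held $100, other envelope held $200
    (0.25,  -50),  # held $100, other envelope held $50
    (0.25, -100),  # held $200, switched down to $100
    (0.25,  +50),  # held $50, switched up to $100
]
ev_of_switching = sum(p * gain for p, gain in outcomes)
```

The weighted gains cancel exactly, so the total EV of switching is $0, as the summary states.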

Quote:NareedSimpler: (((1111!)!).....!)=Infinity

I think the spirit of the question requires at most one use of any mathematical operation.

I am not sure if 1111! is greater or less than (11)^(11!). Maybe it's obvious ...

--Dorothy
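For what it's worth, logarithms settle the comparison without computing either monster directly. A Python sketch using `math.lgamma` (which gives ln of the factorial):

```python
import math

# Compare 1111! with 11^(11!) by comparing base-10 logarithms.
# math.lgamma(n + 1) = ln(n!), so no huge integers are needed for 1111!.
log10_fact = math.lgamma(1112) / math.log(10)     # log10(1111!), about 2.9e3
log10_pow = math.factorial(11) * math.log10(11)   # log10(11^(11!)), about 4.2e7
bigger = "11^(11!)" if log10_pow > log10_fact else "1111!"
```

1111! has roughly 2,900 digits, while 11^(11!) has over 40 million, so 11^(11!) is vastly larger.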

Quote:DorothyGaleI think the spirit of the question requires at most one use of any mathematical operation.

Perhaps the spirit does, but the letter of the question does not:

Quote:Using the numerals 1,1,1,1 (once each) and any mathematical symbols (any number of symbols any number of times), what's the largest number you can express?

So there :P

Quote:DorothyGaleI think the spirit of the question requires at most one use of any mathematical operation.

For what it's worth, the four fours problem asks you to come up with every number from 1 to 100 using exactly four fours. You are allowed to use the following operators: +, -, *, /, sqrt(), ^, !, and a bar (I don't know how to make one here) that denotes a repeating decimal. You may use them as often as you wish.

For example, you could make 15 as follows: (44/4)+4. I think those are reasonable limitations. The Ackermann function is too esoteric, in my opinion.