OK, I'm not great at math. The Wizard says that a betting system cannot change the house edge, which is why the Martingale cannot work. By the same logic, if the game is positive EV, the Martingale cannot fail.
Here is the question that pretty much sums up why I have such trouble with the Martingale:
Roulette Wheel 1: 37 numbers - 0 - 36.
Roulette Wheel 2: 35 numbers - 1 - 35.
According to the Wizard - if I Martingale on Roulette Wheel 1 with an infinite bankroll, betting 1-18 getting paid even money, I cannot win, because of the house edge and the fact that there can be an infinite streak of 19-36 or 0.
However, if I Martingale on Roulette Wheel 2 with an infinite bankroll, betting 1-18 getting paid even money, I cannot lose, because an infinite streak of 19-35 cannot occur.
I hope this doesn't make me sound stupid, but how is this possible? I'll even take an answer from MKL.
It's late, and I'm sure when I wake up and see all the comments, I'll see how stupid everyone thinks I am. Oh well.
The point is that no matter what you are betting, it is possible for that section or those numbers NOT to hit for an infinite period of time. Infinitely unlikely, but possible. Mathematically, in a +EV game you will be ahead in the "long run" and in a -EV game you will be behind in the "long run" but in real life it is impossible to correctly predict an outcome.
I don't know if I've given you the answer you were looking for but that's the simplest explanation I can come up with. When you casually throw the concept of infinity into a math problem you usually end up with some pretty strange results and they don't always make sense.
If the number of trials is infinite, then an infinite number of losses will still occur at some time (over the course of infinity). Or to look at it another way, given an infinite set of results, some of those results will be a perfect losing record - enough to kill an infinite Martingale.
And thank you so much for accepting "even" my response :)
I'll grant you that with an infinite bankroll and no table limit, there is an extremely small chance that you'll die before finally breaking even.
But that chance does exist.
If you like, you can set infinity aside and consider a much smaller number of spins. Even then, it is not unlikely that you'll Martingale yourself up to the point where you're betting more money than exists in the world.
Hell, if you start with a $1 bet, you only need 50 losses before the bet is so big that the average person can't say it, because they (like me) do not know what comes after "trillion": $1,125,899,906,842,624.
50 losses, while unlikely, is not unrealistic.
Bottom line is, whatever betting system you use, you cannot change the house edge. The size of your bankroll - infinite or not - affects only how soon and how badly you will lose, not the mere fact that you will.
(Edit: I have changed my point of view after giving this some more thought and clarifying the exact terms of the problem. In a process defined so that the player keeps doubling his bet as long as necessary until a given event occurs, and then stops immediately, there is only one possible outcome - a win of 1 unit by the player - and therefore the EV of such a process is +1.)
Nightfly and MKL: "Infinitely unlikely" and impossible mean the same thing. There are no infinitely long streaks.
DJTeddyBear: After the trillion comes the quadrillion :) (seriously - four after three, etc.; you just divide the number of zeroes by three and subtract one - billion has nine zeroes, trillion has twelve, etc...)
Quote: weaselman
Nightfly and MKL: "Infinitely unlikely" and impossible mean the same thing. There are no infinitely long streaks.
Nope, in probability theory an event that is impossible and an event that is possible but has a probability of 0 are not the same thing. That's why, when an event has probability 1 but is not guaranteed, mathematicians say that it happens "almost surely" and not "surely".
Suppose you draw a random number from the set of all real numbers between 0 and 1. Since there are infinitely many real numbers between 0 and 1, every individual number has probability 0 of being picked, but still some number ends up being picked - so this is an example of an event with probability 0 that happens.
My argument that the Martingale doesn't work even with an infinite bankroll is inductive in nature. Let's take single-zero roulette and no maximum bet. For any finite bankroll you will eventually bust out. Let M(b) be the ratio of the expected loss to the expected total amount bet for a bankroll of b. Simulations will show that M(b)=5.26% for b=1, 2, 3, ... googolplex. Since adding one more unit doesn't change the ratio, I claim that by induction we can say that M(b)=5.26% for any b, including infinity.
I hope jfalk sees this thread. We've argued about this years ago, and never came to an agreement about it.
Assuming 5.26% is the House Edge, would that not be for 00 Roulette, and 2.7% for single 0? or have I misunderstood?
Quote: Jufo81
Suppose you draw any random number from the set of all real numbers between 0 and 1. Since there are infinite number of real numbers between 0 and 1, every number has a probability of 0 to be picked but still one number ends up being picked, so this is one example of an event with probability 0 that happens.
Let's not get into non-countable set theory. We'd need at least some calculus and advanced algebra to accurately define the terms.
In the countable (discrete) case, "improbable" and "impossible" mean the same thing.
Speaking about "infinitely long streaks", there is an even simpler explanation, why they can never happen. You would have to wait an infinite amount of time for one to get registered. This is, again, equivalent to saying that it will never happen.
Quote: Wizard
My argument that the Martingale doesn't work even with an infinite bankroll is inductive in nature. Let's take single-zero roulette and no maximum bet. For any finite bankroll you will eventually bust out. Let M(b) be the ratio of the expected loss to the expected total amount bet for a bankroll of b. Simulations will show that M(b)=5.26% for b=1, 2, 3, ... googolplex. Since adding one more unit doesn't change the ratio, I claim that by induction we can say that M(b)=5.26% for any b, including infinity.
I hope jfalk sees this thread. We've argued about this years ago, and never came to an agreement about it.
The induction is applicable to the set of natural numbers. What you have actually proven here is that for any bankroll size that can be expressed as a natural number, the martingale does not work. This says nothing about infinite bankrolls, though, since infinity is not a natural number (strictly speaking, it's not a number at all; some more concrete cardinals - like aleph-null, for example - can be thought of as numbers in a certain extended sense of the word, but even they are not naturals).
Quote: WizardofEngland
But if your bankroll is 'infinity', do you actually win by getting +1 to your bankroll? Surely the +1 is insignificant to the point that it does not exist when you add it to your infinite bankroll.

+1 is only insignificant in the imaginary world where an infinite bankroll exists.
This is a good question - how to define a "win". Perhaps, you can measure changes to the casino's bankroll (which can still be finite), and declare a "win" if it decreases.
Quote: DJTeddyBear
+1 is only insignificant in the imaginary world where an infinite bankroll exists.
That's the one we are talking about though, isn't it?
Quote: weaselman
Let's not get into non-countable set theory. We'd need at least some calculus and advanced algebra to accurately define the terms. In the countable (discrete) case, "improbable" and "impossible" mean the same thing.
Even in countable sets, "improbable" (probability of zero) and "impossible" are not the same thing. On roulette, getting an infinite sequence of reds without ever getting black is "improbable" but still possible, with probability zero. On the other hand, spinning a yellow number is impossible.
Quote: Jufo81
Even in countable sets "improbable" (probability of zero) and impossible are not the same thing.

Yes, they are.
Quote:
On roulette to get an infinite sequence of reds without ever getting black is "improbable" but still possible with probability zero.
"Possible with probability zero" == "Impossible"
Quote:
On the other hand spinning a yellow number is impossible.
If you mean that the yellow number is outside of the domain, yes, this is correct ... And applies equally to the "infinite streak" event - these two are equally impossible.
"Impossible" means, that something will definitely not happen within any finite amount of time, no matter how long you wait for it.
Quote: weaselman
"Possible with probability zero" == "Impossible"
When working at infinity, the usual approach is to work with limits. So the probability is never actually zero there, and there are multiple (infinitely many) "levels of infinity", beating one another. So it's quite possible to arrive at a meaningful result even when dividing infinity by infinity.
I think I have thrown too many people off by bringing gambling/roulette/making money into the picture.
This is the question:
If betting systems do not work then the following is true: If a coin is built that has a 50.0000000001% chance of landing on "Tails", there is a possibility that "Heads" will never show up in an infinite number of tosses. But with that same coin, it's impossible that "Tails" will never show up in an infinite number of tosses.
How does that 0.0000000002% difference between the likelihoods of the events mean that one can occur infinitely, and the other cannot?
The odds of winning 50% of your bankroll before busting are under 66.6% no matter what you do.
The odds of winning 10% of your bankroll before busting are under 90% no matter what you do.
The lower the percentage of your bankroll you are trying to win, the higher the odds of winning are - but if you do it over and over again, the odds get smaller. Using the Martingale until you double your bankroll has a smaller chance of success than betting everything on one hand, since the Martingale exposes more money to the house edge. If your goal is to play once and leave, then it might be OK, but if you want a decent return on your investment, you would have a higher chance of getting it by betting bigger units and using less progression.
Quote: FinsRule
Unsurprisingly to everyone I'm sure, I am not convinced. If this was Yahoo! Answers, I think I would pick the Wizard's as being the best, but the inductive answer just doesn't "feel" right.
I think I have thrown too many people off by bringing gambling/roulette/making money into the picture.
This is the question:
If betting systems do not work then the following is true: If a coin is built that has a 50.0000000001% chance of landing on "Tails", there is a possibility that "Heads" will never show up in an infinite number of tosses. But with that same coin, it's impossible that "Tails" will never show up in an infinite number of tosses.
How does that 0.0000000002% difference between the likelihoods of the events mean that one can occur infinitely, and the other cannot?
This answer is inaccurate. Both have a chance of never showing up in an infinite number of tosses; however the chance of tails never showing up is smaller.
Quote: P90
When working at infinity, the usual approach is to work with limits. So the probability is never actually zero there, and there are multiple (infinitely many) "levels of infinity", beating one another. So it's quite possible to arrive at a meaningful result even when dividing infinity by infinity.
Assuming that Mr. P90 knows what he is talking about - I actually think that this is the answer to my question. It has to be multiple levels of infinity. I'm going to have a tough time explaining this to my friends at the casino when they try to tell me that something is "due" to hit.
Quote: weaselman
Yes, they are.
"Possible with probability zero" == "Impossible"
Like I said, in probability theory these are not the same thing, although for practical purposes it may seem so:
http://en.wikipedia.org/wiki/Almost_surely
Let's ask a different question, which also has to do with infinity.
Suppose you play a coin flipping game where you either win 1 unit or lose 1 unit, and you have an advantage: you win one unit with 51% probability and lose one unit with 49% probability. You always bet only 1 unit and never increase the bet.
If you start with a bankroll of 10 units, what is the probability that you never bust, even over an infinite number of flips?
Someone might say that at infinity you will bust for sure, since you will always have some finite bankroll, and given an infinite amount of time you will sooner or later encounter a losing streak that wipes you out. However, it turns out that the bust probability is not 100% but a smaller value, even over infinite time.
Quote: Croupier
Assuming 5.26% is the House Edge, would that not be for 00 Roulette, and 2.7% for single 0? or have I misunderstood?
D'oh! Of course I meant to say 2.70% for single-zero roulette. Thank you for the correction.
Quote: Wizard
D'oh! Of course I meant to say 2.70% for single-zero roulette. Thank you for the correction.
No problem. If anything I'm more surprised that I spotted it. I must have learned something. No wonder my head hurts.
Quote: weaselman
"Possible with probability zero" == "Impossible"
Actually, no. Consider throwing a dart at any number between 0 and 10, including all irrational numbers. What is the probability of hitting pi? The answer is 1/infinity = 0, but it is still possible.
Quote: Jufo81
Let's ask a different question, which also has to do with infinity.
Suppose you play a coin flipping game where you either win 1 unit or lose 1 unit, and you have an advantage: you win one unit with 51% probability and lose one unit with 49% probability. You always bet only 1 unit and never increase the bet.
If you start with a bankroll of 10 units, what is the probability that you never bust, even over an infinite number of flips?
Someone might say that at infinity you will bust for sure, since you will always have some finite bankroll, and given an infinite amount of time you will sooner or later encounter a losing streak that wipes you out. However, it turns out that the bust probability is not 100% but a smaller value, even over infinite time.
Funny you should ask. I have this problem on my math problems site (number 154), only I use a probability of winning of 60%. In your case the probability of eventual ruin is (.49/.51)^10 = 67.03%.
An unfair game does not have infinite recurrence. There is always some positive probability that you wander down and never return to zero ever. (Again, doubling after losses is irrelevant here.)
The relevant theorems are in Karlin and Taylor, A First Course in Stochastic Processes (2nd ed.) in Chapter 6, but I won't try to summarize them here.
I hope this helps.... but short of giving a seminar in stochastic processes (which I am at best barely capable of doing) this is the central insight.
He is NOT implying an honest vs rigged game.
Quote: Wizard
Actually, no. Consider throwing a dart at any number between 0 and 10, including all irrational numbers. What is the probability of hitting pi? The answer is 1/infinity = 0, but it is still possible.
Yes, that's what I said before - when talking about uncountable sets, things get complicated. But that's beyond the subject of this topic. As long as the set of possible events is countable, every event that can occur has a finite, non-zero probability, and vice versa - anything with a zero probability is impossible.
I think the important question being overlooked is what exactly you mean when you say that "the martingale does not work" with an infinite bankroll. If by "work" you mean "guarantees a win (loss to the casino) of at least 1 unit at some point", then for a martingale with an infinite bankroll it definitely is true (meaning, yes, it will "work" in this sense).
Quote: Wizard
Let me try to address jfalk's post. Let's say the Martingale player always bets on red. I think JF would say that the expected number of reds would be infinite.
You'd have to bet an infinite number of times to get an infinite number of reds, but that's outside the domain of applicable analysis. We can only pose a question of how much up or down the player will be after a finite (however large) number of bets. He will most certainly only see a finite number of reds in that case.
Quote:
You would need to avoid them infinitely for the Martingale player to lose. The probability of never getting a red is "zero." However, I would argue it is a soft zero, that approaches as close as possible to zero, without actually getting there. Like the probability of hitting pi if you threw a dart between 0 and 10. It is that difference between a soft and hard zero that I claim makes the difference.
The probability of never getting a red in a finite (however large) number of spins is not zero, but a fixed non-zero number that gets arbitrarily close to zero as the number of spins grows. If you have infinite time and bankroll, though, the probability of never seeing a red is exactly zero.
Quote: weaselman
If you have infinite time and bankroll though, the probability of never seeing a red is however exactly zero.
To argue that is the point of this thread. I claim otherwise.
Quote: Wizard
To argue that is the point of this thread. I claim otherwise.
Will you also argue that .99999999... (with the 9 repeating infinitely) does not equal 1?
Quote: weaselman
Yes, that's what I said before - when talking about uncountable sets, things get complicated. But that's beyond the subject of this topic. As long as the set of possible events is countable, every event that can occur has a finite, non-zero probability, and vice versa - anything with a zero probability is impossible.
You forget the class of countable but infinite sets. For example, the set of positive integers N = 1, 2, 3, ... is countable and infinite. So countability doesn't mean finite.
Quote: Wizard
To argue that is the point of this thread. I claim otherwise.
Let's try this again. The basic recurrence theorem is that in a fair game (and DJTeddyBear is right about what I mean by fair, i.e. p = 0.5) every node is infinitely recurrent. Thus, pick any finite point: up $10,000; down $15,236; up $1.869 billion. The expected number of times one reaches that point with an infinite bankroll and an infinite number of plays is infinite. The only rule is that you have to be able to reach that point under the strategy you play. Thus, with $1 bets, you can never reach -$887.50.

Playing $1 and doubling on losses allows you to reach any integer number. To see this, realize that you can obviously reach any positive number, and from any positive number you can always reach any negative number by doubling down enough times. One way to reach -$9, for example, is to be up $22 and then lose five times in a row (+21, +19, +15, +7, -9). But that means every possible finite level of bankroll is infinitely recurrent. Since one of those points is 0, that means that if you keep playing, you will hit zero. And you will hit it an infinite number of times.

It is in this sense that a martingale strategy can't win. But it really can't lose in a fair game either. The strategy doesn't make any difference. Note you can have a strategy in which zero is not infinitely recurrent: bet $1 on the first round and then bet $2 every round thereafter. You'll never hit zero, but both +$1 and -$1 will be infinitely recurrent, as will every other odd bankroll.
All of this goes away if p<0.5, since the recurrence relationship goes away.
Quote: cellardoor
Will you also argue that .99999999... (with the 9 repeating infinitely) does not equal 1?

No. To use the Wiz' terminology, it's a soft 1. The difference between soft and hard numbers is due to rounding.
Quote: cellardoor
Will you also argue that .99999999... (with the 9 repeating infinitely) does not equal 1?
As DJ wrote, I would call that a "soft 1." I would equate that to the probability of throwing a dart between 0 and 10, and not hitting pi.
Quote: jfalk
I hate to disagree with you, but there's a simple proof that they're the same thing: let x = .999999... and z = 10x = 9.999999.... Then z - x = 9, and also z - x = 10x - x = 9x, so 9x = 9 and x = 1.
Without addressing that, if I may play the offense for a moment, what would you say is the probability of throwing a dart between 0 and 10 and not hitting pi?
Quote: DJTeddyBear
Hell, if you start with a $1 bet, you only need 50 losses before the bet is so big, that the average person can't say it, because they (like me) do not know what comes after "trillion": $1,125,899,906,842,624.
50 losses, while unlikely, is not unrealistic.
Quadrillion I believe.
Quote: Wizard
Let me try to address jfalk's post. Let's say the Martingale player always bets on red. I think JF would say that the expected number of reds would be infinite. You would need to avoid them infinitely for the Martingale player to lose. The probability of never getting a red is "zero." However, I would argue it is a soft zero, that approaches as close as possible to zero, without actually getting there. Like the probability of hitting pi if you threw a dart between 0 and 10. It is that difference between a soft and hard zero that I claim makes the difference.
I agree with you here. You can approach a number but never truly reach it. It's like saying 1 divided by infinity: that doesn't equal zero, but it approaches zero. And that slight, subtle distinction can't be treated as equivalence.
You could bet red indefinitely, but the outcome could be black indefinitely. The probability of an occurrence should not be confused with an actual occurrence, even if you ram it to infinity.
Quote: jfalk
I hate to disagree with you, but there's a simple proof that they're the same thing: Let x=.999999... Let z=10x=9.9999999.... Then z-x=9 = 10x-x = 9x. So x=1.
Quote: Wizard
Without addressing that, if I may play the offense for a moment, what would you say is the probability of throwing a dart between 0 and 10 and not hitting pi?

It has been a long time since I have studied these things. I think I recall that the discussion about whether .99999... equals 1.0 leads to the conclusion that it approaches 1.0 as a limit as the number of decimals increases without limit. However, you must actually reach infinity (not considered possible) for them to be identically equal.
As for the z=10x argument, I think that going from z=9.9999... to z-x=9 disregards whatever happens as you approach infinity (how many total digits are in z and in x?) and essentially assumes what is to be proved.
I agree with the Wizard's "dart hitting pi" example. However, it is more complex than some of the other examples, since the irrational numbers are uncountably infinite. I think I prefer the example of positive integers mentioned before. That is an infinite set; what is the probability that a randomly chosen positive integer is equal to 21? I think the probability is 1/infinity, or as close to zero as you could perceive it to be, even though a random choice could possibly result in 21.
Edit: Looks as if my points were made while I was typing.
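For reference, the standard way to make the ".99999... = 1" claim rigorous is as the limit of a geometric series - a textbook identity, not something from this thread. The n-digit partial sums are 1 - 10^(-n), and "equals" for an infinite decimal is defined as the limit of those partial sums, which is exactly 1:

```latex
0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
\;=\; \lim_{n\to\infty}\sum_{k=1}^{n} \frac{9}{10^{k}}
\;=\; \lim_{n\to\infty}\bigl(1 - 10^{-n}\bigr) \;=\; 1
```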
Quote: Doc
I agree with the Wizard's "dart hitting pi" example. However, it is more complex than some of the other examples, since the irrational numbers are uncountably infinite. I think I prefer the example of positive integers mentioned before. That is an infinite set; what is the probability that a randomly chosen positive integer is equal to 21? I think the probability is 1/infinity, or as close to zero as you could perceive it to be, even though a random choice could possibly result in 21.
This is true, but the problem is: how do you choose an integer from all positive integers so that each one is equally likely to be picked (i.e. each is picked with probability zero)? What would the picking process be?
Quote: Wizard
To argue that is the point of this thread. I claim otherwise.
Well, mathematically, it is the limit of p^N as N->infinity, which is 0. I don't see very many ways to look at this.
If math isn't convincing, perhaps, you could try running a simulation to see how many times in 10 billion (or whatever) a player with unlimited bankroll will lose? :)
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int iter = 0, win = 0;
    while (1)
    {
        iter++;
        long long br = 0, bet = 1;  /* reset for each Martingale sequence */
        /* win each spin with probability 0.45; double until net result is +1 */
        while (br <= 0) { br += bet * ((double)rand() / RAND_MAX < 0.45 ? 1 : -1); bet *= 2; }
        win++;
        printf("%d Iterations. Win rate: %6.2f%%\n", iter, 100.0 * win / iter);
    }
}
I understand that you have some programming experience. Can you guess what the output of this program will be?
Do you believe it might change at some point if you let it run for very long time?
Quote: weaselman
Well, mathematically, it is the limit of p^N as N->infinity, which is 0. I don't see very many ways to look at this.
If math isn't convincing, perhaps you could try running a simulation to see how many times in 10 billion (or whatever) a player with unlimited bankroll will lose? :)

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int iter = 0, win = 0;
    while (1)
    {
        iter++;
        long long br = 0, bet = 1;  /* reset for each Martingale sequence */
        /* win each spin with probability 0.45; double until net result is +1 */
        while (br <= 0) { br += bet * ((double)rand() / RAND_MAX < 0.45 ? 1 : -1); bet *= 2; }
        win++;
        printf("%d Iterations. Win rate: %6.2f%%\n", iter, 100.0 * win / iter);
    }
}
I understand that you have some programming experience. Can you guess what the output of this program will be?
Do you believe it might change at some point if you let it run for very long time?
Disagree with using 10 billion (a finite sample) to measure the number of times a player with an infinite bankroll will lose. The probability of still losing in this case is (1/2)^(10 billion) - a jaw-numbingly small number. But it's there, however small.
Will the Wizard opine on the following which is related, but a corollary of this discussion:
What is the impact of infinity on the following types of games:
1.) Positive Expectations - win
2.) Neutral -0- Expectations - ??????
3.) Negative Expectations - lose
Quote: jfalk
I would say it is an event of measure one. That means that the event of hitting pi is an event of measure zero. Measure and probability are not quite identical though.
Likewise, I would say the "measure" of the chances of ever getting a red in an infinite roulette game is 1. However, the probability is not 1. If the probability is not a true 1 then the infinite Martingale player might lose.
Quote: Wizard
Likewise, I would say the "measure" of the chances of ever getting a red in an infinite roulette game is 1. However, the probability is not 1. If the probability is not a true 1 then the infinite Martingale player might lose.
You do realize that the game with this hypothetical infinitely long streak would never end, right? If the game is still in progress, how can you conclude that the player lost?
Quote: Asswhoopermcdaddy
Disagree with using a 10 billion (finite sample) to measure the number of times a player with an infinite bankroll will lose. The probability of still losing in this case is (1/2)^10 billion, massive jaw numbing number. But it's there, however small.
No, (1/2)^(10 billion) is the probability that a single iteration (the inner loop in my code snippet) would take that long to complete (if this were a fair game), but that's not what we are talking about.
The probability we are looking for is that the program will print out anything other than 100.00% after however long you want to let it run (an hour, a year, a century, the entire age of the universe, the number of years, equal to the age of universe measured in nanoseconds ... etc., etc.). The probability of this event is exactly 0. You see, unlike the actual game, the program is deterministic - if you read the code carefully, you'll see that there is simply no way for it to print out anything other than 100.00%.
This is another way to prove what is easily seen from the lim(N->inf) p^N = 0 equation I quoted earlier.
Frankly, I simply don't see any ground for disagreement here, so I guess I am just going to have to shut up at this point, because I don't really have anything to add.
Quote: Wizard
Without addressing that, if I may play the offense for a moment, what would you say is the probability of throwing a dart between 0 and 10 and not hitting pi?
Actually, according to Heisenberg's uncertainty principle, the dart is much bigger than the point pi and takes up a finite space between 0 and 10, and therefore the probability of hitting pi can be measured. If the dart were extremely small, then measuring the position of the dart would affect the measurement itself, and therefore you could only say with some percentage of certainty that the dart hit pi.
Fun, Wow!
But we're talking about a wheel and numbers. I posit that over an infinite number of spins, the distribution must approach the theoretical probabilities according to the real odds. That is, if you have an infinite bankroll and an infinite number of spins, betting flat, you will lose exactly 2/38 of the total amount you bet. The law of large numbers would predict this.
In any case, infinity really means nothing. Everything in this experiment is finite and measurable.
Quote: weaselman
If the game is still in progress, how can you conclude that the player lost?
Well, without reading the rest of the thread, that makes sense to me. Raising the bet is an open ended process, and the game never ends. However if the player hits his mark he's back to start over or quit.
Now I should read the rest of the thread. Hah.