FinsRule
FinsRule
  • Threads: 129
  • Posts: 3945
Joined: Dec 23, 2009
January 23rd, 2011 at 11:04:39 PM permalink
Quote from Wizard: "... I claim that even with an infinite amount of money that the Martingale does not guarantee a profit. Let's make a separate thread if anyone wishes to take this further. However, I warn you that it will boil down to paradoxical nature of infinity."

Ok, I'm not great at math. The Wizard says that a betting system cannot change the house edge, so that is why the Martingale cannot work. Meaning if the game is positive EV, the Martingale cannot fail.

Here is the question that pretty much sums up why I have such trouble with the Martingale:

Roulette Wheel 1: 37 numbers (0–36).
Roulette Wheel 2: 35 numbers (1–35).

According to the Wizard, if I Martingale on Roulette Wheel 1 with an infinite bankroll, betting 1-18 and getting paid even money, I cannot win because of the house edge, and the fact that there can be an infinite run of 19-36 or 0.

However, if I Martingale on Roulette Wheel 2 with an infinite bankroll, betting 1-18 getting paid even money, I cannot lose, because numbers 19-35 cannot occur infinitely.

I hope this doesn't make me sound stupid, but how is this possible? I'll even take an answer from MKL.

It's late, and I'm sure when I wake up and see all the comments, I'll see how stupid everyone thinks I am. Oh well.
TheNightfly
TheNightfly
  • Threads: 23
  • Posts: 480
Joined: May 21, 2010
January 23rd, 2011 at 11:27:11 PM permalink
The simple answer is that however unlikely it may be, it is possible for the numbers 19-35 to occur for an infinite period of time. It is possible for the same number to repeat an infinite number of times. It is possible for black numbers to be spun consecutively an infinite number of times.

The point is that no matter what you are betting, it is possible for that section or those numbers NOT to hit for an infinite period of time. Infinitely unlikely, but possible. Mathematically, in a +EV game you will be ahead in the "long run" and in a -EV game you will be behind in the "long run" but in real life it is impossible to correctly predict an outcome.

I don't know if I've given you the answer you were looking for but that's the simplest explanation I can come up with. When you casually throw the concept of infinity into a math problem you usually end up with some pretty strange results and they don't always make sense.
Happiness is underrated
mkl654321
mkl654321
  • Threads: 65
  • Posts: 3412
Joined: Aug 8, 2010
January 24th, 2011 at 12:23:11 AM permalink
Whatever the parameters for the Martingale, there is an X number of losses that will kill it. With a finite bankroll and/or a finite house limit, that number is finite, no matter how large it might be, and eventually a losing streak of X or greater will occur.

If X is infinite, then an infinite number of losses will still occur at some time (over the course of infinity). Or to look at it another way, given an infinite set of results, some of those results will be a perfect losing record--enough to kill an infinite Martingale.

And thank you so much for accepting "even" my response :)
The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one. The happiness of credulity is a cheap and dangerous quality.---George Bernard Shaw
HKrandom
HKrandom
  • Threads: 18
  • Posts: 130
Joined: Oct 1, 2010
January 24th, 2011 at 4:32:51 AM permalink
Even if it did work, why would you want to risk an infinite amount of money to gain one unit? Your bankroll would only increase by (100/infinity)% every time you won; even if the odds were fair you would still be better off putting your money in the bank...
DJTeddyBear
DJTeddyBear
  • Threads: 210
  • Posts: 11062
Joined: Nov 2, 2009
January 24th, 2011 at 4:54:44 AM permalink
"Cannot occur infinitely"? Define "infinite".


I'll grant you that with an infinite bankroll and no table limit, there is an extremely small chance that you'll die before finally breaking even.

But that chance does exist.


If you like, you can consider a much smaller number of spins. While the chance is still extremely small, it is not impossible that you'll Martingale yourself up to the point where you're betting more money than exists in the world.

Hell, if you start with a $1 bet, you only need 50 losses before the bet is so big that the average person can't say it, because they (like me) do not know what comes after "trillion": $1,125,899,906,842,624.


50 losses, while unlikely, is not unrealistic.
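For anyone who wants to check that figure, the doubling is easy to sketch (a hypothetical snippet; the function name is mine): starting from a $1 bet, the wager after n consecutive losses is 2^n.

```python
# Martingale bet size after n consecutive losses, starting from a $1 base bet.
def bet_after_losses(n: int, base: int = 1) -> int:
    return base * 2 ** n

print(bet_after_losses(50))  # 1125899906842624 -- the figure quoted above
# Total staked across those 51 bets (the 50 losses plus the current one)
# is 2**51 - 1, roughly twice as much again.
print(sum(bet_after_losses(k) for k in range(51)))
```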
I invented a few casino games. Info: http://www.DaveMillerGaming.com/ ————————————————————————————————————— Superstitions are silly, childish, irrational rituals, born out of fear of the unknown. But how much does it cost to knock on wood? 😁
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 5:09:41 AM permalink
I don't know what paradoxes the Wizard had in mind, but the answer is really pretty simple. Every now and then you get a streak that wipes out your winnings. On average, in an even-money game the streak happens just often enough to exactly cancel all that you win in between; if a game is -EV, then your streaks will cost you on average all your winnings minus the EV, and with +EV it is the other way around.
Bottom line is, whatever betting system you use, you cannot change the house edge. The size of your bankroll - infinite or not - affects only how soon and how badly you will lose, not the mere fact that you will.

(Edit: I have changed my point of view after giving this some more thought and clarifying the exact terms of the problem. In a process defined so that the player keeps doubling his bet as long as necessary until a given event occurs, and then stops immediately, there is only one possible outcome - a win of 1 unit by the player - and therefore the EV of such a process is +1).

Nightfly and MKL: "Infinitely unlikely" and "impossible" mean the same thing. There are no infinitely long streaks.
DJTeddyBear: After the trillion comes the quadrillion :) (seriously - "quad" is four after "tri" is three, etc.; you just divide the number of zeroes by three and subtract 1 - a billion has nine zeroes, a trillion has twelve, and so on...)
"When two people always agree one of them is unnecessary"
Jufo81
Jufo81
  • Threads: 6
  • Posts: 344
Joined: May 23, 2010
January 24th, 2011 at 5:32:12 AM permalink
Quote: weaselman


Nightfly and MKL: "Infinitely unlikely" and impossible means the same thing. There are no infinitely long streaks.



Nope, in probability theory an event that is impossible and an event which is possible but has a probability of 0 are not the same thing. That's why when an event has probability 1, mathematicians say that the event happens "almost surely" and not "surely".

Suppose you draw a random number from the set of all real numbers between 0 and 1. Since there are infinitely many real numbers between 0 and 1, every number has a probability of 0 of being picked, but still one number ends up being picked, so this is one example of an event with probability 0 that happens.
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 5:40:09 AM permalink
My argument that the Martingale doesn't work even with an infinite bankroll is inductive in nature. Let's take single-zero roulette and no maximum bet. For any finite bankroll you will eventually bust out. Let M(b) be the ratio of the expected loss to expected total amount bet for a bankroll of b. Simulations will show that M(b)=2.70% for b=1, 2, 3, ... googolplex. Since adding one more unit doesn't change the ratio, I claim that by induction we can say that M(b)=2.70% for any b, including infinity.
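The kind of simulation referred to can be sketched like this (an illustrative stand-in, not the actual program behind the figure): a Martingale bettor on red at single-zero roulette plays a finite bankroll of b units to ruin, and we take total loss over total amount wagered across many sessions.

```python
import random

def martingale_session(b: int, rng: random.Random) -> int:
    """Play one Martingale-on-red session to ruin; return total amount wagered."""
    bankroll, bet, wagered = b, 1, 0
    while bankroll > 0:
        stake = min(bet, bankroll)      # can't wager more than remains
        wagered += stake
        if rng.random() < 18 / 37:      # red hits on a single-zero wheel
            bankroll += stake
            bet = 1                     # streak survived: reset to base bet
        else:
            bankroll -= stake
            bet *= 2                    # double after a loss
    return wagered

rng = random.Random(1)
sessions, b = 10_000, 10
total_wagered = sum(martingale_session(b, rng) for _ in range(sessions))
ratio = sessions * b / total_wagered    # every session ends down exactly b
print(f"loss/wagered = {ratio:.4f} vs. house edge 1/37 = {1/37:.4f}")
```

The ratio should come out near 1/37 = 2.70% regardless of b, which is the inductive step in the argument.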

I hope jfalk sees this thread. We've argued about this years ago, and never came to an agreement about it.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
Croupier
Croupier
  • Threads: 58
  • Posts: 1258
Joined: Nov 15, 2009
January 24th, 2011 at 5:50:35 AM permalink
Quote: Wizard

My argument that the Martingale doesn't work even with an infinite bankroll is inductive in nature. Let's take single-zero roulette and no maximum bet. For any finite bankroll you will eventually bust out. Let M(b) be the ratio of the expected loss to expected total amount bet for a bankroll of b. Simulations will show that M(b)=5.26% for b=1, 2, 3, ... googolplex. Since adding one more unit doesn't change the ratio, I claim that by induction we can say that M(b)=5.26% for any b, including infinity.

I hope jfalk sees this thread. We've argued about this years ago, and never came to an agreement about it.



Assuming 5.26% is the house edge, would that not be for 00 roulette, with 2.7% for single 0? Or have I misunderstood?
[This space is intentionally left blank]
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 6:51:16 AM permalink
Quote: Jufo81


Suppose you draw any random number from the set of all real numbers between 0 and 1. Since there are infinite number of real numbers between 0 and 1, every number has a probability of 0 to be picked but still one number ends up being picked, so this is one example of an event with probability 0 that happens.



Let's not get into the theory of uncountable sets. We'd need at least some calculus and advanced algebra to accurately define the terms.
In the countable (discrete) case, "improbable" and "impossible" mean the same thing.

Speaking of "infinitely long streaks", there is an even simpler explanation of why they can never happen. You would have to wait an infinite amount of time for one to be registered. This is, again, equivalent to saying that it will never happen.
"When two people always agree one of them is unnecessary"
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 6:56:39 AM permalink
Quote: Wizard

My argument that the Martingale doesn't work even with an infinite bankroll is inductive in nature. Let's take single-zero roulette and no maximum bet. For any finite bankroll you will eventually bust out. Let M(b) be the ratio of the expected loss to expected total amount bet for a bankroll of b. Simulations will show that M(b)=5.26% for b=1, 2, 3, ... googolplex. Since adding one more unit doesn't change the ratio, I claim that by induction we can say that M(b)=5.26% for any b, including infinity.

I hope jfalk sees this thread. We've argued about this years ago, and never came to an agreement about it.



The induction is applicable to the set of natural numbers. What you have actually proven here is that for any bankroll size that can be expressed as a natural number, the Martingale does not work. This says nothing about infinite bankrolls though, since infinity is not a natural number (strictly speaking, it's not a number at all, but some more concrete cardinals - like aleph-null, for example - can be thought of as numbers in a certain extended sense of the word; still, they are not naturals).
"When two people always agree one of them is unnecessary"
WizardofEngland
WizardofEngland
  • Threads: 61
  • Posts: 638
Joined: Nov 2, 2010
January 24th, 2011 at 7:06:37 AM permalink
But if your bankroll is 'infinity', do you actually win by getting +1 to your bankroll?

Surely the +1 is insignificant to the point that it does not exist when you add it to your infinite bankroll
http://wizardofvegas.com/forum/off-topic/general/10042-woes-black-sheep-game-ii/#post151727
DJTeddyBear
DJTeddyBear
  • Threads: 210
  • Posts: 11062
Joined: Nov 2, 2009
January 24th, 2011 at 7:30:16 AM permalink
Quote: WizardofEngland

But if your bankroll is 'infinity', do you actually win by getting +1 to your bankroll?

Surely the +1 is insignificant to the point that it does not exist when you add it to your infinite bankroll

+1 is only insignificant in the imaginary world where an infinite bankroll exists.
I invented a few casino games. Info: http://www.DaveMillerGaming.com/ ————————————————————————————————————— Superstitions are silly, childish, irrational rituals, born out of fear of the unknown. But how much does it cost to knock on wood? 😁
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 7:32:08 AM permalink
Quote: WizardofEngland

But if your bankroll is 'infinity', do you actually win by getting +1 to your bankroll?

Surely the +1 is insignificant to the point that it does not exist when you add it to your infinite bankroll



This is a good question - how to define a "win". Perhaps, you can measure changes to the casino's bankroll (which can still be finite), and declare a "win" if it decreases.

Quote: DJTeddyBear

+1 is only insignificant in the imaginary world where an infinite bankroll exists.



That's the one we are talking about though, isn't it?
"When two people always agree one of them is unnecessary"
Jufo81
Jufo81
  • Threads: 6
  • Posts: 344
Joined: May 23, 2010
January 24th, 2011 at 7:41:19 AM permalink
Quote: weaselman

Let's not get into the theory of uncountable sets. We'd need at least some calculus and advanced algebra to accurately define the terms.
In the countable (discrete) case, "improbable" and "impossible" mean the same thing.



Even in countable sets, "improbable" (probability of zero) and impossible are not the same thing. In roulette, getting an infinite sequence of reds without ever getting black is "improbable" but still possible with probability zero. On the other hand, spinning a yellow number is impossible.
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 7:49:17 AM permalink
Quote: Jufo81

Even in countable sets "improbable" (Probability of zero) and impossible are not the same thing.



Yes, they are.

Quote:

On roulette to get an infinite sequence of reds without ever getting black is "improbable" but still possible with probability zero.



"Possible with probability zero" == "Impossible"

Quote:

On the other hand spinning a yellow number is impossible.


If you mean that the yellow number is outside of the domain, yes, this is correct ... And applies equally to the "infinite streak" event - these two are equally impossible.

"Impossible" means, that something will definitely not happen within any finite amount of time, no matter how long you wait for it.
"When two people always agree one of them is unnecessary"
P90
P90
  • Threads: 12
  • Posts: 1703
Joined: Jan 8, 2011
January 24th, 2011 at 7:56:23 AM permalink
Quote: weaselman

"Possible with probability zero" == "Impossible"


When working at infinity, the usual policy is to work with limits. So there it is never actually zero, and there are multiple (infinitely many) "levels of infinity", beating one another. So it's quite possible to arrive at a meaningful result even when dividing infinity by infinity.
Resist ANFO Boston PRISM Stormfront IRA Freedom CIA Obama
FinsRule
FinsRule
  • Threads: 129
  • Posts: 3945
Joined: Dec 23, 2009
January 24th, 2011 at 7:56:33 AM permalink
Unsurprisingly to everyone I'm sure, I am not convinced. If this was Yahoo! Answers, I think I would pick the Wizard's as being the best, but the inductive answer just doesn't "feel" right.

I think I have thrown too many people off by bringing gambling/roulette/making money into the picture.

This is the question:

If betting systems do not work, then the following is true: if a coin is built that has a 50.0000000001% chance of landing on "Tails", there is a possibility that "Heads" will never show up in an infinite number of tosses. But with that same coin, it's impossible that "Tails" will never show up in an infinite number of tosses.

How does that .0000000002% difference between the likelihood of the events mean that one can occur infinitely, and the other cannot?
HKrandom
HKrandom
  • Threads: 18
  • Posts: 130
Joined: Oct 1, 2010
January 24th, 2011 at 7:56:45 AM permalink
The odds of winning 100% of your bankroll (therefore doubling it) before busting are under 50% no matter what you do.
The odds of winning 50% of your bankroll before busting are under 66.6% no matter what you do.
The odds of winning 10% of your bankroll before busting are under 90% no matter what you do.
The lower the percentage of your bankroll you are trying to win, the higher the odds of winning are, but if you do it over and over again the odds get smaller. Using the Martingale until you double your bankroll has a smaller success chance than betting everything on one hand, since the Martingale exposes more money to the house edge. If your goal is to play once and leave then it might be OK, but if you want a decent return on your investment you would have a higher chance of getting it by betting bigger units and using less progression.
HKrandom
HKrandom
  • Threads: 18
  • Posts: 130
Joined: Oct 1, 2010
January 24th, 2011 at 7:58:27 AM permalink
Quote: FinsRule

Unsurprisingly to everyone I'm sure, I am not convinced. If this was Yahoo! Answers, I think I would pick the Wizard's as being the best, but the inductive answer just doesn't "feel" right.

I think I have thrown too many people off by bringing gambling/roulette/making money into the picture.

This is the question:

If betting systems do not work, then the following is true: if a coin is built that has a 50.0000000001% chance of landing on "Tails", there is a possibility that "Heads" will never show up in an infinite number of tosses. But with that same coin, it's impossible that "Tails" will never show up in an infinite number of tosses.

How does that .0000000002% difference between the likelihood of the events mean that one can occur infinitely, and the other cannot?



That premise is inaccurate. Both have a chance of never showing up in an infinite number of tosses; however, the chance of tails never showing up is smaller.
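To make that concrete (a quick sketch of my own; the probabilities follow directly from the independence of the tosses): the chance that one side never appears in n tosses is the other side's single-toss probability raised to the n-th power, and both chances shrink toward zero as n grows, with the tails-never chance always the slightly smaller one.

```python
# Chance that one side never appears in n tosses of the slightly biased coin
# from the question (P(tails) = 50.0000000001%).
p_tails = 0.500000000001

for n in (10, 100, 1000):
    no_heads = p_tails ** n        # all n tosses come up tails
    no_tails = (1 - p_tails) ** n  # all n tosses come up heads
    print(n, no_heads, no_tails)
```

Neither probability is ever exactly zero at any finite n, which is the point of the correction above.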
FinsRule
FinsRule
  • Threads: 129
  • Posts: 3945
Joined: Dec 23, 2009
January 24th, 2011 at 8:02:53 AM permalink
Quote: P90

When working at infinity, the usual policy is to work with limit function. So there it is never actually zero, and there are multiple (infinitely so) "levels of infinity", beating one another. So it's well possible to arrive to a meaningful result even dividing infinity by infinity.



Assuming that Mr. P90 knows what he is talking about - I actually think that this is the answer to my question. It has to be multiple levels of infinity. I'm going to have a tough time explaining this to my friends at the casino when they try to tell me that something is "due" to hit.
Jufo81
Jufo81
  • Threads: 6
  • Posts: 344
Joined: May 23, 2010
January 24th, 2011 at 8:05:00 AM permalink
Quote: weaselman

yes, it is.
"Possible with probability zero" == "Impossible"



Like I said, in probability theory these are not the same thing, although for practical purposes it may seem so:

http://en.wikipedia.org/wiki/Almost_surely
Jufo81
Jufo81
  • Threads: 6
  • Posts: 344
Joined: May 23, 2010
January 24th, 2011 at 8:10:28 AM permalink
Let's ask a different question, which also has to do with infinity.

Suppose you play a coin-flipping game where you either win 1 unit or lose 1 unit, and you have an advantage so that you win one unit with 51% probability and lose one unit with 49% probability. You always bet only 1 unit and never increase the bet.

If you start with a bankroll of 10 units, what is the probability that you never bust, even over an infinite number of flips?

Someone might say that you will bust for sure, since you will always have some finite bankroll and, given an infinite amount of time, you will sooner or later encounter a losing streak that wipes you out. However, it turns out that the bust probability is not 100% but a fixed value strictly less than 1, even over infinite play.
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 8:55:41 AM permalink
Quote: Croupier

Assuming 5.26% is the House Edge, would that not be for 00 Roulette, and 2.7% for single 0? or have I misunderstood?



D'oh! Of course I meant to say 2.70% for single-zero roulette. Thank you for the correction.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
Croupier
Croupier
  • Threads: 58
  • Posts: 1258
Joined: Nov 15, 2009
January 24th, 2011 at 8:58:52 AM permalink
Quote: Wizard

D'oh! Of course I meant to say 2.70% for single-zero roulette. Thank you for the correction.



No problem. If anything I'm more surprised that I spotted it. I must have learned something. No wonder my head hurts.
[This space is intentionally left blank]
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 9:11:11 AM permalink
Quote: weaselman

"Possible with probability zero" == "Impossible"



Actually, no. Consider throwing a dart at any number between 0 and 10, including all irrational numbers. What is the probability of hitting pi? The answer is 1/infinity = 0, but it is still possible.





Quote: Jufo81

Let's ask a different question, which also has to do with infinity.

Suppose you play a coin-flipping game where you either win 1 unit or lose 1 unit, and you have an advantage so that you win one unit with 51% probability and lose one unit with 49% probability. You always bet only 1 unit and never increase the bet.

If you start with a bankroll of 10 units, what is the probability that you never bust, even over an infinite number of flips?

Someone might say that you will bust for sure, since you will always have some finite bankroll and, given an infinite amount of time, you will sooner or later encounter a losing streak that wipes you out. However, it turns out that the bust probability is not 100% but a fixed value strictly less than 1, even over infinite play.



Funny you should ask. I have this problem on my math problems site (number 154), only I use a probability of winning of 60%. In your case the probability of eventual ruin is (.49/.51)^10 = 67.03%.
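The closed form behind that 67.03% is the standard gambler's-ruin result: with win probability p > 1/2 and flat one-unit bets, the probability of ever busting a bankroll of b units is (q/p)^b, where q = 1 - p. A quick sketch (the function name is mine):

```python
def ruin_probability(p: float, bankroll: int) -> float:
    """Chance a flat one-unit bettor with `bankroll` units ever goes broke,
    given win probability p > 0.5 per even-money bet (gambler's ruin)."""
    q = 1.0 - p
    return (q / p) ** bankroll

print(f"{ruin_probability(0.51, 10):.2%}")  # 67.03%, matching the figure above
for b in (1, 10, 50):                       # ruin gets rarer as b grows
    print(b, ruin_probability(0.51, b))
```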
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
jfalk
jfalk
  • Threads: 2
  • Posts: 29
Joined: Sep 2, 2010
January 24th, 2011 at 10:23:37 AM permalink
OK... When the Wizard calls, I try to answer. The basic answer to the underlying question is as follows: a fair game has infinite recurrence. What that means is that with an infinite bankroll playing a fair game, one is guaranteed to hit zero an infinite number of times. This result is not really affected by whether you double after losses or not, but the math to show the latter is a little trickier. So while your bankroll can wander as high as you care to name or as low as you care to name, eventually you are always expected to come back to zero, at which point you're back where you started. What's confusing about this is that the expectation in a fair game is always equal to wherever you are, so once you are up 1 unit, your expectation is that you will stay up one unit, and the state "up one unit" will also infinitely recur.

An unfair game does not have infinite recurrence. There is always some positive probability that you wander down and never return to zero ever. (Again, doubling after losses is irrelevant here.)

The relevant theorems are in Karlin and Taylor, A First Course in Stochastic Processes (2nd ed.) in Chapter 6, but I won't try to summarize them here.

I hope this helps.... but short of giving a seminar in stochastic processes (which I am at best barely capable of doing) this is the central insight.
DJTeddyBear
DJTeddyBear
  • Threads: 210
  • Posts: 11062
Joined: Nov 2, 2009
January 24th, 2011 at 11:13:06 AM permalink
For what it's worth, I'm fairly certain that jfalk's use of the terms "Fair" and "Unfair" is shorthand for a zero house edge game vs a game where the house has an edge.

He is NOT implying an honest vs rigged game.
I invented a few casino games. Info: http://www.DaveMillerGaming.com/ ————————————————————————————————————— Superstitions are silly, childish, irrational rituals, born out of fear of the unknown. But how much does it cost to knock on wood? 😁
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 11:22:19 AM permalink
Quote: Wizard

Actually, no. Consider throwing a dart at any number between 0 and 10, including all irrational numbers. What is the probability of hitting pi? The answer is 1/infinity = 0, but it is still possible.



Yes, that's what I said before - when talking about uncountable sets, things get complicated. But that's beyond the subject of this topic. As long as the set of possible events is countable, every event that can occur has a finite, non-zero probability, and vice versa - anything with a zero probability is impossible.

I think the important question that's being overlooked is what exactly you mean when you say that "the Martingale does not work" with an infinite bankroll. If by "work" you mean "guarantees a win (loss to the casino) of at least 1 unit at some point", then for the Martingale with an infinite bankroll it definitely is true (meaning, yes, it will "work" in this sense).
"When two people always agree one of them is unnecessary"
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 11:22:55 AM permalink
Let me try to address jfalk's post. Let's say the Martingale player always bets on red. I think JF would say that the expected number of reds would be infinite. You would need to avoid them infinitely for the Martingale player to lose. The probability of never getting a red is "zero." However, I would argue it is a soft zero, that approaches as close as possible to zero, without actually getting there. Like the probability of hitting pi if you threw a dart between 0 and 10. It is that difference between a soft and hard zero that I claim makes the difference.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 11:35:43 AM permalink
Quote: Wizard

Let me try to address jfalk's post. Let's say the Martingale player always bets on red. I think JF would say that the expected number of reds would be infinite.



You'd have to bet an infinite number of times to get an infinite number of reds, but that's outside the domain of applicable analysis. We can only pose a question of how much up or down the player will be after a finite (however large) number of bets. He will most certainly only see a finite number of reds in that case.


Quote:

You would need to avoid them infinitely for the Martingale player to lose. The probability of never getting a red is "zero." However, I would argue it is a soft zero, that approaches as close as possible to zero, without actually getting there. Like the probability of hitting pi if you threw a dart between 0 and 10. It is that difference between a soft and hard zero that I claim makes the difference.


The probability of never getting a red after a finite (however large) number of spins is not zero, but a fixed non-zero number, getting arbitrarily close to zero as the number of spins grows. If you have infinite time and bankroll, though, the probability of never seeing a red is exactly zero.
"When two people always agree one of them is unnecessary"
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 11:40:10 AM permalink
Quote: weaselman

If you have infinite time and bankroll, though, the probability of never seeing a red is exactly zero.



To argue that is the point of this thread. I claim otherwise.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
cellardoor
cellardoor
  • Threads: 1
  • Posts: 70
Joined: Jan 4, 2011
January 24th, 2011 at 11:45:05 AM permalink
Quote: Wizard

To argue that is the point of this thread. I claim otherwise.



Will you also argue that .99999999... (with the 9 repeating infinitely) does not equal 1?
Jufo81
Jufo81
  • Threads: 6
  • Posts: 344
Joined: May 23, 2010
January 24th, 2011 at 11:59:05 AM permalink
Quote: weaselman

Yes, that's what I said before - when talking about uncountable sets, things get complicated. But that's beyond the subject of this topic. As long as the set of possible events is countable, every event that can occur has a finite, non-zero probability, and vice versa - anything with a zero probability is impossible.



You forget the class of countably infinite sets. For example, the set of positive integers N = 1, 2, 3, ... is countable and infinite. So countability doesn't mean finiteness.
jfalk
jfalk
  • Threads: 2
  • Posts: 29
Joined: Sep 2, 2010
January 24th, 2011 at 12:00:06 PM permalink
Quote: Wizard

To argue that is the point of this thread. I claim otherwise.



Let's try this again. The basic recurrence theorem is that in a fair game (and DJTeddybear is right about what I mean by fair, i.e., p=0.5) every node is infinitely recurrent. Thus, pick any finite point: up $10,000; down $15,236; up $1.869 billion. The expected number of times one reaches that point with an infinite bankroll and an infinite number of plays is infinite. The only rule is that you have to be able to reach that point under the strategy you play. Thus, with $1 bets, you can never reach -$887.50. Playing $1 and doubling on losses allows you to reach any integer number. To see this, realize that you can obviously reach any positive number, and from any positive number you can always reach any negative number by doubling down enough times. One way to reach -$9, for example, is to be up $22 and then lose five times in a row (+21, +19, +15, +7, -9). But that means every possible finite level of bankroll is infinitely recurrent. Since one of those points is 0, that means that if you keep playing, you will hit zero. And you will hit it an infinite number of times. It is in this sense that a martingale strategy can't win. But it really can't lose in a fair game either. The strategy doesn't make any difference. Note you can have a strategy in which zero is not infinitely recurrent: bet $1 on the first round and then bet $2 every round thereafter. You'll never hit zero, but both +$1 and -$1 will be infinitely recurrent, as will every other odd bankroll.

All of this goes away if p<0.5, since the recurrence relationship goes away.
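The -$9 example is easy to check mechanically (a trivial sketch; variable names are mine): start up $22 with a $1 bet and double after each loss.

```python
# From +$22, five consecutive losses with doubling trace out jfalk's path.
bankroll, bet, path = 22, 1, []
for _ in range(5):
    bankroll -= bet   # lose this round's bet
    path.append(bankroll)
    bet *= 2          # double for the next round
print(path)  # [21, 19, 15, 7, -9]
```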
DJTeddyBear
DJTeddyBear
  • Threads: 210
  • Posts: 11062
Joined: Nov 2, 2009
January 24th, 2011 at 12:21:51 PM permalink
Quote: cellardoor

Will you also argue that .99999999... (with the 9 repeating infinitely) does not equal 1?

No. To use the Wiz' terminology, it's a soft 1.

The difference between soft and hard numbers is due to rounding.
I invented a few casino games. Info: http://www.DaveMillerGaming.com/ ————————————————————————————————————— Superstitions are silly, childish, irrational rituals, born out of fear of the unknown. But how much does it cost to knock on wood? 😁
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 12:28:51 PM permalink
Quote: cellardoor

Will you also argue that .99999999... (with the 9 repeating infinitely) does not equal 1?



As DJ wrote, I would call that a "soft 1." I would equate that to the probability of throwing a dart between 0 and 10, and not hitting pi.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
jfalk
jfalk
  • Threads: 2
  • Posts: 29
Joined: Sep 2, 2010
January 24th, 2011 at 12:30:53 PM permalink
I hate to disagree with you, but there's a simple proof that they're the same thing: let x = 0.999999.... Then 10x = 9.999999..., so 10x - x = 9x = 9, and therefore x = 1.
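The same identity also drops out of the geometric series, for anyone who prefers that route:

```latex
0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
\;=\; \frac{9}{10}\cdot\frac{1}{1 - \tfrac{1}{10}} \;=\; 1
```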
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 12:47:43 PM permalink
Quote: jfalk

I hate to disagreee with you, but there's a simple proof that they're the same thing: Let x=.999999... Let z=10x=9.9999999.... Then z-x=9 = 10x-x = 9x. So x=1.



Without addressing that, if I may play the offense for a moment, what would you say is the probability of throwing a dart between 0 and 10 and not hitting pi?
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
Asswhoopermcdaddy
Asswhoopermcdaddy
  • Threads: 88
  • Posts: 570
Joined: Nov 30, 2009
January 24th, 2011 at 1:17:05 PM permalink
Quote: DJTeddyBear


Hell, if you start with a $1 bet, you only need 50 losses before the bet is so big, that the average person can't say it, because they (like me) do not know what comes after "trillion": $1,125,899,906,842,624.


50 losses, while unlikely, is not unrealistic.



Quadrillion I believe.
jfalk
jfalk
  • Threads: 2
  • Posts: 29
Joined: Sep 2, 2010
January 24th, 2011 at 1:18:30 PM permalink
I would say it is an event of measure one. That means that the event of hitting pi is an event of measure zero. Measure and probability are not quite identical though.
Asswhoopermcdaddy
Asswhoopermcdaddy
  • Threads: 88
  • Posts: 570
Joined: Nov 30, 2009
January 24th, 2011 at 1:24:53 PM permalink
Quote: Wizard

Let me try to address jfalk's post. Let's say the Martingale player always bets on red. I think JF would say that the expected number of reds would be infinite. You would need to avoid them infinitely for the Martingale player to lose. The probability of never getting a red is "zero." However, I would argue it is a soft zero, that approaches as close as possible to zero, without actually getting there. Like the probability of hitting pi if you threw a dart between 0 and 10. It is that difference between a soft and hard zero that I claim makes the difference.



I agree with you here. You can approach a number but never truly reach it. It's like saying 1 divided by infinity. Well, that doesn't equal -0-, but it approaches zero. And the slight subtle distinction can't be treated equivalently.

You could bet red indefinitely, but the outcome could be black indefinitely. The probability of an occurrence should not be confused with an actual occurrence, even if you ram it to infinity.
Doc
Doc
  • Threads: 46
  • Posts: 7287
Joined: Feb 27, 2010
January 24th, 2011 at 1:24:57 PM permalink
Quote: Wizard

Quote: jfalk

I hate to disagree with you, but there's a simple proof that they're the same thing: Let x=.999999... and let z=10x=9.999999.... Then z-x=9, but also z-x=10x-x=9x. So 9x=9, and x=1.

Without addressing that, if I may play the offense for a moment, what would you say is the probability of throwing a dart between 0 and 10 and not hitting pi?

It has been a long time since I have studied these things. I think I recall the discussion about whether .99999... equals 1.0 leads to the conclusion that it approaches 1.0 as a limit as the number of decimals increases without limit. However, you must actually reach infinity (not considered possible) for them to be identically equal.

As for the z=10x argument, I think that going from z=9.9999... to z-x=9 disregards whatever happens as you approach infinity (how many total digits are in z and in x?) and essentially assumes what is to be proved.

I agree with the Wizard's "dart hitting pi" example. However, it is more complex than some of the other examples, since the irrational numbers are uncountably infinite. I think I prefer the example of positive integers mentioned before. That is an infinite set; what is the probability that a randomly chosen positive integer is equal to 21? I think the probability is 1/infinity, or as close to zero as you could perceive it to be, even though a random choice could possibly result in 21.

Edit: Looks as if my points were made while I was typing.
Jufo81
Jufo81
  • Threads: 6
  • Posts: 344
Joined: May 23, 2010
January 24th, 2011 at 1:33:46 PM permalink
Quote: Doc


I agree with the Wizard's "dart hitting pi" example. However, it is more complex than some of the other examples, since the irrational numbers are uncountably infinite. I think I prefer the example of positive integers mentioned before. That is an infinite set; what is the probability that a randomly chosen positive integer is equal to 21? I think the probability is 1/infinity, or as close to zero as you could perceive it to be, even though a random choice could possibly result in 21.



This is true, but the problem is: how do you choose an integer from all positive integers so that each one is equally likely to be picked (ie. each is picked with probability zero)? What would the picking process be?
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 1:37:13 PM permalink
Quote: Wizard

To argue that is the point of this thread. I claim otherwise.


Well, mathematically, it is limN->inf(p^N), which is 0. I don't see very many ways to look at this.
If math isn't convincing, perhaps, you could try running a simulation to see how many times in 10 billion (or whatever) a player with unlimited bankroll will lose? :)



#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    long iter = 0, win = 0;
    while (1)
    {
        iter++;
        /* fresh session each iteration: net position 0, base bet 1 */
        long long br = 0, bet = 1;
        while (br <= 0)
        {
            /* rand() returns an int, so scale it; lose with probability 0.45 */
            br += bet * (((double)rand() / RAND_MAX) < 0.45 ? -1 : 1);
            bet *= 2;  /* martingale: double until the session is net positive */
        }
        win++;
        printf("%ld Iterations. Win rate: %6.2f%%\n", iter, 100.0 * win / iter);
    }
}


I understand that you have some programming experience. Can you guess what the output of this program will be?
Do you believe it might change at some point if you let it run for very long time?
"When two people always agree one of them is unnecessary"
Asswhoopermcdaddy
Asswhoopermcdaddy
  • Threads: 88
  • Posts: 570
Joined: Nov 30, 2009
January 24th, 2011 at 1:47:52 PM permalink
Quote: weaselman

Well, mathematically, it is limitN->0(p^N), which is 0. I don't see very many ways to look at this.
If math isn't convincing, perhaps, you could try running a simulation to see how many times in 10 billion (or whatever) a player with unlimited bankroll will lose? :)



#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    long iter = 0, win = 0;
    while (1)
    {
        iter++;
        /* fresh session each iteration: net position 0, base bet 1 */
        long long br = 0, bet = 1;
        while (br <= 0)
        {
            /* rand() returns an int, so scale it; lose with probability 0.45 */
            br += bet * (((double)rand() / RAND_MAX) < 0.45 ? -1 : 1);
            bet *= 2;  /* martingale: double until the session is net positive */
        }
        win++;
        printf("%ld Iterations. Win rate: %6.2f%%\n", iter, 100.0 * win / iter);
    }
}


I understand that you have some programming experience. Can you guess what the output of this program will be?
Do you believe it might change at some point if you let it run for very long time?



Disagree with using 10 billion (a finite sample) to measure the number of times a player with an infinite bankroll will lose. The probability of still losing in this case is (1/2)^10 billion, a jaw-numbingly small number. But it's there, however small.

Will the Wizard opine on the following which is related, but a corollary of this discussion:

What is the impact of infinity on the following types of games:
1.) Positive Expectations - win
2.) Neutral -0- Expectations - ??????
3.) Negative Expectations - lose
Wizard
Administrator
Wizard
  • Threads: 1520
  • Posts: 27120
Joined: Oct 14, 2009
January 24th, 2011 at 1:56:19 PM permalink
Quote: jfalk

I would say it is an event of measure one. That means that the event of hitting pi is an event of measure zero. Measure and probability are not quite identical though.



Likewise, I would say the "measure" of the chances of ever getting a red in an infinite roulette game is 1. However, the probability is not 1. If the probability is not a true 1 then the infinite Martingale player might lose.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
weaselman
weaselman
  • Threads: 20
  • Posts: 2349
Joined: Jul 11, 2010
January 24th, 2011 at 2:29:10 PM permalink
Quote: Wizard

Likewise, I would say the "measure" of the chances of ever getting a red in an infinite roulette game is 1. However, the probability is not 1. If the probability is not a true 1 then the infinite Martingale player might lose.



You do realize that the game with this hypothetical infinitely long streak would never end, right? If the game is still in progress, how can you conclude that the player lost?

Quote: Asswhoopermcdaddy


Disagree with using a 10 billion (finite sample) to measure the number of times a player with an infinite bankroll will lose. The probability of still losing in this case is (1/2)^10 billion, massive jaw numbing number. But it's there, however small.


No, 1/2^10 billion is the probability that a single iteration (the inner loop in my code snippet) will take that long to complete (if this were a fair game), but that's not what we are talking about.
The probability we are looking for is that the program will print out anything other than 100.00% after however long you want to let it run (an hour, a year, a century, the entire age of the universe, the number of years equal to the age of the universe measured in nanoseconds... etc., etc.). The probability of this event is exactly 0. You see, unlike the actual game, the program is deterministic - if you read the code carefully, you'll see that there is simply no way for it to print out anything other than 100.00%.
This is another way to prove what is easily seen from that limN->inf(p^N)=0 equation I quoted earlier.
Frankly, I simply don't see any ground for disagreement here, so I guess I am just going to have to shut up at this point, because I don't really have anything to add.
"When two people always agree one of them is unnecessary"
boymimbo
boymimbo
  • Threads: 17
  • Posts: 5994
Joined: Nov 12, 2009
January 24th, 2011 at 2:45:03 PM permalink
Quote: Wizard

Without addressing that, if I may play the offense for a moment, what would you say is the probability of throwing a dart between 0 and 10 and not hitting pi?

.

Actually, according to Heisenberg's uncertainty principle, the dart is much bigger than the point pi and takes up a finite space between 0 and 10, and therefore the probability of hitting pi could be measured. If the dart were extremely small, then measuring the position of the dart would affect the measurement itself, and therefore you could only say with a percentage of certainty that the dart hit pi.

Fun, Wow!

But we're talking about a wheel and numbers. I posit that over an infinite number of spins, the distribution must approach the theoretical probabilities according to the real odds. That is, if you have an infinite bankroll and an infinite number of spins, betting flat, you will lose exactly 2/38 of your money. The law of large numbers would predict this.

In any case, infinity really means nothing. Everything in this experiment is finite and measurable.
----- You want the truth! You can't handle the truth!
rxwine
rxwine
  • Threads: 218
  • Posts: 12699
Joined: Feb 28, 2010
January 24th, 2011 at 2:47:47 PM permalink
Quote: weaselman

If the game is still in progress, how can you conclude that the player lost?



Well, without reading the rest of the thread, that makes sense to me. Raising the bet is an open-ended process, and the game never ends. However, if the player hits his mark, he's back to start over or quit.

Now I should read the rest of the thread. Hah.
Sanitized for Your Protection