Obviously he doesn't understand the math, but is correcting the father of your future bride worth arguing over something so trivial?
What do you gain by proving him wrong?
Unless, of course, your wife hates her dad and would love to see his smug ass put in his place.
When it comes to questions like this, the best approach is to ponder the situation and decide which answer will result in getting laid tonight.
If it's all in good fun and not a 'heated debate', then start at 0: ask him if the coin is fair, weighted, or changed from a normal coin in any way... then say, "Let's pretend 9 heads just flipped in a row... here's the coin again... has anything changed about the coin? Has its weight shifted? Did I change it in any way? ... *no* ... okay, then how could it possibly be biased to land any differently than the past 9 times?"
Also, quote "Independent Trials" and show him what that means. This is similar to 9 blacks landing in a row on roulette and him thinking red is "due" to hit. Independent trials.
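A minimal sketch of that "independent trials" point: simulate a fair coin and look at the 10th flip only in the runs where the first 9 flips were all heads. The run count and streak length are arbitrary demo choices, not anything from the thread.

```python
import random

# Minimal sketch of "independent trials": simulate a fair coin and look at the
# 10th flip only in the runs where the first 9 flips were all heads.
# The 2,000,000 runs and the streak length of 9 are arbitrary demo choices.
random.seed(1)

streaks = 0          # runs that opened with 9 heads in a row
heads_on_tenth = 0   # of those, how many showed heads on flip #10

for _ in range(2_000_000):
    flips = [random.random() < 0.5 for _ in range(10)]
    if all(flips[:9]):
        streaks += 1
        if flips[9]:
            heads_on_tenth += 1

print(f"runs with 9 straight heads: {streaks}")                       # ~3,900 (1 in 512)
print(f"P(heads on 10th | 9 heads): {heads_on_tenth / streaks:.3f}")  # ~0.50, not less
```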
Quote: Poker769: My future father in law and myself are discussing coin flips. He says that after a coin lands on heads 9 times in a row that the 10th flip is more likely to be tails.
ask him then, "How much more likely to be Tails?"
and continue with this
after heads has been flipped 8 times in a row,
how much more likely is the next to be tails?
and is it the same as after 9 heads in a row??
or even this
after the 1st flip is Heads, how much more likely is the next flip to be Tails?
same as after 8 or 9 in a row??
can that be calculated??
not silly if one looks at it both ways
Sally
Why is 9 the magic number, why not 5 or 8? Ask him how much more likely it is to come up, and then have him explain why that is.
Quote: Poker769: I think I have to just drop it also. If it were anyone other than family I would argue it until they saw, but yeah, I guess you guys are right. His argument was this is the same as the Monty Hall problem. But I tried to tell him it's not the same.
If there were an easy way to get 9 in a row you could just ask for odds and make the bet.
My only view to the other side is this:
Quote: Romes: If it's all in good fun and not a 'heated debate', then start at 0: ask him if the coin is fair, weighted, or changed from a normal coin in any way... then say, "Let's pretend 9 heads just flipped in a row...
In response to the question "Is the coin fair?", I could only answer, "It is, so far as I know."
If you then told me that it has just been flipped with the result of heads 9 times in a row (a 1 chance in 512 event for a genuinely fair coin), then I might have a twinge of suspicion that it isn't really a fair coin. With no other information at all to go on, if I had to wager on the next flip, I would go with heads. I can think of no reason to suspect that a call of tails has any better than a 50-50 chance, and there is at least some small hint that the coin might be biased in favor of heads.
If the problem were revised to "the coin has just been flipped as heads 1,000 times in a row", would you really believe that it was a fair coin? At what point would you begin to suspect bias and let it influence your wager?
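Doc's intuition can be made concrete with a quick Bayesian sketch. The 5% prior on a biased coin and the 90%-heads alternative are my own illustrative assumptions, not numbers from the thread:

```python
# Hedged sketch of Doc's point: allow even a small prior chance that the coin
# is biased, and 9 straight heads shifts your bet toward heads, not tails.
# Assumptions (mine, for illustration): 5% prior on a coin that lands heads
# 90% of the time; 95% prior on a fair coin.
p_fair, p_biased = 0.95, 0.05
heads_if_fair, heads_if_biased = 0.5, 0.9

# Likelihood of observing 9 heads in a row under each hypothesis
like_fair = heads_if_fair ** 9      # 1/512
like_biased = heads_if_biased ** 9  # ~0.387

# Posterior probability the coin is biased, given 9 heads (Bayes' rule)
post_biased = (p_biased * like_biased) / (p_biased * like_biased + p_fair * like_fair)

# Predictive probability that flip #10 is heads
p_next_heads = post_biased * heads_if_biased + (1 - post_biased) * heads_if_fair
print(f"P(biased | 9 heads)    = {post_biased:.3f}")   # ~0.91
print(f"P(heads on next flip)  = {p_next_heads:.3f}")  # ~0.86 -- bet heads, not tails
```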
Quote: Doc: My only view to the other side is this:
In response to the question "Is the coin fair?", I could only answer, "It is, so far as I know."
If you then told me that it has just been flipped with the result of heads 9 times in a row (a 1 chance in 512 event for a genuinely fair coin), then I might have a twinge of suspicion that it isn't really a fair coin. With no other information at all to go on, if I had to wager on the next flip, I would go with heads. I can think of no reason to suspect that a call of tails has any better than a 50-50 chance, and there is at least some small hint that the coin might be biased in favor of heads.
If the problem were revised to "the coin has just been flipped as heads 1,000 times in a row", would you really believe that it was a fair coin? At what point would you begin to suspect bias and let it influence your wager?
Yes this is under the assumption it’s a fair coin.
Quote: Poker769: Yes this is under the assumption it's a fair coin.
So how could a coin be both fair and not have 50/50 odds every single toss?
When arguing with someone you love or need to love, and you absolutely know you're right (and you are, of course):
The original advice was to apologize immediately.
I would add, if you just can't:
Agree to disagree
Or
Drop it.
There is no upside to being right about this.
And, I also agree, bet with the streak. That way you only lose once.
Quote: gamerfreak: So how could a coin be both fair and not have 50/50 odds every single toss?
I agree
You don't have to win every single argument. Sometimes, it's better to keep the peace.
I was on a plane next to a potential in-law who regaled me about how he got backed off from a blackjack table because he would bet a small amount, and if he lost he would double his bet until he won. He won so much money and it was so impossible for him to lose 15 hands in a row that the casino finally said ENOUGH and backed him off.
But at the same time, you shouldn't let them think you're a doormat. The way I handled it was to just say, "oh, that's called the Martingale," and just left it at that. Then they know that you know something they don't know.
Quote: klimate10: You don't have to win every single argument. Sometimes, it's better to keep the peace.
I was on a plane next to a potential in-law who regaled me about how he got backed off from a blackjack table because he would bet a small amount, and if he lost he would double his bet until he won. He won so much money and it was so impossible for him to lose 15 hands in a row that the casino finally said ENOUGH and backed him off.
But at the same time, you shouldn't let them think you're a doormat. The way I handled it was to just say, "oh, that's called the Martingale," and just left it at that. Then they know that you know something they don't know.
I like this approach
Quote: Doc: My only view to the other side is this:
In response to the question "Is the coin fair?", I could only answer, "It is, so far as I know."
If you then told me that it has just been flipped with the result of heads 9 times in a row (a 1 chance in 512 event for a genuinely fair coin), then I might have a twinge of suspicion that it isn't really a fair coin. With no other information at all to go on, if I had to wager on the next flip, I would go with heads. I can think of no reason to suspect that a call of tails has any better than a 50-50 chance, and there is at least some small hint that the coin might be biased in favor of heads.
If the problem were revised to "the coin has just been flipped as heads 1,000 times in a row", would you really believe that it was a fair coin? At what point would you begin to suspect bias and let it influence your wager?
I wish I would have had this advice when I saw 18 yos in a row and didn't bet any of them.
Quote: GWAE: I wish I would have had this advice when I saw 18 yos in a row and didn't bet any of them.
18 yos in a row is insane. I’m sure the idiots cleaned up on that table
Quote: Poker769: Who is right?
You are. You'll never win the argument though.
The more ridiculous a notion is, the more tenaciously it tends to be believed.
Quote: GWAE: I wish I would have had this advice when I saw 18 yos in a row and didn't bet any of them.
LOL. When I made my previous post, I did, indeed, think about the long-running "18 yos in a row" scenario on this forum. If I ever were playing craps and saw even 10 yos in a row (maybe even 5 or 6 in a row), I strongly suspect I would jump on the train and start hopping the yo until the streak ended. If it hit a couple more times, I would press my bet. If it continued, then somewhere along the line I would probably risk a parlay. Strings like that are just so far away from the norm that you almost have to believe that something is not truly random.
Quote: Wizard: You are. You'll never win the argument though.
The more ridiculous a notion is, the more tenaciously it tends to be believed.
Quote: Mosca: I have helped people understand this by explaining, "You aren't betting that the coin will come up heads ten times in a row. You are betting that it will come up heads ten times in a row after it has already come up heads nine times in a row. Once it's already done that, what are the odds for the next flip?"
And they usually still answer: Tails, it's due.
That actually works almost all the time. It’s gentle, and it leads him to the answer on his own.
Quote: russion: He is right.
This isn’t even a good troll attempt.
Or take a coin out of your pocket, tell him it's landed on heads 9 times in a row, and ask him if he's interested in making a bet. Of course, you're going to want to get some odds on the bet (i.e., his $6 to your $5).
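If the coin really is fair, that 6-to-5 money is a straightforwardly good bet for you. A quick sanity check using the $6/$5 stakes suggested above:

```python
# EV of the suggested odds bet, assuming the coin really is 50/50:
# he lays $6 against your $5 on the next flip.
p_win = 0.5
ev = p_win * 6 + (1 - p_win) * (-5)
print(f"EV per flip: ${ev:+.2f}")   # +$0.50 per flip, a 10% edge on your $5
```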
Quote: Poker769: This isn't even a good troll attempt.
It wasn't a troll attempt at all. Read his whole post.
The thing I always enjoy about people going with tails here is that tails is the WORST answer. The coin is probably weighted towards heads, and if not, then it's 50/50. So tails is not going to be MORE likely than heads barring some type of scam.
I agree you probably just have to let this go after you've spoken your piece. It's so hard for me to let stuff like this go, though, so I feel your pain.
Actually it has a slight propensity towards tails.
Try the following: spin a quarter on a table and see which side comes up more often. With a reasonably large sample size, you will see more tails.
This is due to more weight on the heads side of the quarter. This works better with older quarters, as the difference isn't as great on the newer quarters.
Makes a good bar bet, as worst case it's 50/50.
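How many spins is "a reasonably large sample size"? A rough sketch, assuming purely for illustration (not a figure from this thread) that a spun quarter comes up tails about 55% of the time:

```python
import math

# Rough sample-size sketch for the bar bet. The 55% tails rate is an assumed,
# purely illustrative bias for a spun quarter, not a measured figure.
p_tails = 0.55
edge = p_tails - 0.5

# The standard error of the observed tails fraction after n spins is
# sqrt(p*(1-p)/n); ask how many spins it takes for the edge to reach roughly
# 3 standard errors, i.e. to stand out clearly from 50/50 noise.
for n in (50, 100, 200, 500, 1000):
    se = math.sqrt(p_tails * (1 - p_tails) / n)
    print(f"{n:4d} spins: edge = {edge:.2f}, std error = {se:.3f}, edge/se = {edge / se:.1f}")
# Around 900 spins gives edge/se ~ 3, so "reasonably large" means hundreds of spins.
```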
Math puzzle time!
Suppose there is a biased coin where the probability of winning a bet on a toss equals p, where p > 50%. Wins pay even money. You have $1, and the table minimum and maximum are both $1. You bet forever or until you are ruined. What is the probability of ruin?
Quote: Wizard: Math puzzle time!
Suppose there is a biased coin where the probability of winning a bet on a toss equals p, where p > 50%. Wins pay even money. You have $1, and the table minimum and maximum are both $1. You bet forever or until you are ruined. What is the probability of ruin?
Wouldn't it depend on how much you started with? If I start with $1, then it would be high. If I have a billion to start, then it's low?
Quote: GWAE: Wouldn't it depend on how much you started with? If I start with $1, then it would be high. If I have a billion to start, then it's low?
Quote: Wizard: ...You have $1 and the table minimum and maximum are both $1...
I've gotta figure the RoR is pretty high. For the first few rounds you're betting 100%, 50%, and 33% of your bankroll, ASSUMING 3 wins in a row to start.
Quote: linksjunkie: Actually it has a slight propensity towards tails.
Try the following: spin a quarter on a table and see which side comes up more often. With a reasonably large sample size, you will see more tails.
This is due to more weight on the heads side of the quarter. This works better with older quarters, as the difference isn't as great on the newer quarters.
Makes a good bar bet, as worst case it's 50/50.
That's an interesting point!
Quote: Wizard: Math puzzle time!
Suppose there is a biased coin where the probability of winning a bet on a toss equals p, where p > 50%. Wins pay even money. You have $1, and the table minimum and maximum are both $1. You bet forever or until you are ruined. What is the probability of ruin?
I think I remember the solution.
If the probability of heads is p with p > 0.5, and you bet on heads each time, your probability of busting when you have only $1 remaining is [ 1 - sqrt( 1 - 4*p*(1-p) ) ] / (2p).
Your probability of busting with a bankroll of N dollars is the above expression raised to the Nth power.
Quote: ChesterDog: Quote: Wizard: Math puzzle time!
Suppose there is a biased coin where the probability of winning a bet on a toss equals p, where p > 50%. Wins pay even money. You have $1, and the table minimum and maximum are both $1. You bet forever or until you are ruined. What is the probability of ruin?
I think I remember the solution.
If the probability of heads is p with p > 0.5, and you bet on heads each time, your probability of busting when you have only $1 remaining is [ 1 - sqrt( 1 - 4*p*(1-p) ) ] / (2p).
Your probability of busting with a bankroll of N dollars is the above expression raised to the Nth power.
Isn't it just (1-p)/p for a 1-unit bankroll and [(1-p)/p]^N for an N-unit bankroll? So, for example, if p = 0.51 and bankroll = 1, the probability of bust is 0.49/0.51 = 0.9608.
Actually your equation is exactly the same if you simplify it: 1 - 4*p*(1-p) = (2p-1)^2, so the square root is 2p-1 (since p > 0.5), and [1 - (2p-1)] / (2p) = (1-p)/p.
Quote: GWAE: Wouldn't it depend on how much you started with?
You start with $1.
Quote: Jufo81: ...Isn't it just (1-p)/p for a 1-unit bankroll and [(1-p)/p]^N for an N-unit bankroll? So, for example, if p = 0.51 and bankroll = 1, the probability of bust is 0.49/0.51 = 0.9608.
Actually your equation is exactly the same if you simplify it...
Thanks! I agree.
Quote: Jufo81: Isn't it just (1-p)/p for a 1-unit bankroll and [(1-p)/p]^N for an N-unit bankroll? So, for example, if p = 0.51 and bankroll = 1, the probability of bust is 0.49/0.51 = 0.9608.
Actually your equation is exactly the same if you simplify it.
Yes, you're right. I'll take your word for it that CD's answer was right too, but the most simplified form is always preferred. How about I owe each of you half a beer (I'm so cheap)?
Quote: MrGoldenSun: What's the solution? I've been trying to think of a way to solve it simply by considering EV somehow, but I can't come up with anything. I guess maybe it is some type of Markov chain deal, but I've been out of school way too long to remember much about that.
Solution to what?
Quote: MrGoldenSun: What's the solution? I've been trying to think of a way to solve it simply by considering EV somehow, but I can't come up with anything. I guess maybe it is some type of Markov chain deal, but I've been out of school way too long to remember much about that.
Here's a solution:
Let p = the probability of heads where p > 0.5. Assume you always bet on heads.
Let b(N) = the probability of busting with $N remaining. When N=0, you've busted, so b(0) = 1.
The probability of busting with $1 remaining is p times the probability of busting with $2 plus (1-p) times the probability of busting with $0. That is: b(1) = p*b(2) + (1-p)*b(0), or b(1) = p*b(2) + 1 - p
Busting with $2 remaining can be thought of as two independent events, both of which are busting with $1 remaining. Therefore, b(2) = b(1) * b(1).
So:
b(1) = p * b(1) * b(1) + 1 - p.
Solve for b(1) using the quadratic formula, and simplify.
b(1) = ( 1 - p ) / p
Busting with $N remaining can be thought of as N independent events, so:
b(N) = [ (1 - p) / p ] ^ N
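A quick Monte Carlo sanity check of that answer. The p = 0.51 example and the cap on flips are my own choices for the sketch, since a truly infinite game can't be simulated:

```python
import random

# Monte Carlo check of b(1) = (1-p)/p for the $1-min/$1-max puzzle.
# Caveat: an infinite game can't be simulated, so each trial is capped at a
# large number of flips; with p > 0.5 the bankroll drifts upward fast enough
# that ruin after the cap is vanishingly unlikely.
def ruined(p, start=1, max_flips=20_000):
    bankroll = start
    for _ in range(max_flips):
        bankroll += 1 if random.random() < p else -1  # flat $1 bet every flip
        if bankroll == 0:
            return True
    return False

random.seed(2)
p = 0.51
trials = 10_000
ruin_rate = sum(ruined(p) for _ in range(trials)) / trials
print(f"simulated ruin probability: {ruin_rate:.3f}")
print(f"(1-p)/p                   : {(1 - p) / p:.3f}")  # 0.49/0.51 = 0.961
```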
Quote: MrGoldenSun: What's the solution?
It comes right from the Gambler's Ruin formula.
"Thus, unless the gambles are strictly better than fair (p > 0:5), ruin is certain.
Proof : If p > 0:5, then q/p < 1; hence in the denominator of (1), ( q/p )^N --> 0 yielding the result.
If p < 0:50, then q/p > 1; hence in the the denominator of (1), ( q/p )^N -->infinity yielding the result.
Finally, if p = 0:5, then pi(N) = i/N --> 0."
from the pdf
Gambler's Ruin Problem
2009 by Karl Sigman
in my online microsoft folder
http://1drv.ms/1qXMHx4
named
stochastic-I-GRP.pdf
for some easy 'good fun' reading
Sally
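For anyone who doesn't want to open the pdf: the "(1)" referenced in that proof is presumably the standard gambler's ruin formula, P_i(N) = (1 - (q/p)^i) / (1 - (q/p)^N) for p not equal to 0.5 (and i/N when p = 0.5), the probability of reaching a target of $N before going broke when you start with $i, bet $1 a flip, and q = 1 - p. A small numeric sketch of the limits the proof describes:

```python
# Sketch of the limits quoted above, using the standard gambler's ruin formula
# (presumably the "(1)" in Sigman's notes): starting with $i and betting $1 a
# flip until you reach $N or go broke,
#   P_i(N) = (1 - (q/p)**i) / (1 - (q/p)**N)  for p != 0.5,  and  i/N  for p = 0.5.
def reach_target(p, i, N):
    if p == 0.5:
        return i / N
    r = (1 - p) / p
    return (1 - r**i) / (1 - r**N)

i = 10  # starting bankroll in $1 units
for p in (0.49, 0.50, 0.51):
    probs = [reach_target(p, i, N) for N in (20, 100, 1000, 10_000)]
    print(f"p = {p:.2f}: P(reach N before ruin) for N = 20, 100, 1000, 10000:",
          [f"{x:.4f}" for x in probs])
# As N grows, the survival probability goes to 0 for p <= 0.5 (ruin is certain)
# but levels off near 0.33 for p = 0.51 -- exactly the point of the quoted proof.
```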
Quote: MrGoldenSun: What's the solution? I've been trying to think of a way to solve it simply by considering EV somehow, but I can't come up with anything. I guess maybe it is some type of Markov chain deal, but I've been out of school way too long to remember much about that.
Look at Jufo's equation above. You insert the observed probability of winning with the unfair coin for p (maybe 0.6 if you've seen "tails" come up 60 times in 100 flips and you're betting tails), and N is your bankroll measured in units of your fixed bet.
Then you can figure out your chance of ever losing your whole bankroll. There's no single numeric answer; the formula tells you how to calculate it for any p and any bankroll.
The more biased the coin is, assuming you know which way it favors and are betting that way, the better your chances of winning.
And the larger your bankroll is, the more times you can take advantage of your edge and come out ahead in the end. If the bankroll hits 0 at any point, you lose.
If it stays above 0, you stop when you have the results you want: maybe you're trying to double your bankroll, so you play until then. But here, you're specifically determining the chance that the bankroll will reach 0 and you will have to stop playing.
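To make the bankroll effect concrete, here is the same formula evaluated for a few bankroll sizes. The 0.6 win probability is just the illustrative figure from the post above:

```python
# Ruin probability [(1-p)/p]**N from Jufo81's formula, using the illustrative
# 60% win probability mentioned above. N is the bankroll counted in $1 bets.
p = 0.6
for n_bets in (1, 2, 5, 10, 20):
    ruin = ((1 - p) / p) ** n_bets
    print(f"bankroll of {n_bets:2d} bets: P(ever going broke) = {ruin:.4f}")
# 1 bet: 0.6667 ... 20 bets: 0.0003 -- with an edge, a deeper bankroll makes
# ruin rapidly less likely, which is the point about bankroll size above.
```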
Quote: ChesterDog: Here's a solution: ... Busting with $2 remaining can be thought of as two independent events, both of which are busting with $1 remaining ... Busting with $N remaining can be thought of as N independent events
Thanks very much. For some reason, it didn't click for me that b(N) = b(1)^N, I guess because my brain was stuck on the fact that b(N) is not (1-p)^N (flipping N consecutive tails), so I got away from the logic of simply multiplying probabilities.
I should have been able to figure it out, and I'm going to tell myself I would have gotten it if I had worked on it a bit more. :)
Quote: Ace2: Let's assume you don't go bust and after flat betting $1 for a while you're up $100. Then you decide you want to maximize your gains. How much do you bet going forward?
My guess:
bankroll * (2p-1) / [ p*(1 - (2p-1))^2 + (1-p)*(-1 - (2p-1))^2 ]
Quote: Poker769: I think I have to just drop it also. If it were anyone other than family I would argue it until they saw, but yeah, I guess you guys are right. His argument was this is the same as the Monty Hall problem. But I tried to tell him it's not the same.
I've tried explaining things like this to people who don't understand, and I've come to realize that it is a fruitless endeavor.
I'll explain it once, and if they insist on believing in the gambler's fallacy, okay fine then, be that way. Not worth my time.
Quote: Ace2: It should just be (2p - 1) * bankroll.
Wouldn't you take into account the variance, though? That's what the divisor is, or supposed to be, in my math above.
BR * EV/SD = Wager
Quote: Ace2: I think the Kelly formula is simply the edge over the payoff.
Yes, the Kelly formula for the fraction of the bankroll to be bet is the edge divided by the odds paid.
In the coin-flipping example there is only one payout, but in other games with several different payouts, such as blackjack, using edge over variance is a good approximation to Kelly, provided that the edge is small.
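A small sketch of the two rules just mentioned, for an even-money coin bet: the exact Kelly fraction (edge divided by the odds paid) versus the edge-over-variance approximation. The example win probabilities are arbitrary:

```python
# Kelly fraction for a bet paying b-to-1 with win probability p:
#   f* = (b*p - (1 - p)) / b      -- the edge divided by the odds paid.
# For an even-money coin bet (b = 1) this is simply 2p - 1.
def kelly_fraction(p, b=1.0):
    return (b * p - (1 - p)) / b

for p in (0.51, 0.55, 0.60):
    edge = 2 * p - 1                # EV per $1 bet at even money
    variance = 1 - edge ** 2        # variance of the +/- $1 outcome
    print(f"p = {p:.2f}: Kelly f* = {kelly_fraction(p):.3f}, "
          f"edge/variance = {edge / variance:.3f}")
# p = 0.51: 0.020 vs 0.020 -- practically identical for small edges;
# p = 0.60: 0.200 vs 0.208 -- the approximation drifts as the edge grows.
```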