January 4th, 2013 at 9:56:55 AM
permalink
When playing many board games, the first step is to have everyone roll a die to see who goes first, with a roll off in the case of a tie. While doing that over the Christmas break, my husband suggested that we roll two dice instead of one, with the assertion that this would make ties less likely. My brother disagreed, claiming it wouldn't make any difference. I'm interested in investigating this question.
I've been able to calculate the probability of ties for the case of rolling one die for any number of players, and the case for rolling two dice with two players. However, I haven't actually found a general solution in either case (I mostly used a brute force approach in the one die case). Is anyone here aware of any sources that have investigated this issue?
(For the record, I'm pretty sure both my husband and brother have forgotten the conversation, so you don't need to worry about hurting anyone's feelings. :) )
January 4th, 2013 at 11:29:35 AM
permalink
About 11.3%
(1/36 x 1/36 + 2/36 x 2/36 + 3/36 x 3/36 + 4/36 x 4/36 + 5/36 x 5/36) x 2 + 6/36 x 6/36 = about 11.3%
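For anyone who wants to verify that figure, here is a quick brute-force check in Python (an illustration only, not SOOPOO's working):

```python
from fractions import Fraction
from itertools import product

# Distribution of the total of two fair dice.
dist = {}
for a, b in product(range(1, 7), repeat=2):
    dist[a + b] = dist.get(a + b, 0) + Fraction(1, 36)

# Two players tie exactly when their totals coincide.
p_tie = sum(p * p for p in dist.values())
print(p_tie, float(p_tie))   # 73/648 = 146/1296, about 11.3%
```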
January 4th, 2013 at 11:39:32 AM
permalink
But what if there are X players?
Thinking about it, it must be lower, as only one total (7) is as common as each of the numbers on a single die.
"Then you can admire the real gambler, who has neither eaten, slept, thought nor lived, he has so smarted under the scourge of his martingale, so suffered on the rack of his desire for a coup at trente-et-quarante" - Honore de Balzac, 1829
January 4th, 2013 at 11:45:37 AM
permalink
Quote: hkelly: my husband suggested that we roll two dice instead of one, with the assertion that this would make ties less likely.
Your husband is right, but I'd have trouble with the proof.
the next time Dame Fortune toys with your heart, your soul and your wallet, raise your glass and praise her thus: “Thanks for nothing, you cold-hearted, evil, damnable, nefarious, low-life, malicious monster from Hell!” She is, after all, stone deaf. ... Arnold Snyder
January 4th, 2013 at 12:10:19 PM
permalink
I'm pretty sure my husband is right, and for similar reasons to those mentioned by thecesspit. However, what I'm looking for is mathematical proof. And, I agree, I'm having a LOT of trouble with that proof.
January 4th, 2013 at 12:37:30 PM
permalink
I think SOOPOO proved it above -- he gave the probability of two players having a tie with 2 dice. The probability for one die is just 1/6.
January 4th, 2013 at 1:54:20 PM
permalink
For two players, yes. What about more players?
January 4th, 2013 at 2:07:30 PM
permalink
What are you looking for? A closed-form solution for the probability of a tie given n players, or just a proof that a tie is less likely with 2 dice than 1 for any number of players n > 1? I suspect that there might be a shortcut to make the latter easier (it seems intuitively obvious).
January 4th, 2013 at 3:59:23 PM
permalink
It does seem intuitively obvious. I'd love a general formula for the probability of a tie with n dice for m players, but I'd settle for proof of the nature you mentioned.
January 4th, 2013 at 4:45:03 PM
permalink
https://wizardofodds.com/games/beat-the-dealer/
I tipped the Wizard off to this game at Turning Stone Casino last year. The dealer rolls two dice then you shoot two dice, higher number wins and ties go to the dealer. Wiz calculated an 11.26% House Edge on this game so this is the answer to your question.
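As a rough check of that number, here is a short sketch, assuming a flat even-money bet where the dealer also wins ties, as described above (an illustration, not the Wizard's own analysis):

```python
from fractions import Fraction
from itertools import product

# Distribution of the total of two fair dice.
dist = {}
for a, b in product(range(1, 7), repeat=2):
    dist[a + b] = dist.get(a + b, 0) + Fraction(1, 36)

p_player_wins = sum(dist[x] * dist[y] for x in dist for y in dist if x > y)
p_dealer_wins = 1 - p_player_wins            # dealer also takes all ties
house_edge = p_dealer_wins - p_player_wins   # expected loss per unit bet
print(float(house_edge))                     # about 0.1127, close to the quoted 11.26%
```

The edge equals the tie probability, since win and loss are otherwise symmetric.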
January 4th, 2013 at 5:10:04 PM
permalink
Quote: winmonkeyspit3: https://wizardofodds.com/games/beat-the-dealer/
I tipped the Wizard off to this game at Turning Stone Casino last year. The dealer rolls two dice then you shoot two dice, higher number wins and ties go to the dealer. Wiz calculated an 11.26% House Edge on this game so this is the answer to your question.
It's pretty sick that that game made it to the floor. Losing half on a tie, going to "War", etc. would still give a hefty edge.
January 4th, 2013 at 6:58:23 PM
permalink
For two players, yes. What about for more?
January 4th, 2013 at 7:03:16 PM
permalink
Quote: hkelly: For two players, yes. What about for more?
Each individual player plays against the dealer; thus, it is unchanged.
https://wizardofvegas.com/forum/off-topic/gripes/11182-pet-peeves/120/#post815219
January 4th, 2013 at 8:48:43 PM
permalink
I can confidently say the odds of a tie are lower the more dice you roll. This will hold true for any number of players.
Hopefully nobody will ask for a proof.
"For with much wisdom comes much sorrow." -- Ecclesiastes 1:18 (NIV)
January 4th, 2013 at 8:57:09 PM
permalink
I think we all know that :) But I can't think of a proof...
January 10th, 2013 at 4:27:28 AM
permalink
Quote: Wizard: Hopefully nobody will ask for a proof.
I do.
Let's build up one together.
1. First (though it isn't asked), it can be proven trivially that the odds of a tie are higher, for a fixed number of dice, if you increase the number of contestants.
PROOF: Consider an N-player no-tie situation. Add a player. The probability that a tie arises is the probability of matching one of the previous results, which is strictly positive if N >= 1. So the probability of maintaining a no-tie situation is less than one. Summing over all the N-player no-tie situations, you multiply each of their probabilities by something less than one, hence the overall probability of no tie has been reduced.
Trivial but interesting, because (1) it shows how backward computation is needed; (2) the result is quite general. In particular, the ordering of results (which is necessary for the determination of a winner) is not used in the calculation of ties (imagine dice with colors instead of numbers). Consequently, we can re-order results from one draw to the other; in essence, we will want to order them by probability, from highest to lowest.
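As a brute-force illustration of point 1 for a single die (a sketch only, my own code rather than part of the proof):

```python
from fractions import Fraction
from itertools import product

def p_any_tie(players, sides=6):
    """P(at least two of `players` people roll the same face of one die)."""
    tie_outcomes = sum(1 for rolls in product(range(1, sides + 1), repeat=players)
                       if len(set(rolls)) < players)
    return Fraction(tie_outcomes, sides ** players)

for n in range(2, 7):
    print(n, p_any_tie(n), float(p_any_tie(n)))   # strictly increasing in n
```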
2. Second, intuition is not helpful. Most people would think: "the more different possible cases in a throw, the less probability of ties". This is not true for all distributions, as a straightforward example shows.
Two players play heads or tails: they have a .50 chance of tying. They design a three-outcome random scheme, but with probabilities (.75, .125, .125). Now their probability of tying is .75^2 + .125^2 + .125^2 = 38/64 = .59375. Bad move!
Thankfully, with dice, it can be shown that the mode (the most frequent total) does not have a higher probability in the (N+1)-dice throw than in the N-dice throw.
Dice | P(mode) |
---|---|
1 die | 1/6 |
2 dice | 1/6 |
3 dice | 1/8 |
4 dice | 1/9 + 1/648 |
For more dice you can use a Central Limit Theorem approximation to get the modal probability as roughly 0.2336/sqrt(D), where D is the number of dice.
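A short sketch checking those modal probabilities against the 0.2336/sqrt(D) approximation (an illustration only; the `mode_probability` helper is my own, not from the post):

```python
from fractions import Fraction

def mode_probability(num_dice, sides=6):
    """Exact probability of the most likely total of `num_dice` fair dice."""
    dist = {0: Fraction(1)}
    for _ in range(num_dice):
        new = {}
        for total, p in dist.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, 0) + p / sides
        dist = new
    return max(dist.values())

for d in (1, 2, 3, 4, 10):
    exact = mode_probability(d)
    print(d, exact, float(exact), 0.2336 / d ** 0.5)
# 1/6, 1/6, 1/8, 73/648 (= 1/9 + 1/648), ... matching the table above
```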
3. Notation:
- p_i = probability of case i
- S_k = sum over i of p_i^k (NB: S_1 = 1)
N | P(tie) |
---|---|
2 | S_2 |
3 | 3 S_2 - 2 S_3 |
4 | 6 S_2 - 8 S_3 + 6 S_4 - 3 S_2^2 |
5 | 10 S_2 - 20 S_3 + 30 S_4 - 24 S_5 - 15 S_2^2 + 20 S_2 S_3 |
6 | 15 S_2 - 40 S_3 + 90 S_4 - 144 S_5 + 120 S_6 - 45 S_2^2 + 120 S_2 S_3 + 15 S_2^3 - 90 S_2 S_4 - 40 S_3^2 |
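As a cross-check, here is a short sketch that evaluates the first few of these formulas for the two-dice distribution and confirms them against direct enumeration (an illustration only; the names `S`, `formulas` and `p_any_tie` are mine):

```python
from fractions import Fraction
from itertools import product

sides, num_dice = 6, 2
dist = {}
for rolls in product(range(1, sides + 1), repeat=num_dice):
    s = sum(rolls)
    dist[s] = dist.get(s, 0) + Fraction(1, sides ** num_dice)

# Power sums S_k of the distribution of totals.
S = {k: sum(p ** k for p in dist.values()) for k in range(2, 5)}

formulas = {
    2: S[2],
    3: 3 * S[2] - 2 * S[3],
    4: 6 * S[2] - 8 * S[3] + 6 * S[4] - 3 * S[2] ** 2,
}

def p_any_tie(players):
    """Brute-force P(at least one repeated total among `players` rolls)."""
    total = Fraction(0)
    for combo in product(dist.items(), repeat=players):
        if len({t for t, _ in combo}) < players:
            prob = Fraction(1)
            for _, p in combo:
                prob *= p
            total += prob
    return total

for n in (2, 3, 4):
    assert formulas[n] == p_any_tie(n)
    print(n, formulas[n], float(formulas[n]))   # .1127, .3094, .5357 for two dice
```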
(to be continued)
Reperiet qui quaesiverit
January 10th, 2013 at 5:12:21 AM
permalink
I have a proof. It's not quite mathematical, and it doesn't quite answer the original question as posed, but it may help with the proof.
Rolling 1 die
A maximum of 6 players are able to roll, with there still being a chance of having no tie.
7 or more players will be guaranteed to have a tie, as there are only 6 possible values.
Rolling 2 dice
A maximum of 11 players are able to roll, with there still being a chance of having no tie.
12 or more players will be guaranteed to have a tie, as there are only 11 possible values. (2 through 12)
Therefore, more dice = more possible values, lessening the chance of having a tie.
-Dween!
January 10th, 2013 at 7:51:30 AM
permalink
I suddenly have a worry.
Is it the probability of ANY tie among any players, or is it the probability of a tie for HIGHEST only?
The latter would definitely prove more difficult to compute, I guess (but, as I said, intuition is misleading).
Reperiet qui quaesiverit
January 10th, 2013 at 5:42:19 PM
permalink
Quote: kubikulann: I suddenly have a worry.
Is it the probability of ANY tie among any players, or is it the probability of a tie for HIGHEST only?
The latter would definitely prove more difficult to compute, I guess (but, as I said, intuition is misleading).
Highest only.
It's intuitively obvious, but any proof method that I can think of off the top of my head is just a pile of numbers and notation, which is enough to remind myself that I don't really care and I'd rather spend my free time doing something else.
January 10th, 2013 at 5:57:22 PM
permalink
I think induction is the easiest way to think of it.
As has been explained, increasing the number of players makes it more likely to find a tie, since a round without any tie among N people can only occur if the first (N-1) haven't tied and then the Nth person doesn't tie either.
This isn't mathematically rigorous, but intuitively...
If the number of dice increases, then (except in going from 1 die to 2) the probabilities of all results near the mean are smaller and there are more possible results. So after the 1st person throws, the 2nd person should (on average) have a smaller chance of a tie if more dice are used. When the 3rd person comes along, his chance of a tie is the sum of the chances of matching the first two, which is lower with more dice, etc.
The dodgy bit: the probability of the central result decreases as the number of dice increases, and so does the probability of the adjacent results..., but at some stage the probability is greater. Help!
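A small sketch of the first step of that argument: the chance that the second roller exactly ties the first is the sum of the squared probabilities of the totals, and it does fall as dice are added (an illustration only, my own code):

```python
from fractions import Fraction
from itertools import product

def p_second_player_ties(num_dice, sides=6):
    """P(a second roller exactly matches the first roller's total)."""
    dist = {}
    for rolls in product(range(1, sides + 1), repeat=num_dice):
        s = sum(rolls)
        dist[s] = dist.get(s, 0) + Fraction(1, sides ** num_dice)
    return sum(p * p for p in dist.values())

for d in range(1, 6):
    print(d, float(p_second_player_ties(d)))
# .16667, .11265, .09285, ... strictly decreasing
```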
January 10th, 2013 at 6:38:16 PM
permalink
I understand what you are saying, but none of this comes close to being a proof. As I said, it's intuitively obvious... I understand "why" it's true intuitively. But that's different from a formal proof.
January 14th, 2013 at 8:06:26 AM
permalink
(Continuation)
By brute force I have computed these first probabilities of a tie (NB: of any tie, not just for highest).
Players | 1 Die | 2 Dice | 3 Dice |
---|---|---|---|
2 | 1/6 = .16667 | 146/36^2 = .11265 | 4332/216^2 = .09285 |
3 | 16/36 = .44444 | 2406/6^5 = .30941 | 2610144/6^9 = .25900 |
4 | 26/36 = .72222 | 24994/6^6 = .53571 | 4631137/6^9 = .45954 |
So for these low values, the intuition is confirmed. The improvement is rather small from 2 to 3 dice.
But I wouldn't vouch for higher values, especially when the number of players goes over 7 or 8: it might require several dice before the probability actually goes down. I don't know.
Yet it is interesting to notice, from a practical point of view, that the old technique of rolling one die and re-rolling when a tie occurs is more efficient than rolling two dice at the outset.
Players | 1 Die + 1 re-roll |
---|---|
2 | .02778 |
3 | .19753 |
4 | .52161 |
Results should be even better, since you typically only re-roll those who tied.
Practically, what you'd do is roll two coloured dice and not sum their values but use them lexicographically (i.e. like tens and units).
Does it remain true after 4 players? I don't know. Anyway, you see that with 4 players the P(tie) is quite high (about 50%), so it is probably more useful to design another mechanism for ordering players, like drawing cards. (If you just need one winner designated, maybe the figures are more encouraging...)
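For anyone who wants to reproduce these tables, here is a brute-force sketch (my own code, not kubikulann's; the re-roll column assumes that whenever any tie occurs, every player re-rolls once):

```python
from fractions import Fraction
from itertools import product

def total_distribution(num_dice, sides=6):
    """Probability of each total of `num_dice` fair dice."""
    dist = {0: Fraction(1)}
    for _ in range(num_dice):
        new = {}
        for t, p in dist.items():
            for face in range(1, sides + 1):
                new[t + face] = new.get(t + face, 0) + p / sides
        dist = new
    return dist

def p_any_tie(players, num_dice):
    """P(at least two of `players` people land on the same total)."""
    dist = total_distribution(num_dice)
    total = Fraction(0)
    for combo in product(dist.items(), repeat=players):
        if len({t for t, _ in combo}) < players:
            prob = Fraction(1)
            for _, p in combo:
                prob *= p
            total += prob
    return total

for m in (2, 3, 4):
    row = [float(p_any_tie(m, d)) for d in (1, 2, 3)]
    reroll = float(p_any_tie(m, 1)) ** 2   # tie in round 1, then everyone re-rolls and ties again
    print(m, row, reroll)
```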
Reperiet qui quaesiverit
January 21st, 2013 at 5:59:41 AM
permalink
I'm not sure anyone is reading this...
Well, for those interested:
- I have managed to find a general formula for the probability of a tie for winner, for any number of players and any probability distribution of results in the random draw (of which the dice roll is but a particular case). This formula is a bitch to use (recursive on recursives).
- I have managed to prove that, in the limit (using the Central Limit Theorem), increasing the number of dice reduces the probability of a tie for winner. Proven for up to four players. This is also proven by brute force for the low numbers, using the formula. (The quality of the Normal approximation is quite good from 5 dice on: the differential in probabilities is less than 0.0025.)
- I have computed that throwing two colored dice (tens and units) is about as efficient (in terms of probability of tie) as throwing 35 dice in one roll and comparing the sums. (Computed up to 4 players.)
- If your dice are identical, then throwing two dice and ranking values lexicographically as (max, min) is still more efficient than comparing sums of 30 dice. (Computed up to 4 players.)
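A small sketch of that (max, min) ranking for two identical dice, compared with summing the same two dice, for two players (an illustration only; the 30-dice and 35-dice equivalences above are kubikulann's computations and are not reproduced here):

```python
from fractions import Fraction
from itertools import product

# Two identical dice read as an ordered pair (max, min) versus as a sum.
pairs, sums = {}, {}
for a, b in product(range(1, 7), repeat=2):
    key = (max(a, b), min(a, b))
    pairs[key] = pairs.get(key, 0) + Fraction(1, 36)
    sums[a + b] = sums.get(a + b, 0) + Fraction(1, 36)

# Two players tie only if they produce exactly the same pair / the same sum.
p_tie_pairs = sum(p * p for p in pairs.values())
p_tie_sums = sum(p * p for p in sums.values())
print(float(p_tie_pairs), float(p_tie_sums))   # about 0.051 vs 0.113
```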
I wonder whether this has already been done or I have stumbled (well, climbed the hard way...) on something new?
Reperiet qui quaesiverit