- Two players are each given a random number drawn from a uniform distribution from 0 to 1.
- Player 1 may keep his number or switch it for a new random number.
- Player 2, knowing player 1's decision, may also switch or stick with his original number.
- The higher final number at the end wins.
Questions:
- What is the optimal strategy for each player?
- Assuming both players follow optimal strategy, what is the probability of winning for each player?
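A quick Monte Carlo sketch is a handy referee for the strategy proposals that follow; the function and parameter names below are illustrative, not part of the problem:

```python
import random

def play(p1_threshold, p2_vs_redraw, p2_vs_stand, trials=1_000_000):
    """Estimate player 1's win probability for fixed redraw thresholds.

    p1_threshold: player 1 redraws below this value.
    p2_vs_redraw / p2_vs_stand: player 2's redraw thresholds, chosen
    according to whether player 1 redrew (the only thing player 2 sees).
    """
    p1_wins = 0
    for _ in range(trials):
        x = random.random()
        p1_redrew = x < p1_threshold
        if p1_redrew:
            x = random.random()          # player 1 switches
        y = random.random()
        if y < (p2_vs_redraw if p1_redrew else p2_vs_stand):
            y = random.random()          # player 2 switches
        p1_wins += x > y
    return p1_wins / trials

# Example: the naive symmetric strategy (both redraw below 0.5).
print(play(0.5, 0.5, 0.5))  # ~0.5, by symmetry
```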
Quote: Wizard: I haven't posted a math problem for a while, so enjoy.
Player 1 should switch his number if it is less than 0.5
Player 2 should switch his number if it is less than player 1's number.
Player 1's final number will be in the range 0.5 to 1.0 with probability 2/3 and in the range 0.0 to 0.5 with probability 1/3.
The average outcome for player 1 will be 0.25 * (1/3) + 0.75 * (2/3) = 0.583333.
Player 2 will, on average, redraw when his outcome is below 0.583333, which will be 58.33% of the time instead of player 1's 50%.
Each time he is given a random outcome, his average chance to beat player 1 will be the complement: 100% - 58.33% = 41.67%.
That is a 41.66% chance to win on the first draw, on average, and another 41.66% chance to win on the second draw, on average.
0.4166 + (0.5833 * 0.4166) = 1.4165, or a 41% advantage to player 2.
I'd rather be player 2 for sure.
Player 1 wins 29.175% of the time
Player 2 wins 70.825% of the time
Quote: Ahigh: Player 2 should switch his number if it is less than player 1's number.
Quote: Wizard: Player 2, knowing player 1's decision, may also switch or stick with his original number.
Reread Mike's post.
Player 2 doesn't know player 1's number, just his decision.
Wild Ass Guess:
Player 1 should redraw if his number is less than .666
Player 2 should redraw if his number is less than .666 if player 1 redrew, or less than .8 if he did not redraw.
The math for this is beyond me, which is why it was a WAG.
Therefore, don't look at me for the math on the odds.
The chance of each player winning is the same at 50% in this case. You need to know the actual outcome for it to not be 50/50, not just that he got another random number.
Both players should re-draw if the number is less than 0.5 in this case, and neither player has an advantage.
Average outcome will be 0.583333 for both players.
Quote: Ahigh: Player 1 should switch his number if it is less than 0.5
That isn't the strategy change point.
Quote: Player 2 should switch his number if it is less than player 1's number.
He wouldn't know what player 1's number is.
Quote: Wild Ass Guess:
Player 1 should redraw if his number is less than .666
Player 2 should redraw if his number is less than .666 if player 1 redrew, or less than .8 if he did not redraw.
Nope. You're unlikely to be right with wild guesses.
Quote: Yeah, if player two only knows if player 1 decided to redraw, that doesn't help player 2 at all. The fact that a redraw was done changes nothing.
Nope. If player 1 switches, player 2 can at least infer that the new number is uniformly distributed from 0 to 1. If player 1 stands, then player 2 can assume it is a decent number.
Let Player 1's first draw be x. If she re-draws, the probability of getting a higher number is 1 - x. If she stands, her probability of winning is x, the probability that Player 2's number falls below x.
Player 1 then wins with probability x if she sticks with her number, and wins with probability 0.5 (in expectation) if she re-draws. She should re-draw if her number is less than 0.5.
Player 2 draws number y. If Player 1 does not re-draw, then Player 2 knows that Player 1's first draw, x, was above 0.5. Then Player 2 should re-draw if y < 0.5. However, her decision depends on the value of y and not on Player 1's decision. She should also re-draw if y < 0.5 when Player 1 re-drew. She has a probability of 1 - y of getting a higher y_2 on the second draw.
Strategies: Both players re-draw if their initial draw is less than 0.5, else they stand. Both players have a 50% chance of winning.
Quote: Ahigh: Yeah, if player two only knows if player 1 decided to redraw, that doesn't help player 2 at all. The fact that a redraw was done changes nothing.
That's not true at all.
If player 1 redraws, player 2 knows that player 1's number is chosen uniformly from [0,1].
If player 1 doesn't redraw, player 2 knows that player 1's number is one of the numbers that he won't redraw on. Presumably this will skew higher than the uniform distribution, therefore, player 2 has to redraw more aggressively when player 1 does not redraw.
There is probably some room for bluffing here; the correct strategy for player 1 may be a mixed strategy. However, not drawing is still going to give up some information.
Quote: harvson3: Strategies: Both players re-draw if their initial draw is less than 0.5, else they stand. Both players have a 50% chance of winning.
There is a way for player 2 to improve on this strategy if that is what player 1 does. It is a decent strategy, but it is far from optimal for either player.
Quote: harvson3: Player 2 draws number y. If Player 1 does not re-draw, then Player 2 knows that Player 1's first draw, x, was above 0.5. Then Player 2 should re-draw if y<0.5.
No, that's not true. If Player 1 is following a "stand on >=0.5" strategy, then player 2 needs to redraw more aggressively.
E.g., if Player 1 stands and player 2 has 0.6, standing gives a 20% chance of winning and redrawing gives a 25% chance of winning.
It's also not clear that player 1's best strategy is to stand only on numbers higher than 0.5. That certainly maximizes the expectation of his number, but not necessarily his probability of winning, since the stand/no-stand decision gives some information away.
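For the record, the 20% and 25% figures can be verified directly; a short check, assuming player 1 stands exactly when his number is in [0.5, 1] (so his final number is uniform there):

```latex
P(\text{win} \mid \text{stand on } 0.6) = \frac{0.6 - 0.5}{1 - 0.5} = 0.20,
\qquad
P(\text{win} \mid \text{redraw}) = \int_{0.5}^{1} (1 - t)\,\frac{dt}{0.5}
  = 2\left(\tfrac{1}{2} - \tfrac{3}{8}\right) = 0.25.
```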
Quote: vetsen: If player 1 stands, it would increase player 2's redraw range, because it tells him player 1 likely has a better-than-average number. I would guess the logic for the solution is somewhat similar to heads-up poker. In this case, player 2 has position, which we all know is valuable.
Hmmm. So in the case that player 1 doesn't redraw, the average expectation for player 1's draw is 0.75 if you assume player 1 is only going to redraw based on a positive expectation to improve the result from 0.5 on average.
In that case, when player 1 does not redraw, player 2 should redraw if their initial draw is below 0.75.
That will result in 0.875 * ( 1/4 ) + 0.5 * ( 3/4 ) = 0.59375 average result for player 2 in the case of player one standing.
When player 1 doesn't stand, we still have the 0.58333 result for player two, assuming they credit player 1 with an EV of 0.5 in that case.
That's an advantage in the case when player 1 doesn't redraw because player 2 gets to redraw based on the knowledge that player 1 has a result better than 0.5. All of this is predicated on player 2 knowing that player 1 is going to attempt to play optimally and redraw any time the result is such that a redraw is to his advantage.
Of course knowing this advantage exists means that player 1 will redraw more frequently to compensate. So we will likely have some recursion to the point of balance.
Quote: SOOPOO: Player 1. I redraw if under .59
Player 2. If player 1 redraws I redraw if under .5
If player 1 does not redraw I redraw if under .69
Player 2 wins 56% of the time.
Best answer so far, but still not optimal. Also, I'd prefer to see a solution with the answer. Somebody could figure this out with trial and error, which is what I suspect you did. To get full credit, a mathematical solution must be provided.
At this point, I'd like to suggest the future answers and/or solutions be put in spoiler tags.
If P2's original number is x, the probability that a new number beats it is 1-x.
The probability that a new number > P2's original number is the number x such that the integral from 0 to x of (1-t) dt = 1/2 of the integral from 0 to 1 of (1-t) dt; the latter is 1/2, so the former becomes x - x^2/2 = 1/2 -> x = sqrt(2)/2.
If P1 keeps his original number, P2 switches if the number < sqrt(2)/2.
If P1 switches his original number, P2, using the same strategy P1 originally used, also switches if the number < sqrt(2)/2.
In any case, both players use the same strategy: switch if the number < sqrt(2)/2.
Case 1: both players keep their original numbers (probability (sqrt(2)/2)^2 = 1/2); each player has probability 1/2 of winning.
Case 2: one player keeps his original number while the other switches (probability sqrt(2) * (1 - sqrt(2)/2) = sqrt(2) - 1); since it is equally likely for P1 and P2 to be the "keeper" and the other the "switcher", each player's probability of winning is the same.
Case 3: both players switch (probability (1 - sqrt(2)/2)^2 = 3/2 - sqrt(2)); each player has probability 1/2 of winning.
In each case, each player's probability of winning is 1/2, so each player's total probability of winning is 1/2.
Something tells me my answer to #2 has a flaw in it somewhere.
Quote: ThatDonGuy: Something tells me my answer to #2 has a flaw in it somewhere.
So does #1. To give a hint, the strategies are not the same and the second player has a positional advantage, making the probability player 1 wins, assuming optimal strategy for both players, less than 50%.
Quote: Ahigh: ... Of course knowing this advantage exists means that player 1 will redraw more frequently to compensate. So we will likely have some recursion to the point of balance.
Player 1: I redraw based on the only knowledge -- existing value versus expected value for next draw. EV resulting combination is 0.58333.
Player 2: I redraw based on knowing the EV for player 1 is 0.5833; whether he redraws or not does not matter. All I can know is that he has the ability to have a higher EV if he so chooses, and I have to do the exact same thing to compete -- my choice to draw is based on whether I can do better than MY previous draw. I know nothing of his draw by knowing that he did another draw. I cannot improve my EV beyond 0.58333.
Chance of each player winning is still 50/50.
If player 2 knows the outcome of player 1's hand, everything changes. But player 2 only knows the best possible EV that player 1 can achieve and nothing else.
Quote: Ahigh: ... Chance of each player winning is still 50/50. But player 2 only knows the best possible EV that player 1 can achieve and nothing else.
Nope.
It should be intuitive that player 2 has a positional advantage. He at least knows whether player 1 has a random number or a number worth standing on before making his decision.
SOOPOO's answer was not far off. The number at which player 2 should stand pat is greater than the point at which player 1 should stand pat. This should be intuitive: if player 1 stands, then he is giving away that he has a good number, and player 2 will need to be aggressive to beat it.
I solved this problem incrementally. At first, Player 1 seems like they should hit any number less than 0.5 and stand on greater than 0.5 (what they do at exactly 0.5 doesn't matter). So player 1's average score when hitting is 0.5 and when standing is 0.75.
Knowing the info above, player 2 can tailor their strategy. When player 1 hits, clearly player 2 should hit any number less than 0.5. But when player 1 stands, player 2 can decide whether or not to hit. If y is player 2's current score, then if player 2 hits, they will win 1 - 0.75 = 0.25 of the time, and if they stand, they will win (y - 0.5)/0.5 of the time. For hitting to be better than standing, solve for y in: 0.25 > (y - 0.5)/0.5. So y < 0.625, and player 2 should hit on any number less than 0.625 when player 1 stands.
Now knowing this, can player 1 do better than hitting only numbers less than 0.5? If x is player 1's score, then if player 1 hits, they will have an average score of 0.5. Player 2 will hit all numbers less than 0.5, for a win amount for player 1 of 0.25; when player 2 has a score greater than 0.5, they will stand, and player 1 will win integral[0.5, 1] (1 - y) dy = 0.125, for a total of 0.375. If player 1 stands on x (assuming x < y, since clearly if x > y then player 1 will stand), then player 2 will hit when y < 0.625, and player 1 will win 0.625x; player 2 will stand when y > 0.625, and player 1 will never win, for a total of 0.625x. So in order for hitting to be better than standing, 0.375 > 0.625x, so x < 0.6. So player 1 should hit all numbers less than 0.6.
Continuing this process, you can see the formulas for the new x and y pivot values.
For y, the formula is y = x + (1 - x)/4, or simply y = (1 + 3x)/4
For x, the formula is x = 0.375/y or simply 3/(8y)
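These two updates can also be iterated numerically; a small sketch (starting values arbitrary) converges to the same equilibrium found algebraically below:

```python
# Fixed-point iteration of the two update formulas above.
x, y = 0.5, 0.625          # arbitrary starting guesses
for _ in range(50):
    y = (1 + 3 * x) / 4    # player 2's best-response pivot
    x = 3 / (8 * y)        # player 1's best-response pivot
print(x, y)                # ~0.559816, ~0.669862
```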
Combining these two formulas to figure out the equilibrium:
y = (1 + 9/(8y))/4
y = 1/4 + 9/(32y)
(y - 1/4)(32y) - 9 = 0
32y^2 - 8y - 9 = 0
Solving for y yields one positive solution: y = (1 + sqrt(19)) / 8 = 0.669862368...
So x = 0.375/y = 3 / (1 + sqrt(19)) = 0.559816491...
So Player 1 hits all scores less than 0.559816491... and player 2 hits all scores less than 0.669862368... if player 1 stands and all numbers less than 0.5 otherwise. To figure out the EV of player 1, consider all cases:
Player 1 hits (probability = 0.559816491...) and Player 2 hits (probability = 0.5): result = 0.559816491 * 0.5 * 0.5 = 0.139954123...
Player 1 hits (probability = 0.559816491...) and Player 2 stands (probability = 0.5): result = 0.559816491 * 0.5 * 0.375 = 0.104965592...
Player 1 stands (probability = 1 - 0.559816491...) and Player 2 stands (probability = 1 - 0.669862368): result = 0
Player 1 stands (probability = 1 - 0.559816491...) and Player 2 hits (probability = 0.669862368): result = (1 - 0.559816491) * 0.669862368 * (1 + 0.559816491) / 2 = 0.229965592...
For a grand total of 0.474885307... = about 47.49% payback for player 1.
Whew, I sure hope this was right...
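A tally with this many cases is easy to mis-count, so a simulation makes a useful cross-check; a sketch using the exact pivots above (variable names illustrative):

```python
import random

# Player 1 hits below A; player 2 hits below 0.5 if player 1 hit,
# and below B if player 1 stood. Pivots from the quadratic above.
A = 0.559816491   # 3 / (1 + sqrt(19))
B = 0.669862368   # (1 + sqrt(19)) / 8
trials, p1_wins = 2_000_000, 0
for _ in range(trials):
    x = random.random()
    p1_hit = x < A
    if p1_hit:
        x = random.random()
    y = random.random()
    if y < (0.5 if p1_hit else B):
        y = random.random()
    p1_wins += x > y
print(p1_wins / trials)  # compare against the 47.49% tally above
```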
Quote: ImAllInNow: ... For a grand total of 0.474885307... = about 47.49% payback for player 1.
Good first post. You're closest yet, but still not exact. Player 1 can improve his chances of winning, assuming an optimal player 2 counter strategy, by at least 1%.
I would also prefer to see a solution that didn't rely on recursion or trial and error. There is a way to get directly at an exact form of the answer.
Have we decided if bluffing can be effective?
Let's say you have .45. I think if you had somewhere around .4 to .5, your initial thought would be to switch, but by staying, you're showing you have something between .5 and 1. This would make person #2 think you had an average of .75 if they assume you're switching under .5.
So if you can get that person who has .65 to switch, then instead of your .45 having a 35% chance of winning by switching and them staying, you now have a 45% chance of winning by you staying and them switching.
Thoughts?
Quote: FinsRule: Have we decided if bluffing can be effective? ...
This is what I was thinking too.
If we assume that a pure strategy is optimal, I'm pretty sure that I can figure it out, but it's not clear to me that it is.
Quote: FinsRule: Have we decided if bluffing can be effective?
I would argue bluffing is not effective, since folding is not an element to the game.
Quote: Wizard: I would argue bluffing is not effective, since folding is not an element to the game.
I don't think folding is relevant. I think FinsRule's point is, what if Player A has, say, 0.2, but wants Player B to think he has a larger number so B would be more likely to replace what would be a winning number? (This assumes there is a range where B's strategy is, "Redraw if A keeps, and keep if A redraws." Bluffing may also be effective if B's strategy has a range of "Keep if A keeps, and redraw if A redraws.")
Quote: ThatDonGuy: I don't think folding is relevant. ...
Yes, but I think the instances you bluff would only be if you had a middle type number like .35 - .5 (.35 might be a bit too low). It's great if you can trick someone higher than you into redrawing, but if they have a 75% or higher chance of just beating you again, you should have been the one to redraw.
Quote: FinsRule: Yes, but I think the instances you bluff would only be if you had a middle type number like .35 - .5. ...
Also, any time player 2 has a score lower than .5, they now have a better chance to beat player 1, because he stood on a low number instead of drawing.
Quote: Wizard: ... I would also prefer to see a solution that didn't rely on recursion or trial and error. There is a way to get directly at an exact form of the answer.
I'm not sure what you mean by recursion or trial and error. My solution started out as a back and forth, but by solving the quadratic that I provided, that should be the exact optimal equilibrium solution. Unless I made a mistake in the math somewhere. Do you dispute either of my update equations for how player 1 should respond to player 2's new strategy and vice versa?
Quote: ImAllInNow: I'm not sure what you mean by recursion or trial and error. ...
I don't dispute the method but do dispute the final answer. I think you made a mistake in the math somewhere.
Quote: Wizard: I don't dispute the method but do dispute the final answer. I think you made a mistake in the math somewhere.
I looked back over my update equations and think I found an error. The equation I used to update player 2's strategy was wrong.
If player 1 stands on scores greater than, say, x, then their average score when standing is (1 + x) / 2.
So if player 2 hits, they will win 1 - (1 + x) / 2 = (1 - x) / 2.
If player 2 stands on a value y, they will win the percentage of times that player 1 stood on a number worse than y, which is (y - x) / (1 - x). (this is where I made a mistake. Originally I had the denominator as 0.5, which is only correct for the first time player 2's strategy is updated).
So the equilibrium point is when those two win probabilities are equal, which is when (1 - x) / 2 = (y - x) / (1 - x). Solving this for y yields: y = (1 + x^2) / 2.
The equation for updating x remains as x = 3 / (8y).
Combining these equations together yields: 128y^3 - 64y^2 - 9 = 0. There is one solution for y which is 0.660951... and the corresponding x is 0.567364...
The resulting probability of winning for player 1 is then 0.4723166... = ~47.23%
I'm guessing there is some other way to solve this problem that yields a different result?
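For anyone who wants to verify the cubic numerically, a quick sketch using numpy's root finder (any method would do):

```python
import numpy as np

# Roots of 128*y^3 - 64*y^2 + 0*y - 9 = 0; only one is real and positive.
roots = np.roots([128, -64, 0, -9])
y = next(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)
x = 3 / (8 * y)
print(y, x)  # ~0.660951 and ~0.567364
```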
Quote: ImAllInNow: ... The resulting payback for player 1 is then 0.4723166... = ~47.23%. I'm guessing there is some other way to solve this problem that yields a different result?
Ding! Ding! Ding! Your answers for the two strategy bend points are correct.
However, I disagree on the "payback for player 1," by which I assume you mean the probability of winning. I get a figure a little higher, which was confirmed by the source who gave me the problem.
For those who just want to see the strategy answers, with no solution, see the spoiler below.
Player 1 should switch with 0.567364 or less.
If player 1 stands, then player 2 should switch with 0.660951 or less.
So in order to figure P1's chance of winning, the 4 cases are:
P1 hits, P2 hits
-----------------
Probability of occurrence: 0.567364227 * 0.5
Chance of winning: 0.5
Contribution to total: 0.567364227 * 0.5 * 0.5 = 0.141841057
P1 hits, P2 stands
--------------------
While using integrals will also work here, we can assume that when P2 stands, they have an average of 0.75. So P1's chance of winning is 1 - 0.75 = 0.25.
Probability of occurrence: 0.567364227 * 0.5
Chance of winning: 0.25
Contribution to total: 0.567364227 * 0.5 * 0.25 = 0.070920528
P1 stands, P2 hits
---------------------
This can also be done without integrals since P1's average score is (1 + 0.567364227) / 2 = 0.783682 which is the same as their chance of winning if P2 hits.
Probability of occurrence: (1 - 0.567364227) * 0.660951083
Chance of winning: 0.783682
Contribution to total: (1 - 0.567364227) * 0.660951083 * 0.783682 = 0.224094717
P1 stands, P2 stands
------------------------
I definitely made an error here. I was thinking that if both players stand, then P2 always wins, which is clearly not correct. When P1 stands between the two pivot points and P2 stands, P2 always wins. But when P1 stands on a number higher than both pivot points, then P1 will win, on average, half the time that both players stand.
Probability of occurrence: (1 - 0.567364227) * (1 - 0.660951083)
Chance of winning: 0.5 * (1 - 0.660951083) / (1 - 0.567364227) = 0.391841
Contribution to total: (1 - 0.567364227) * (1 - 0.660951083) * 0.391841 = 0.057477084
Grand Total = 0.141841057 + 0.070920528 + 0.224094717 + 0.057477084 = 49.43335%
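As a cross-check, the four contributions re-tally in a few lines; note the last case collapses to (1 - B)^2 / 2. A sketch (names illustrative):

```python
A = 0.567364227   # player 1 stands at or above this
B = 0.660951083   # player 2 stands at or above this when player 1 stands
t1 = A * 0.5 * 0.5               # P1 hits, P2 hits
t2 = A * 0.5 * 0.25              # P1 hits, P2 stands (P2 averages 0.75)
t3 = (1 - A) * B * (1 + A) / 2   # P1 stands, P2 hits
t4 = (1 - B) ** 2 / 2            # P1 stands, P2 stands
print(t1 + t2 + t3 + t4)         # ~0.494334
```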
Quote: ImAllInNow: Computing P1's chance of winning ... Grand Total = 0.141841057 + 0.070920528 + 0.224094717 + 0.057477084 = 49.43335%
Correct!!! Welcome to the forum! Hope you'll stick around.
I'd also like to give you a free copy of my book for being the first on the forum to solve the problem. Please PM me (send me a private message) with an address.
Here is a link to my solution. It is pretty short and just goes over the major steps.
Game 1: Each player is dealt a single card from their own personal 52 card deck and can decide to either replace the card, shuffle their deck, and draw another card, or to keep their card. As in the original problem, P2 gets to act last. Highest card wins (suits don't matter).
Game 2: Same as game 1 but suits are ranked from clubs (worst), diamonds, hearts, spades (best).
Game 3: Players are dealt 5-card poker hands instead of a single card. Best 5-card hand wins.
I calculated the formulas using the same methods as in the solution in this thread. The only real difference is that now there is a chance of a tie (a smaller and smaller chance as the games go from 1 to 3).
The results I got were basically in line with the percentages given in the original problem, which is to be expected I guess:
Game 1: P1 keeps all 9 and above, P2 keeps all J and above.
Game 2: P1 keeps all 9c or better, P2 keeps all 10h or better.
Game 3: P1 keeps 44762 or better, P2 keeps 66AQ2 or better.
The resource I used to calculate the 5-card poker hands was here. I had to add in a running tally of which rank of hands each equivalence class corresponds to, based on how many different individual hands the class represents.
I didn't yet calculate P1's chance of winning in each game.
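For anyone who wants to fill in that last gap, here is a Monte Carlo sketch for Game 1. The post specifies P1's keep-point (9) and P2's keep-point against a standing P1 (J); P2's threshold when P1 redraws is not stated, so the value 8 below is only an assumed analogue of the 0.5 rule:

```python
import random

RANKS = range(2, 15)            # 2..10, J=11, Q=12, K=13, A=14; suits ignored
trials = 1_000_000
p1_wins = ties = 0
for _ in range(trials):
    x = random.choice(RANKS)
    p1_hit = x < 9              # P1 keeps 9 and above (from the post)
    if p1_hit:
        x = random.choice(RANKS)
    y = random.choice(RANKS)
    # P2 keeps J and above when P1 stands (from the post); the threshold
    # of 8 when P1 redraws is an assumption, not from the post.
    if y < (8 if p1_hit else 11):
        y = random.choice(RANKS)
    if x > y:
        p1_wins += 1
    elif x == y:
        ties += 1
print(p1_wins / trials, ties / trials)
```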