Player 1 is brought into the studio while the other contestants remain in isolation. She picks a card from a rack of 100 cards, and a scanner reveals the value of her card. She may keep the card, or choose another to try to improve. NOTE: This is NOT like the Showcase Showdown on The Price Is Right; the numbers are not added together. A second pick simply replaces the number originally chosen. This is then the secret target number to beat.
Player 2 is brought into the studio. She is not told the secret target number. She chooses a card, and its value is revealed. She may replace her number if she wishes. Whoever has the higher number continues, while the other player is eliminated.
Player 3 is brought into the studio, and the process repeats once more. Whoever has the higher number of the two wins the prize.
At what number should each player stop, or choose again?
Does any player have an advantage over another?
Initially the median value of the deck is 50.5.
Player 1 has a 50% chance of getting a number higher than 50. If she does, she should stop. If she does not, she should pick again because the majority of the remaining cards are better than her card.
If Player 2 assumes Player 1 followed proper strategy, then 50% of the time the deck has one card removed, a high one, leaving a median value of 50. The other 50% of the time it has two cards removed: the first removal was a low card, putting the median at 51, and the second removal was random, so we gain no additional information and must still assume the median is 51. So half the time the median is 50 and half the time it is 51, and the second player follows the same strategy as the first.
Extending this logic, player 3 follows the same strategy. So no player has an advantage.
I have a feeling there is a hole in my logic somewhere. Let the big brains chime in.
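One way to sanity-check the "no advantage" conclusion is a quick simulation. A sketch (my code, assuming full replacement of cards, everyone redrawing on 50 or below, and ties settled by a random playoff):

```python
import random

def final_card(threshold):
    """Draw 1-100; redraw once if the first card is <= threshold."""
    card = random.randint(1, 100)
    if card <= threshold:
        card = random.randint(1, 100)
    return card

def simulate(trials=200_000):
    wins = [0, 0, 0]
    for _ in range(trials):
        cards = [final_card(50) for _ in range(3)]
        best = max(cards)
        leaders = [i for i, c in enumerate(cards) if c == best]
        wins[random.choice(leaders)] += 1   # ties broken by random playoff
    return [w / trials for w in wins]

print(simulate())   # each share comes out near 1/3
```

Since all three players draw independently from the same distribution, the symmetry makes the 1/3 split unsurprising; the open question is whether 50 is the right threshold in the first place.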
Just 1 redraw allowed per person, right?
Quote: dwheatley: Because player 1 knows they are competing against 2 people in the end, not just 1, they probably need to shoot higher than 50.
All three players are competing to be the highest of the three. The fact that player one and two face off first doesn't change anything. It would be the same if all three revealed their cards simultaneously.
Quote: jml24: All three players are competing to be the highest of the three. The fact that player one and two face off first doesn't change anything. It would be the same if all three revealed their cards simultaneously.
I am not sure that is right. If we assume player one has drawn a card in the 50s with his first draw, I think he should assume that it will be beaten. There are up to 4 draws taking place after him, and the odds of none of those 4 having a value above 60 are quite small. Without allowing for the drawn cards, my rusty math gives me .6 x .6 x .6 x .6 = .1296. This logic probably also applies to the earlier players.
Quote: Dween
Player 3 is brought into the studio, and the process repeats once more.
Ambiguous wording. After one of the first two players is eliminated, does the winner start over again and compete with player 3 anew?
Or is the winning number selected in the first round by either player 1 or 2 the number player number 3 must beat?
Assuming it is the second scenario, then just shooting for a number over 50 is a losing approach: if you get 51 on your first pull and stand, your odds of winning against two competitors are small.
I redraw at 83 or below, stand at 84 or above. Seems about right.
There IS replacement of cards between players, even between draws of a replacement card. An inattentive contestant could feasibly pick the same number twice. In one video example, two players both chose the same card on their initial draw. I am assuming a tie is possible, and they would have a playoff.
The show is called Make Me Rich, a quarterly show from the Michigan Lottery.
Here is the video:
From the start (with contest intros)
From the instructions
Player 3 will be trying to beat the number previously drawn and held by player 1 or 2, whichever is higher. Each contestant gets one shot at picking a number.
Quote: kenarman: I am not sure that is right. If we assume player one has drawn a card in the 50s with his first draw, I think he should assume that it will be beaten. There are up to 4 draws taking place after him, and the odds of none of those 4 having a value above 60 are quite small. Without allowing for the drawn cards, my rusty math gives me .6 x .6 x .6 x .6 = .1296. This logic probably also applies to the earlier players.
I am not sure either, but player one is not certain to face four draws. What is certain is that when any player redraws after pulling a number greater than 50, he has a greater probability of lowering his number than raising it.
If no redraws are allowed, this is a simple game where each player has 1/3 probability of winning. I contend that a redraw only makes sense if it has a positive expectation of increasing your number.
Quote: jml24
If no redraws are allowed, this is a simple game where each player has 1/3 probability of winning. I contend that a redraw only makes sense if it has a positive expectation of increasing your number.
This might make sense if there were a prize for second place. Imagine if instead of 3 players there were 100 players. Would you still not re-draw if you got a 52? Use your mind to extrapolate and you will see that you will need to redraw at a number higher than 50, depending on the number of contestants. The exact math is above my abilities, but I like my guesstimate of not re-drawing at 84 or higher with 3 contestants.
.84 to the 4th power is around .5
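That back-of-envelope check is easy to reproduce: the chance that none of four fresh draws exceeds t is (t/100)^4 (a quick sketch, ignoring any removed cards):

```python
# chance that none of 4 independent draws from 1-100 exceeds t
for t in (50, 84, 90):
    p_all_at_or_below = (t / 100) ** 4
    print(t, round(p_all_at_or_below, 4))
# t = 84 gives about 0.498, i.e. roughly a coin flip that nobody beats an 84
```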
Quote: SOOPOO: I redraw at 83 or below, stand at 84 or above. Seems about right.
I agree with the spirit here, though I might hold at 78. The goal is to beat 2 other players, NOT to maximize the average value of the card you hold. For example, if you have a 52 you are not likely to beat 2 other players, so you should redraw. Most of the time you will redraw a lower-value card (and thus reduce the average value of the card you hold), but you will introduce variance and give yourself a shot at pulling something in the 80s or 90s, where you have a decent shot to win.
EDIT: Great minds think alike, I should have read the whole thread before responding. Ditto to SOOPOO's above post.
I am going to try to run some simulations to see if I can get some idea of approximately where to start, and then maybe work on the math from there.
It doesn't matter.
I ran some quick simulations, with varying strategies for each player. No matter what strategy each player had, the wins were spread out evenly, 33% each (with slight deviations, but all in all very close).
Any explanation to that?
With no extra information and no change in the game state, it's starting to be pretty clear no player has an advantage.
Quote: Dween: I think the answer is...
It doesn't matter.
I ran some quick simulations, with varying strategies for each player. No matter what strategy each player had, the wins were spread out evenly, 33% each (with slight deviations, but all in all very close).
Any explanation to that?
This can't be right. Certainly the strategy of holding 40 or less and drawing on all else (including 90+) would lose to the opposite strategy.
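bigfoot66's objection checks out by direct enumeration. A sketch (my code, assuming full replacement, with a tie counted as half a win): "keep 1-40, redraw everything else" against its mirror image:

```python
def dist(keep_low, threshold=40):
    """Probability of each final card 1-100 under one optional redraw.
    keep_low=True keeps 1..threshold and redraws the rest;
    keep_low=False redraws 1..threshold and keeps the rest."""
    p = [0.0] * 101
    for first in range(1, 101):
        redraw = (first > threshold) if keep_low else (first <= threshold)
        if redraw:
            for second in range(1, 101):
                p[second] += 1 / 10000
        else:
            p[first] += 1 / 100
    return p

def head_to_head(pa, pb):
    """P(A beats B), counting a tie as half a win."""
    Fb = [0.0] * 101
    for k in range(1, 101):
        Fb[k] = Fb[k - 1] + pb[k]
    return sum(pa[k] * (Fb[k - 1] + pb[k] / 2) for k in range(1, 101))

bad, good = dist(keep_low=True), dist(keep_low=False)
print(head_to_head(bad, good))   # well below 0.5
```

So a head-to-head between those two strategies is nowhere near even, which suggests the even three-way results came from a bug in the simulation rather than from the game itself.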
This is not order of play in effect, but simply common sense being used. With more players, it's basic to understand that a higher score is needed, but the strategy would be the same for guy 1 and guy 100.
Quote: bigfoot66:
Quote: Dween: I think the answer is...
It doesn't matter.
I ran some quick simulations, with varying strategies for each player. No matter what strategy each player had, the wins were spread out evenly, 33% each (with slight deviations, but all in all very close).
Any explanation to that?
This can't be right. Certainly the strategy of holding 40 or less and drawing on all else (including 90+) would lose to the opposite strategy.
Quote: onenickelmiracle: This is not order of play in effect, but simply common sense being used. With more players, it's basic to understand that a higher score is needed, but the strategy would be the same for guy 1 and guy 100.
But is it? I watched the clip. The last person will know if they are facing the first or second person. Heading to work now. More when I get home.
Assuming there is a tie for the lead, the winner will be randomly chosen between the leaders.
I'll work on that today if I am able.
The EV is 50/100 x 101/2 + 1/100 x 51 + 1/100 x 52 + ... + 1/100 x 100
= 101 / 4 + (51 + 52 + ... + 100) / 100
= 101 / 4 + (50 / 2 x (51 + 100)) / 100
= 101 / 4 + 7550 / 200
= 12600 / 200 = 63
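The same expectation by brute force (a quick sketch, assuming one redraw on 1-50 with replacement):

```python
# expected final card under "stand on 51+, redraw once on 50 or less"
ev = 0.0
for first in range(1, 101):
    if first > 50:                     # stand on the first card
        ev += (1 / 100) * first
    else:                              # redraw: the second card is uniform
        for second in range(1, 101):
            ev += (1 / 10000) * second
print(ev)   # 63.0
```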
And based on that...
1/100 * (1/200 * 1/2)
+ 1/100 * (1/200 + 1/200 * 1/2)
+ 1/100 * (2/200 + 1/200 * 1/2)
+ ...
+ 1/100 * (49/200 + 1/200 * 1/2)
+ 1/100 * (50/200 + 3/200 * 1/2)
+ 1/100 * (50/200 + 3/200 + 3/200 * 1/2)
+ 1/100 * (50/200 + 6/200 + 3/200 * 1/2)
+ ...
+ 1/100 * (50/200 + 144/200 + 3/200 * 1/2)
+ 1/100 * (50/200 + 147/200 + 3/200 * 1/2)
= 1/100 * (1/400 * 50 + 1/200 * (0 + 1 + 2 + ... + 49) + 1/4 * 50 + 3/400 * 50 + 3/200 * (0 + 1 + ... + 49))
= 1/100 * (50/400 + 1/400 * 2450 + 5000/400 + 150/400 + 3/400 * 2450)
= 1/100 * (2500/400 + 5000/400 + 150/400 + 7350/400)
= 15,000 / 40,000
= 3 / 8
For what values of P2 does the probability of (P2 > P1) > 3/8?
If P2 > 50, then the probability of P2 > P1 is
50/200 + (P2 - 51 + 1/2) * 3/200
= 1/200 * (50 + 3 P2 - 153 + 3/2)
= 1/200 * (3 P2 - 203/2)
This > 3/8 when 3 P2 - 203/2 > 200 x 3/8 = 75, or P2 > 353 / 6
Since P2 is an integer, P2 >= 59
Thus, Player 2 draws if his first number is 1-58 and keeps 59-100
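Both numbers can be re-derived by enumeration (my sketch; Player 1 plays "redraw on 1-50", and ties count as half a win, as in the calculation above):

```python
# Player 1's final-card distribution under "redraw on 1..50":
# 1/200 for each card 1-50, 3/200 for each card 51-100
p1 = [0.0] * 101
for k in range(1, 101):
    p1[k] = 3 / 200 if k > 50 else 1 / 200

def keep_value(j):
    """P(holding card j beats Player 1), a tie counting as half."""
    return sum(p1[k] for k in range(1, j)) + p1[j] / 2

# value of redrawing = win chance of a fresh uniform card
redraw_value = sum(keep_value(j) / 100 for j in range(1, 101))
print(redraw_value)                     # 0.375 = 3/8

# smallest first card worth keeping
threshold = min(j for j in range(1, 101) if keep_value(j) > redraw_value)
print(threshold)                        # 59: Player 2 keeps 59 and up
```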
...but wouldn't that change Player 1's strategy, which, in turn, would change Player 2's, and so on - and we haven't even mentioned Player 3 yet?
I have a nagging feeling that there is some simple explanation being missed here. Something like, you have an x% chance of choosing a number that is greater than or equal to your stop limit, and there is a (100-x)% chance of improving if you have to choose again.
At the same time, it just seems silly to pick a number and be happy with it. No one should stop with a 1. What are we missing?
1. The aim is to be Top Gun, so what matters is the probability of holding the maximum value among the three. The expected value is not useful here.
2. With replacement at every step, there is no information to be gained for the second or third player, because there is no correlation, hence no backward induction (Bayesian adaptation). Consequently, there can be no advantage or disadvantage of going first or last.
Actually, this is similar to a simultaneous game. The same is true for the variant: draw on the 0-1 continuum.
(I am not saying that the strategies are the same. Only that there is no rank advantage, because no backward info.)
In all these cases, all three players apply the same strategy. The game is fair, in that every player has a 1/3 chance of winning. (Provided everyone plays optimally. If one player has knowledge that another one is using another strategy, then he/she can exploit it.)
3. The continuum game makes the strategies less hard to compute.
- Define both your opponents' switch level as X. It means that they redraw on a value less than X, stand if they got more than X. Compute the resulting prob distribution of the value of their cards.
- Define your switch level as Y. Also a prob distribution. From this you can compute the probability that you will end up with the maximum card (given Y and X).
- So you can find the level Y*(X) that maximizes your probability of winning given X. [Note: this is a star, not a multiplication sign.]
- To finish, find the X such that Y*(X) = X. (Since the optimal strategy X* must be the same for all three.)
I have found X* = 0.4622 (not nice: there is a 4th-order polynomial to solve)
This is a Nash equilibrium: if the opponents play that strategy, then your best bet is to play it. If they don't, well... play Y*(x) if you know their x's. But anyway, their play against your X* is suboptimal, so you know you have more than 1/3 chance of winning.
4. It is to be supposed that when the number of cards is sufficiently large, the replacement game is not very different from this one.
Question: is 100 "sufficiently large"?
5. I am busy on the replacement game with only six cards, to see the structure of the solution. Alas, even in this simplest of cases, it is hard to draw the game structure. WiP
- Do the players know whether the previous player(s) have redrawn?
- Or are they just informed about the number of cards remaining in the deck?
- Or do they know nothing?
Your optimal strategy depends on what the other players do, but you can't know whether they are using a sub-optimal strategy. So you can only play optimally [EDIT] against a random strategy [/EDIT], and regardless of whether others play optimally or not, it doesn't matter what order you go in.
If you play sub-optimally (like standing on 62 in 3rd position), you are screwing yourself, but you didn't have a lesser chance because you were in 3rd.
Since no player knows the results of the other and the cards are put back for all draws/redraws, would it look a little like this?
Assuming players redraw at 50 or less:
50% of getting between 51 and 100 on first draw and stopping.
50% of getting between 1 and 50 on first draw and redrawing a number between 1 and 100.
Calculates to:
50% of 75.5 + 50% of 50.5 = 63 would be the average result you'd need to beat.
I think that would be the magic number when playing head to head. I'm no math expert .. but I'm thinking that with 3 players the stats would be a little different.
50% of 2 tries at between 51 and 100 on first draw.
50% of 2 tries at between 1 and 50 on first draw and then redrawing a number between 1 and 100.
So .. (this is likely going to hurt my brains a bit .. lol)
50% of 2/3 of average from 51 to 100 (84.3)
50% of 2/3 of average from 1 to 100 (67.7)
Calculates to:
50% of 84.3 + 50% of 67.7 = 76
So I *THINK* the stats show that the average winning number with 3 players redrawing at under 51 would be "76"? But that would include you .. not actually be the average number you'd need to beat (which I guess would be the winner of 2 .. back to the 63).
Remember you're trying to increase your chance of winning to above 33.33% .. not above 50% .. that's probably the biggest brain tease that is overlooked.
So stand at 64 and redraw at 63 or below is, I think, what you'd want to do to increase your odds of winning above 33.3% .. but only realistically expect to win with a 76 or above?
*Goes to wipe brains off ceiling* .. lol
37% of getting between 64 and 100 on first draw and stopping.
63% of getting between 1 and 63 on first draw and redrawing a number between 1 and 100.
Calculates to:
37% of 82 + 63% of 50.5 = 62.1 would be the average result.
Yeah .. weird .. but not when you actually think about it .. lol .. if others redraw threshold is higher then chances are they will end up with a lower number. But again .. I'm not sure if that's the way it would work with 3 people involved?
38% of 2 tries at between 63 and 100 on first draw.
62% of 2 tries at between 1 and 62 on first draw and then redrawing a number between 1 and 100.
So .. (this is likely going to hurt my brains a bit .. lol)
38% of 2/3 of average from 64 to 100 (88)
62% of 2/3 of average from 1 to 100 (67.7)
Calculates to:
38% of 88 + 62% of 67.7 = 75.4 average winning number with 3 players INCLUDING you if people redraw below 63.
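For what it's worth, the rough figures above can be pinned down exactly. A sketch (my code): the expected winning (higher) card between two players who both redraw on 1-50 comes out near 77.6, close to the 76 estimated earlier:

```python
# final-card distribution for "redraw on 1..50" with replacement
p = [0.0] * 101
for k in range(1, 101):
    p[k] = 3 / 200 if k > 50 else 1 / 200

# cumulative distribution, so P(max of two <= k) = F[k]^2
F = [0.0] * 101
for k in range(1, 101):
    F[k] = F[k - 1] + p[k]

e_max = sum(k * (F[k] ** 2 - F[k - 1] ** 2) for k in range(1, 101))
print(e_max)   # about 77.6
```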
Quote: MrLeft: Isn't this essentially a question of calculating probabilities?
(...)
Assuming players redraw at 50 or less:
50% of getting between 51 and 100 on first draw and stopping.
50% of getting between 1 and 50 on first draw and redrawing a number between 1 and 100.
Calculates to:
50% of 75.5 + 50% of 50.5 = 63 would be the average result you'd need to beat.
Sorry, but don't you contradict your beginning sentence (probabilities) when calculating expectations?
What we have "essentially" is a Bayesian-Nash game theory situation.
And as there is one winner and two losers, this is a zero-sum game, which means the aim is not expected value but probability of winning.
Where does the "redraw at 50" assumption come from?
Isn't it contradicted by the 63 result?
Assume your strategy is to stop at 50 or more.
If you stay, you have a 50% chance of having an average score of 75.
If you redraw, you have a 50% chance of having an average score of 50.
Therefore, you will have an average score of 62.5
(0.5*75)+(0.5*50)
Assume your strategy is to stop at 80 or more.
If you stay, you have a 20% chance of having an average score of 90.
If you redraw, you have a 80% chance of having an average score of 50.
Therefore, you will have an average score of 58
(0.2*90)+(0.8*50)
Assume your strategy is to stop at 20 or more.
If you stay, you have a 80% chance of having an average score of 60.
If you redraw, you have a 20% chance of having an average score of 50.
Therefore, you will have an average score of 58
(0.8*60)+(0.2*50)
Seems like we have a bell-curve of data here, with a peak at 50. If this is true, why did my simulation show that any strategy vs. any strategy is an evenly matched game? I may have to rewrite my program from scratch to ensure I didn't mess up somewhere.
It seems counter-intuitive that stopping at 50 will result in an average score of 62.5, and yet a strategy that stops at 62.5 ends up with an average score of about 61. Is my math sound in the above examples?
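Dween's simplified arithmetic is sound; here is an exact version (a quick sketch, one redraw on 1..t with replacement), which confirms the single peak at a threshold of 50:

```python
def avg_score(t):
    """Expected final card when you redraw on 1..t (one redraw, replacement)."""
    keep = sum(k for k in range(t + 1, 101)) / 100   # stand on t+1..100
    redraw = (t / 100) * 50.5                        # 50.5 = mean of a fresh draw
    return keep + redraw

best = max(range(101), key=avg_score)
print(best, avg_score(best))           # threshold 50 gives the top average, 63.0
print(avg_score(80), avg_score(20))    # both 58.5, matching the symmetric shape above
```

So the average-score curve does peak at a stop point of 50, exactly as the simplified 62.5/58/58 figures suggested; the catch, raised below, is that average score is not the quantity a player should be maximizing.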
Why do you calculate average scores?
Quote: kubikulann: Sorry, but don't you contradict your beginning sentence (probabilities) when calculating expectations?
What we have "essentially" is a Bayesian-Nash game theory situation.
And as there is one winner and two losers, this is a zero-sum game, which means the aim is not expected value but probability of winning.
Where does the "redraw at 50" assumption come from?
Isn't it contradicted by the 63 result?
Expected value was just a bonus! ;) ... As I said .. the real goal is to up your chances of winning to above 33.33% .. otherwise it's just dumb luck.
The 50 was an initial look at the basic cut to have a chance improve your score.
63 is essentially a different look at comparing a head-to-head average (which you'd theoretically need to beat, since you are a third player).
lol ..The logic kinda still works in my head .. a little less in writing I guess! ;)
Since I have no clue who either Bayesian or Nash are, I'm not saying you're wrong .. this was just my best guess as a non PhD in Math / Rocket Science! ;)
Quote: kubikulann: Dialogue of the deaf?
Why do you calculate average scores?
Yes, average scores is almost certainly a red herring. You want to maximize your probability of winning.
Consider the 3-card case, with 1,2,3. The only choice is to hit or stay the 2. If you stay, your probability is 1/9, 4/9, 4/9 for the 3 outcomes. If you hit, it is 2/9, 2/9, 5/9. The average score and expected value are the same, but it doesn't take much work to show that Hitting increases your probability of winning against 2 people playing any combination of the strategies.
And it doesn't matter what order you are in, you all have an equal chance of winning if you use sound strategies.
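dwheatley's 3-card claim can be checked by brute-force enumeration (my sketch; I assume a tie for the lead is broken uniformly at random, as the playoff rule above suggests):

```python
from itertools import product
from fractions import Fraction

# final-card distributions (index = card value 1..3) with one optional redraw:
STAY = [None, Fraction(1, 9), Fraction(4, 9), Fraction(4, 9)]  # redraw only the 1
HIT  = [None, Fraction(2, 9), Fraction(2, 9), Fraction(5, 9)]  # redraw the 1 and the 2

def win_prob(me, opp1, opp2):
    """P(I win) when a tie for the lead is shared uniformly."""
    total = Fraction(0)
    for a, b, c in product(range(1, 4), repeat=3):
        if a == max(a, b, c):
            share = Fraction(1, [a, b, c].count(a))   # split among tied leaders
            total += me[a] * opp1[b] * opp2[c] * share
    return total

for opps in [(STAY, STAY), (STAY, HIT), (HIT, HIT)]:
    print(win_prob(HIT, *opps) > win_prob(STAY, *opps))   # True in every case
```

Hitting the 2 beats staying on it against every combination of opponent strategies, just as claimed, even though both choices have the same expected card value.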
Quote: kubikulann: Why do you calculate average scores?
The average winning score between two contestants is the number you have to aim to beat if you want to win more than the 33.3% average of the time as a third participant (the order of play doesn't really matter in any way .. you could be first to pick but you still need to beat the average of the 2 other players)?
That being said .. if all 3 players end up with the same strategy (regardless of pulling at 50 or 63 or 84) .. then they have equal chances of winning!
(I'm pretty sure .. lol .. unless one of them is a psychic) ;)
Let X be the number up to which players redraw (and they stand with a value > X). Let x be X/100 (i.e. the percentage value).
Let k be the value of the final card obtained. Let f(k|X) be the probability distribution of k given strategy X. Let F(k|X) be the cumulative.
f(k|X) = x/100 ... if k = 1 to X
(1+x)/100 ... if k > X
F(k|X) = (x/100) k ... if k = 1 to X
((1+x)/100) k - x ... if k > X
The probability that both opponents are lower than or equal to your value k is F²(k|X).
(I assume that you win in case of a tie. Adapt to F²(k-1|X) if a tie is a loss.)
Let Y be your strategy number, given that the opponents use X (and write y = Y/100). You are trying to maximize the expected F², i.e. your probability of winning:
E = SUM(k=1 to Y) { (y/100) F²(k|X) } + SUM(k=Y+1 to 100) { ((1+y)/100) F²(k|X) }
= (y/100) SUM(k=1 to 100) { F² } + (1/100) SUM(k=Y+1 to 100) { F² }
The difference from raising Y by one step is
(1/100)² SUM(k=1 to 100) { F² } - (1/100) F²(Y+1|X),
and the optimal Y is the largest value for which this is still positive; one step further,
(1/100)² SUM(k=1 to 100) { F² } - (1/100) F²(Y+1|X) < 0.
In other words, your Y should be the card value whose F² is the average of all F².
Expanding
SUM(k=1 to 100) { F² } = SUM(k=1 to X) { (kx/100)² } + SUM(k=X+1 to 100) { (k(1+x)/100 - x)² }
yields a 4th-power polynomial in X.
The optimal Y|X is hard to express in a formula.
But we don't bother, since we know that in the equilibrium Y(X*) is equal to X*. So we simply have to solve a polynomial
Note the small x's (= X/100)
... solution is X* = 69.3
(unless I made a calculation mistake).
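The same fixed-point logic can be run numerically on the 100-card game (a sketch of my own, using the win-on-ties assumption from above): compute the final-card distribution for each switch level, find the best response, and look for a level that is its own best response.

```python
def final_dist(X):
    """P(final card = k) under 'redraw on 1..X' (one redraw, with replacement)."""
    p = [0.0] * 101
    for first in range(1, 101):
        if first <= X:
            for second in range(1, 101):
                p[second] += 1 / 10000
        else:
            p[first] += 1 / 100
    return p

DIST = [final_dist(X) for X in range(101)]     # distribution per switch level
CDF = []                                       # matching cumulatives
for p in DIST:
    F = [0.0] * 101
    for k in range(1, 101):
        F[k] = F[k - 1] + p[k]
    CDF.append(F)

def win_prob(Y, X):
    """P(a switch-level-Y player beats two switch-level-X players), winning ties."""
    return sum(DIST[Y][k] * CDF[X][k] ** 2 for k in range(1, 101))

def best_response(X):
    return max(range(101), key=lambda Y: win_prob(Y, X))

print([X for X in range(101) if best_response(X) == X])
# [69] -- redraw on 1-69, stand on 70-100, matching X* = 69.3
```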
Quote: Dween: Numbers are simplified slightly in the below example.
Assume your strategy is to stop at 50 or more.
If you stay, you have a 50% chance of having an average score of 75.
If you redraw, you have a 50% chance of having an average score of 50.
Therefore, you will have an average score of 62.5
(0.5*75)+(0.5*50)
Assume your strategy is to stop at 80 or more.
If you stay, you have a 20% chance of having an average score of 90.
If you redraw, you have a 80% chance of having an average score of 50.
Therefore, you will have an average score of 58
(0.2*90)+(0.8*50)
Assume your strategy is to stop at 20 or more.
If you stay, you have a 80% chance of having an average score of 60.
If you redraw, you have a 20% chance of having an average score of 50.
Therefore, you will have an average score of 58
(0.8*60)+(0.2*50)
Seems like we have a bell-curve of data here, with a peak at 50. If this is true, why did my simulation show that any strategy vs. any strategy is an evenly matched game? I may have to rewrite my program from scratch to ensure I didn't mess up somewhere.
It seems counter-intuitive that stopping at 50 will result in an average score of 62.5, and yet a strategy that stops at 62.5 ends up with an average score of about 61. Is my math sound in the above examples?
Nope .. that's correct .. because after your first pick of anything between 51 and 62 you are actually "resetting" those above average picks with a new 2nd pick where your average would be 50! ;)
It's weird for the brain to process .. but that's because the mind tends to tackle this problem in wanting to try to win 51% of the time ... but you only need to get to 34% to "improve" your odds over 33.333%
Quote: kubikulann: ... solution is X* = 69.3
(unless I made a calculation mistake).
Ummm .. even if I only got X% of that .. I have to agree that *69* is often a good solution! ;)
Quote: MrLeft: The average winning score between two contestants is the number you have to aim to beat
This game is confusing, so in order to show that this is not correct, let me imagine a totally different situation.
Player 1, you know it, has a 33.33% chance of having 1 and 66.66% of having 100. Consequently, his average is 67.
Player 2 can have values of 0, 50, 80 or 101. Typically you see that situations 50 and 80 are totally identical for him. Yet one is above 67 and the other is below.
So the "aim" is not 67.
Quote: kubikulann: So the Nash equilibrium is to redraw if you get 1 to 69, and stand with your card on a value of 70 to 100.
Cool. Could be a part of a paper, I'm sure. A quick simulation should verify the result.
Also nice font colour.
Quote: dwheatley: Also nice font colour.
When writing/editing, just click on "formatting codes" below ;-)
Quote: MrLeft: Since I have no clue who either Bayesian or Nash are, I'm not saying you're wrong .. this was just my best guess as a non PhD in Math / Rocket Science! ;)
NASH EQUILIBRIUM: a set of strategies where each player acts optimally given the strategies followed by the others.
BAYESIAN : game where uncertainty is present, and players have to adapt their probab calculations based on what information they see. (E.g. in a game without replacement, drawing a 5 means the previous player has not drawn a 5 so you reshape the probab distributions accordingly.)
John Nash, (pseudo-)Nobel Prize in economics for this concept, which he created in 1951. See the film "A Beautiful Mind", starring Russell Crowe and Jennifer Connelly.
Rev. Bayes, 18th-century philosopher, credited with promoting the use of the so-called Bayes formula for updating probabilities after new info.
Quote: kubikulann: This game is confusing, so in order to show that this is not correct, let me imagine a totally different situation.
Player 1, you know it, has a 33.33% chance of having 1 and 66.66% of having 100. Consequently, his average is 67.
Player 2 can have values of 0, 50, 80 or 101. Typically you see that situations 50 and 80 are totally identical for him. Yet one is above 67 and the other is below.
So the "aim" is not 67.
Yeah .. and I didn't get 67 .. I got 63 .. (not as nice as 69) lol .. and I'm not saying I'm right .. but this was my approach:
The aim is not to beat the average .. but to have an above 33.3% chance of winning. So you need to find the statistical average win between 2 players in a 2 player game .. and try tie or beat that score more than 33.3% of the time (even a tie would be favourable because then it would be a 2-way race instead of 3-way thus shifting your probabilities towards 50% from 33.3%).
In that effort, let me propose a simpler two-player problem as follows:
1. Player A is given a random number uniformly distributed between 0 and 1.
2. Player A may keep this number or trade it for another number drawn in the same way.
3. Player B repeats steps 1 and 2, without knowing anything about what happened with player A.
4. The higher score wins.
I submit that both players should stick on any first number > x. The question is, what is x?
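One way to attack this numerically (my sketch, not necessarily the intended solution): if a player redraws below x, the final value has CDF F(v|x) = x·v for v ≤ x and (1+x)·v − x for v > x. Setting the marginal value of raising your own cutoff to zero, the best cutoff y against an opponent using x satisfies F(y|x) = the average of F over [0,1], so iterating the best response finds the equilibrium:

```python
def F(v, x):
    """CDF of the final value under 'redraw below x' (one redraw)."""
    return x * v if v <= x else (1 + x) * v - x

def avg_F(x, n=20_000):
    """Numerical integral of F(v|x) over [0, 1] (midpoint rule)."""
    return sum(F((i + 0.5) / n, x) for i in range(n)) / n

def best_response(x):
    """Invert the increasing, piecewise-linear F at the value avg_F(x)."""
    a = avg_F(x)
    return a / x if a <= x * x else (a + x) / (1 + x)

x = 0.5
for _ in range(40):                # the map contracts, so this converges fast
    x = best_response(x)
print(x)
```

If I have done the algebra right, the fixed-point condition reduces to x² + x − 1 = 0, i.e. x = (√5 − 1)/2 ≈ 0.618, the golden ratio, which is what the iteration converges to.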
I think there is an advantage to going third. You only need to beat one player. The first player has to beat two others.
Quote: Ayecarumba: Do the later players have any information on the previously drawn number(s)? Do they get to see the other contestants after their numbers were revealed, or is there a chance the audience's reaction could reveal helpful information?
I think there is an advantage to going third. You only need to beat one player. The first player has to beat two others.
You can watch this video for the rules. It would seem that the players know nothing about what happened with the previous players. The audience is taken out of the studio while the first two players have their turn, and are taken back in after the third player picks his first card.
It should also be noted that the rack of cards always seems to be full, indicating that old cards are put back in. We can clearly see that both players 1 and 3 got the 62 card.
I assert there is no advantage to going first, second, or third.
BTW, somebody said in the comments of the video that the host looks like Donnie Osmond. However, I think he looks like Christopher Knight (Peter Brady).
Edit: and the guy does look like Peter Brady
Quote: Ayecarumba: The previous contestant is standing with the host when your first choice is revealed. I assert that there is an advantage to reading their reactions, so going third is best.
I don't disagree, but I think you're confusing the issue. I think we're going for a math-based optimal strategy.
You must assume they will either play optimal strategy, something similar to optimal, or totally random. I think these will determine how aggressive you need to be.
At minimum, it is safe to assume they would exchange their number if it is 49 or below.
Based on how the numbers are arranged on the display, it is safe to assume they would not choose the same number twice.
Therefore, you must assume that their number is at least 50, since they have a slightly better chance of pulling 50 - 100 (51%) than 1-49 (48%), seeing as they know where one of the low numbers is located on the grid.
What are the odds that the player prior to you pulled 76-100? .252. But that also means they had a .747 chance of pulling something 75 or below.
If you pull anything below 50, you must pull again (and be sure to not pull the same card again). You have the same opportunity as the player prior to you. There is no advantage under these conditions.
If you pull 74 in the first round, there are only 26 numbers better and 73 worse. I think a better "max" number that would trigger a re-draw would be 65: you would have 35 chances to better your position, and only 64 to do worse. 1 in 3, or 1 in 2? If it were flipping coins, I'd like my chances of getting the only "heads" against two other opponents rather than three.
In this lottery game, stopping on a number higher than 50 is the right strategy. The average score of someone who stops at 50+ turns out to be ~62.5, and no other stopping number does better; stopping at 63+ garners an average of only ~61.
Therefore, all players should stop on any number over 50.
Quote: Dween: In this lottery game, stopping on a number higher than 50 is the right strategy.
So, you would stick at 51, despite having to face two opponents that get two chances each. Would you care to back that up with any math?
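Some math on that comparison (a sketch of mine; ties split evenly among tied leaders): exact win probabilities for a "keep 51+" player against two opponents who keep 70+, versus everyone keeping 70+:

```python
def final_dist(t):
    """P(final card = k) under 'redraw on 1..t' (one redraw, with replacement)."""
    p = [0.0] * 101
    for first in range(1, 101):
        if first <= t:
            for second in range(1, 101):
                p[second] += 1 / 10000
        else:
            p[first] += 1 / 100
    return p

def win_vs_two(p_me, p_opp):
    """P(I win) against two independent p_opp players, ties shared evenly."""
    F = [0.0] * 101                    # F[k] = P(one opponent's card <= k)
    for k in range(1, 101):
        F[k] = F[k - 1] + p_opp[k]
    total = 0.0
    for k in range(1, 101):
        below, tie = F[k - 1], p_opp[k]
        # both strictly below; one ties (half share); both tie (third share)
        total += p_me[k] * (below ** 2 + below * tie + tie ** 2 / 3)
    return total

print(win_vs_two(final_dist(50), final_dist(69)))   # a bit below 1/3
print(win_vs_two(final_dist(69), final_dist(69)))   # exactly 1/3 by symmetry
```

The stand-at-51 player does win less than a third of the time against two players using the 70+ cutoff, but only by under a percentage point, which may explain why straightforward simulations of this game look dead even.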