I hope you are all doing fine, and I wish you all to stay negative (Covid-wise).

I apologize in advance for my English

My idea is a game with no house edge and no player edge.

The game is actually well known: it is the Monty Hall three-door game.

As we (hopefully) all know, the best the player can do is to switch: the player chooses a door, the host opens one of the other doors to reveal a goat, and the player, now left with two doors, is asked whether he wants to switch.

By switching, the player will win 2 out of 3 times. Therefore I want to pay, for example, $100 profit on a bet of $200. This should be a game with no house edge and no player edge.
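A quick arithmetic check of that payout (a sketch in Python, using the numbers above):

```python
from fractions import Fraction

# Switching wins 2 out of 3 games; the proposal is $100 profit on a $200 bet.
p_win = Fraction(2, 3)
bet, profit = 200, 100

ev = p_win * profit - (1 - p_win) * bet
print(ev)  # 0 -> a fair game: no house edge, no player edge
```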

My question now is: what would be the correct/best minimum and maximum bets for a given bankroll so that I, as the game provider, stay in business in the long run?

Please ask if I have not made myself clear enough.

Cheers and stay Negative

Quote: AlmondBread
In the long run, the game provider will go broke regardless of betting limits unless they have a replenishing bankroll (since the bettors collectively do). This is the nature of zero-EV games. If one side has a finite bankroll and the other doesn't, the finite bankroll is doomed. If both are finite, each side's probability of ruin is the ratio of the opponent's bankroll to the sum of the two bankrolls.

link to original post
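That ruin probability is the classical gambler's-ruin result for a fair game; a minimal sketch (the $100k/$900k figures are illustrative assumptions, not from the thread):

```python
from fractions import Fraction

def ruin_probability(own: int, opponents: int) -> Fraction:
    """P(your side goes broke) in a fair game with two finite bankrolls:
    the opponents' bankroll divided by the total of both bankrolls."""
    return Fraction(opponents, own + opponents)

# e.g. a $100k house facing $900k of collective player bankrolls
print(ruin_probability(100_000, 900_000))  # 9/10
```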

Thank you very much for taking the time to answer my question; it's much appreciated. To be frank, I understand your answer, and that is exactly what I was thinking, but I thought there would be a solution with minimum and maximum allowed bets. Isn't there a solution with a big enough bankroll relative to the min and max bets?

cheers and Stay safe

With an infinite bankroll, you'd eventually reach every positive and negative distance from EV, and you'd do this infinitely many times. With a finite bankroll, the behavior is the same until you reach the negative distance corresponding to zero bankroll, at which point you're stuck forever.

From the Wikipedia article on random walks (your scenario isn't a 1:1 coin flip, but the same thing applies):

Quote: Wikipedia
An elementary example of a random walk is the random walk on the integer number line, ℤ, which starts at 0 and at each step moves +1 or −1 with equal probability.

This walk can be illustrated as follows. A marker is placed at zero on the number line, and a fair coin is flipped. If it lands on heads, the marker is moved one unit to the right. If it lands on tails, the marker is moved one unit to the left.

...

How many times will a random walk cross a boundary line if permitted to continue walking forever? A simple random walk on ℤ will cross every point an infinite number of times. This result has many names: the level-crossing phenomenon, recurrence or the gambler's ruin. The reason for the last name is as follows: a gambler with a finite amount of money will eventually lose when playing a fair game against a bank with an infinite amount of money. The gambler's money will perform a random walk, and it will reach zero at some point, and the game will be over.
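The gambler's-ruin claim in the quoted passage can be checked empirically. A small simulation of the fair ±1 walk between two absorbing barriers (bankrolls of 3 and 7 units are assumed for speed); the empirical ruin rate should land near b/(a+b) = 0.7:

```python
import random

def simulate_ruin(a: int, b: int, trials: int = 20_000, seed: int = 1) -> float:
    """Fair +/-1 random walk: start at bankroll a, opponent holds b.
    Returns the fraction of trials in which the a-side hits zero."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        x = a
        while 0 < x < a + b:
            x += 1 if rng.random() < 0.5 else -1
        ruined += (x == 0)
    return ruined / trials

print(simulate_ruin(3, 7))  # close to b/(a+b) = 0.7
```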

Quote: AlmondBread
In this scenario, the only thing you can control with betting limits is the average time it will take to go broke.

...

link to original post

Thanks again.

I am not sure I understood it 100%, so let me ask you:

Would it be OK to have a minimum bet of $1 and a maximum of $10 with a bankroll of $100k?

Quote: seven
Would it be OK to have a min of $1 and a max of $10 with a bankroll of $100k?

link to original post

You can do what you like.

If you do that against an unlimited pool of opponents with replenishable bankrolls, expect to go broke.

I took the lazy approach and coded a sim. If "$10" means risking $10 to win $5, then the house's bankroll is 20,000 units, and it will take several billion wagers for the house to go broke. If "$10" means risking $20 to win $10, then the house will go broke in a few hundred million wagers. I didn't run a proper sample of sims, though, so if you want accurate figures I'll either run more or actually calculate it.
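AlmondBread's sim code isn't shown; a minimal sketch of what such a simulation might look like, assuming the player risks $10 to win $5, with a deliberately small house bankroll so a single run finishes quickly (`wagers_until_broke` and all parameter values here are illustrative):

```python
import random

def wagers_until_broke(bankroll=1_000, bet=10, profit=5,
                       cap=200_000, seed=7):
    """House side of the fair switch bet: the player wins 2/3 of the time
    and collects `profit`; otherwise the house collects `bet`.
    Returns the wager count at ruin, or None if the house survives `cap`."""
    rng = random.Random(seed)
    for n in range(1, cap + 1):
        if rng.random() < 2 / 3:
            bankroll -= profit   # switching player won
        else:
            bankroll += bet      # player lost
        if bankroll < profit:    # can no longer cover a payout
            return n
    return None
```

Scaling the bankroll back up to $100k pushes the expected ruin time out by orders of magnitude, in line with the "several billion wagers" figure above.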

So in real life you can probably get away with this, since you'll be dead before you have a chance to go broke from it. If you won't even see 20k wagers, then you're roughly 50/50 to finish ahead; otherwise it's still close to 50/50.

Quote: Dieter
You can do what you like.

If you do that against an unlimited pool of opponents with replenishable bankrolls, expect to go broke.

link to original post

@Dieter & @AlmondBread

Thank you for your great help! How about now adding the option for the player to stay instead of switching (as in the original game)? How would the chances look if I paid the player who stays, for example, $15 profit on a $10 wager, i.e. 1.5 times his wager?

For better understanding:

I pay 1/2 of the player's wager when the player switches: a $10 wager earns $5 profit on a win.

I pay 1 1/2 times the player's wager when the player stays: a $10 wager earns $15 profit on a win.
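The two options can be compared with a quick EV calculation (a sketch using the $10 wager from the post):

```python
from fractions import Fraction

wager = 10
p_stay = Fraction(1, 3)   # staying keeps the original 1-in-3 chance

# proposed payout for staying: $15 profit on a $10 wager
ev_stay = p_stay * 15 - (1 - p_stay) * wager
print(ev_stay)            # -5/3, i.e. about a 16.7% house edge

# a fair payout for staying would be 2:1, i.e. $20 profit on $10
ev_fair = p_stay * 20 - (1 - p_stay) * wager
print(ev_fair)            # 0
```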

I hope I did not confuse you. Thanks for your patience.

cheers

However, if you make both options fair (ie pay out 2:1 on staying) and have multiple people placing simultaneous bets, that will reduce your risk because some of them will just be betting against each other. But it also reduces your action, so if action is the reason for running this operation then you'd wanna compensate by increasing the limits.

Btw I'm not sure if you saw my post before this one because I may have submitted it as you were typing yours.

Quote: AlmondBread
Then it's a question of how many people will take that -EV option. Word would probably spread that it's a sucker bet while the other one is perfectly fair. So if I were the house setting the limits, I'd assume the worst-case scenario, namely that they'll always bet the max on the good option.

...

link to original post

All in all, if I understand you guys correctly, I just need to forget this game idea and think of a new one.

Thank you very much for the great help. I think I saved both time and money thanks to you.

cheers and Stay Negative