October 20th, 2015 at 3:11:13 AM
permalink
What is the process for calculating such house advantages? This problem seems non-trivial, and hours of trawling through math books and various books on probability have turned up blanks, as have Google and the local library. While I'm currently working on my undergraduate degree in math (junior year of college), I don't know exactly how I would represent the problem in the first place.
Important examples include: Blackjack with perfect card counting, and the various poker variants.
There are a variety of issues to solve:
1.) A given bet can be -EV yet still worth making at the maximum, because doing so reduces the overall house edge (for instance, a bet with a 1% house edge in a game that normally has a 3% house edge).
2.) Handling bet sizes that vary with the house edge is tricky. I assume a player makes one of two bets: bet 1 is the minimum, X, and bet 2 is the maximum, N*X. Depending on the value of N, the house advantage can shift quite wildly. This matters most in blackjack, where betting is effectively a continuous process.
3.) How do you handle bets with even odds? This is similar to 1, but with an even-odds bet the player is better off betting the maximum if the game is not beatable, or the minimum if it is.
When trying to work on this problem I started by making the following assumptions:
1. At first, assume the game is beatable in the long run.
2. Players play to maximise money EV, not proportional edge: if a bet has a negative house advantage the player bets the maximum, and if the bet has a neutral or positive house edge the player bets the minimum.
Is it ever rational to bet any amount other than the largest or smallest possible value? If so, in what case, and why?
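A minimal sketch of assumption 2 in Python (the function name is mine): since expected profit is linear in the stake, maximising money EV always pushes the bet to one endpoint of the allowed range.

```python
def bet_size(house_edge, min_bet=1, max_bet=4):
    """Money-EV-maximising bet under assumption 2.

    Expected profit = stake * (-house_edge) is linear in the stake, so the
    optimum is always an endpoint: the maximum when the player has the edge
    (house_edge < 0), otherwise the minimum."""
    return max_bet if house_edge < 0 else min_bet
```

One standard case where an intermediate bet becomes rational is when the player maximises log-bankroll (the Kelly criterion) rather than money EV; a concave utility makes the optimum an interior fraction of the bankroll instead of an endpoint.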
Instead of modelling real blackjack, I'm starting with a simulated game of non-independent trials. The rules are simple (so probabilities can be calculated):
Players may place a bet of $1 or $4 (call the second bet's value X).
The casino then spins a 38-number roulette wheel; if the number is 1-16 the players win at 1:1 odds, otherwise the casino wins.
However, there is a catch: between rounds the casino flips a coin. If heads, the players win on one more number than before (so they would now win on 1-17); if tails, the casino wins on one more number (so the players win only on 1-15 and the casino wins on 16-36, 0, 00). Then the players can bet anew.
This continues for an arbitrary number of spins, call it Y, and then the house resets the roulette wheel.
For what values of Y, if any, would the players have an edge? I suspect larger values of Y would give the players a real advantage, since the player can vary their bets with the changing EV while the casino can't. In a simple case like Y=3 the house would clearly keep a positive advantage.
Do bet values between $1 and $4 ever make sense? If so, why?
How does changing the value of X change the house edge?
It clearly pushes the house edge in a negative direction, but by how much?
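Under assumption 2 this drifting-wheel game can be evaluated exactly by tracking the probability distribution of k, the number of winning numbers, across the Y spins. A sketch in Python (function and parameter names are mine; it assumes the coin flip happens after each spin, as described):

```python
from collections import defaultdict

def expected_profit(Y, min_bet=1, max_bet=4, start=16, wheel=38):
    """Exact expected player profit over a Y-spin shoe of the drifting-wheel
    game: k winning numbers pay 1:1, a fair coin moves k by +/-1 between
    spins, and the player bets the maximum only when k gives a player edge."""
    dist = {start: 1.0}                 # P(k) before the current spin
    total = 0.0
    for _ in range(Y):
        for k, p in dist.items():
            edge = (2 * k - wheel) / wheel          # player EV per $1 staked
            bet = max_bet if edge > 0 else min_bet  # assumption 2
            total += p * bet * edge
        nxt = defaultdict(float)                    # coin flip drifts the wheel
        for k, p in dist.items():
            nxt[min(k + 1, wheel)] += p / 2
            nxt[max(k - 1, 0)] += p / 2
        dist = dict(nxt)
    return total
```

For Y=3 every reachable k is at most 18, so all three spins are -EV and the total is negative, matching the intuition above; scanning Y upward (and varying min_bet/max_bet, i.e. X and N*X) shows where, if anywhere, the total turns positive.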
October 20th, 2015 at 6:08:14 AM
permalink
Your roulette example is clear and similar to the way blackjack sways to and from the house. Simulations (the figures here are based on mine, but there are better sources elsewhere) show that the variation in house edge depends heavily on the number of decks and the penetration: fewer decks and deeper penetration give more opportunities where the count is favourable.
Eliot introduced a measure based on a perfect back-counter watching 100 hands and betting $100 only whenever the game was in the player's advantage; the measure is the expected profit from doing this. (In your game the counter would bet whenever 20+ numbers paid even money, so you can calculate or simulate it.) For instance I found, based on UK rules (about a 0.48% base house edge), that six decks gave $18 at 66% penetration and $27 at 83% penetration; four decks gave $26 and $39 respectively.
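For the roulette game that back-counting figure can be computed exactly with the same walk over k, the number of winning numbers: the counter stakes only on spins where more than half the wheel pays even money. A hypothetical sketch of mine (not Eliot's code):

```python
def backcount_profit(Y, bet=100, start=16, wheel=38):
    """Expected profit of a perfect back-counter who watches a Y-spin shoe
    and stakes `bet` only on spins where 20+ of the 38 numbers pay 1:1."""
    dist = {start: 1.0}                # P(k) before the current spin
    total = 0.0
    for _ in range(Y):
        total += sum(p * bet * (2 * k - wheel) / wheel
                     for k, p in dist.items() if 2 * k > wheel)
        nxt = {}
        for k, p in dist.items():      # fair-coin drift between spins
            nxt[k + 1] = nxt.get(k + 1, 0.0) + p / 2
            nxt[k - 1] = nxt.get(k - 1, 0.0) + p / 2
        dist = nxt
    return total
```

Since the counter never stakes at a disadvantage, this profit only grows with Y; the first non-zero contribution appears at spin 5, the earliest the wheel can reach 20 winning numbers.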
Based on my simulations, the approximate House Edge (6 decks, 66% pen) was
Count = -4 : 2.5%
Count = -2 : 1.2%
Count = 0 : 0.4%
Count = +1 : Even
Count = +2 : -0.6% (i.e. Player advantage)
Count = +4 : -1.4%
The game was in the player's advantage about 26% of the time. This went up to 36% of the time with 4 decks and 83% penetration.
btw if you're really into blackjack analysis, one idea is to adjust your game so you start with a shoe of cards (i.e. some number of decks) with some jokers (automatic losers) added. You win on a red card and lose on a black card or a joker. If lots of black cards (or jokers) have gone, the shoe swings in the player's favour. However, unlike your roulette game, which is a random walk, the cards will tend to move back towards the more balanced state, and the effect of removing each card will depend on the number of decks left.
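A quick Monte Carlo sketch of that red/black shoe game (the parameter values are illustrative guesses, not your blackjack figures): it tracks how often the next card favours the player as the shoe is dealt down to the penetration point.

```python
import random

def favourable_fraction(n_decks=6, n_jokers=8, penetration=0.66,
                        trials=2000, seed=1):
    """Estimate how often the red/black shoe game favours the player.

    The shoe holds 26*n_decks red cards (player wins) and
    26*n_decks + n_jokers losing cards (blacks plus jokers).  Before each
    card we check whether more winners than losers remain."""
    rng = random.Random(seed)
    shoe = [1] * (26 * n_decks) + [0] * (26 * n_decks + n_jokers)
    deal_to = int(len(shoe) * penetration)
    favourable = dealt = 0
    for _ in range(trials):
        rng.shuffle(shoe)
        reds = 26 * n_decks
        losers = 26 * n_decks + n_jokers
        for card in shoe[:deal_to]:
            favourable += reds > losers   # player edge on the next card
            dealt += 1
            if card:
                reds -= 1
            else:
                losers -= 1
    return favourable / dealt
```

Rerunning with fewer decks or deeper penetration should show the fraction of favourable spots rising, mirroring the 26% vs 36% blackjack figures above.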