tablecloth
  • Threads: 1
  • Posts: 2
Joined: Mar 28, 2014
March 28th, 2014 at 8:37:00 AM permalink
So I'm trying to figure out expected value, and I can't seem to wrap my brain around the practical meaning of expected value. Is expected value how much you can expect to win on average including how much you already spent to play it? Or is it how much you would win for each $ spent (so not counting how much you spent to play)?

A lot of examples use $1 lottery tickets as an example which I think is confusing me. What if a lottery ticket is $20? If you calculated the expected value and got 1.00, does that mean you can expect to win $1 each time you play? Or you are expected to win $1 but in reality you lose $19?

Does positive expected value for a game always mean that, over a lot of iterations, you will win, while a negative expected value mean you will lose?

Thanks in advance.
dwheatley
  • Threads: 25
  • Posts: 1246
Joined: Nov 16, 2009
March 28th, 2014 at 8:39:53 AM permalink
The convention is that expected value of a decision includes the costs, so a calculated EV of $1 for a $20 lottery ticket means you would get $21 in revenue on average, with a cost of $20. The example you actually gave has an EV of -$19.
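That convention is easy to check with a quick sketch (the ticket's odds and prize here are made up purely for illustration):

```python
# EV convention: the cost of the bet is part of the calculation.
# Hypothetical $20 ticket: a 1-in-1000 shot at a $21,000 prize.
cost = 20.0
p_win = 1 / 1000
prize = 21_000.0  # gross payout, including the returned stake

ev = p_win * (prize - cost) + (1 - p_win) * (-cost)
print(round(ev, 2))  # 1.0 -> a +$1 EV: $21 average revenue vs. a $20 cost
```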
Wisdom is the quality that keeps you out of situations where you would otherwise need it
tablecloth
  • Threads: 1
  • Posts: 2
Joined: Mar 28, 2014
March 28th, 2014 at 8:46:23 AM permalink
Got it, thanks for clearing that up.
kubikulann
  • Threads: 27
  • Posts: 905
Joined: Jun 28, 2011
March 28th, 2014 at 9:49:17 AM permalink
Quote: tablecloth

Does positive expected value for a game always mean that, over a lot of iterations, you will win, while a negative expected value mean you will lose?

Thanks in advance.

No.
It means that on average you will end up positive (resp. negative). But risk is risk: there is always a probability of being over or under the average.

Now if the average is +X per trial, the probability of ending up below 0 dwindles as you play on. How fast depends on the ratio of the expectation to the standard deviation. But remember that the spread of your total is growing (the standard deviation is proportional to the square root of the number of trials). So the probability of your total landing arbitrarily close to the average is also decreasing.

That is what I call the 'reverse gambler's fallacy': to think that an average is something certain. When scientific people tell you that you'll always lose in the end (in a neg expectation game), they are in fact distorting the truth. They rely on averages without looking at variance. In other words, it depends on what "in the end" means in a practical setting.
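A small Monte Carlo sketch of that point (the game's numbers are hypothetical): the per-trial average homes in on the EV, but the spread of the total keeps widening.

```python
import random

# Hypothetical even-money game: win $1 with p = 0.49, lose $1 otherwise,
# so the EV is -$0.02 per trial.
random.seed(1)

def total_result(n_trials):
    return sum(1 if random.random() < 0.49 else -1 for _ in range(n_trials))

for n in (100, 10_000):
    totals = [total_result(n) for _ in range(300)]
    avg_per_trial = sum(totals) / (300 * n)
    spread = max(totals) - min(totals)
    print(n, round(avg_per_trial, 3), spread)
```

The average column settles near -0.02 as n grows, while the spread column (best total minus worst total) gets wider, not narrower.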
Reperiet qui quaesiverit
AxiomOfChoice
  • Threads: 32
  • Posts: 5761
Joined: Sep 12, 2012
March 28th, 2014 at 12:13:32 PM permalink
Quote: tablecloth

So I'm trying to figure out expected value, and I can't seem to wrap my brain around the practical meaning of expected value. Is expected value how much you can expect to win on average including how much you already spent to play it? Or is it how much you would win for each $ spent (so not counting how much you spent to play)?

A lot of examples use $1 lottery tickets as an example which I think is confusing me. What if a lottery ticket is $20? If you calculated the expected value and got 1.00, does that mean you can expect to win $1 each time you play? Or you are expected to win $1 but in reality you lose $19?

Does positive expected value for a game always mean that, over a lot of iterations, you will win, while a negative expected value mean you will lose?

Thanks in advance.



Expected value is a mean (ie, an average). That's it.

If a billion people gambled exactly like you did, and you all played from a shared bankroll, your results (per person) would almost certainly be very, very close to your EV. Similarly, if you make the same bet over and over again, your average result will almost certainly be very, very close to the EV. Obviously, you will win some and lose some, but if you play for long enough and average them out, it will be close to the EV.

As for how long is "long enough" (since that is almost always the next question), the answer depends on how sure you want to be that you are close to the expected result, and on how close you want to be. Notice that I said it will almost certainly be close -- so, for any given closeness and level of certainty, there is some number of bets required to reach that level of confidence.
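A back-of-the-envelope version of that calculation, using the normal approximation (the bet's numbers are hypothetical):

```python
from statistics import NormalDist

# How many bets until your average result is within +/- eps of the EV,
# with a given confidence? Normal-approximation sketch: n = (z*sigma/eps)^2.
def bets_needed(sigma, eps, confidence):
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # two-sided z-score
    return (z * sigma / eps) ** 2

# Hypothetical even-money bet (per-bet standard deviation ~1 unit),
# wanting the average within 0.01 units of the EV, 95% sure:
print(round(bets_needed(sigma=1.0, eps=0.01, confidence=0.95)))  # ~38,415
```

Tighten eps or raise the confidence and the required number of bets grows quadratically, which is why "in the long run" can mean a very long run indeed.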
AceTwo
  • Threads: 5
  • Posts: 359
Joined: Mar 13, 2012
March 28th, 2014 at 1:45:42 PM permalink
Quote: tablecloth

So I'm trying to figure out expected value, and I can't seem to wrap my brain around the practical meaning of expected value. Is expected value how much you can expect to win on average including how much you already spent to play it? Or is it how much you would win for each $ spent (so not counting how much you spent to play)?



As other people (and you yourself) said, EV is an average.
To be more exact, it is the weighted average of all possible outcomes. By outcomes we mean win or loss (which does NOT include the initial wager).
Betting $1 and, on a win, getting $20 back (including the initial bet) means a win of $19.
To avoid the issue of whether you bet $1 or $100, you can call it 1 unit in both cases.
Simple Example:
Buy 1 lottery ticket for $1, Prize is $20 (ie win $19), Number of tickets 25.
Win: 1/25 x +$19 = +$0.76
Lose: 24/25 x -$1 = -$0.96
EV: +$0.76 - $0.96 = -$0.20
If the ticket was $10 and the prize $200 the EV would be -$2.00

But we usually use EV per unit (not for a specific $ amount) and express it as a decimal or % (% is easier for most people to understand).
So in the above example the EV is just -0.20, or -20%.
The EV of -20% applies whatever the $ amount of the bet.
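That lottery arithmetic, written out as a quick sanity check:

```python
# $1 ticket, $20 prize (a $19 net win), 1 winning ticket out of 25.
p_win = 1 / 25
ev = p_win * 19 + (1 - p_win) * (-1)
print(round(ev, 2))  # -0.2 -> -$0.20 per $1 ticket, i.e. -20% per unit
```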

Quote: tablecloth


Does positive expected value for a game always mean that, over a lot of iterations, you will win, while a negative expected value mean you will lose?


The simple answer is YES.

The complicated answer is: tell me the number of iterations, and I will tell you the probability that you will be ahead (in a positive-EV game), and vice versa.
As the number of iterations increases, that probability approaches 100%, but theoretically it never reaches 100%.
Take a theoretical coin-toss game where you bet $1 and win $1,000 on heads: anyone can see that it is 'impossible' to lose even over just 100 iterations.
The probability of coming out ahead might be 99.9999999% (or whatever), but there is still a theoretical chance of losing.
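That 'theoretical chance' is easy to pin down: since a single $1,000 win more than covers 100 losses, you only finish behind if you never win once.

```python
# Bet $1, win $1,000 on heads: finishing behind over 100 tosses
# requires losing all 100 of them.
p_behind = 0.5 ** 100
print(p_behind)  # ~7.9e-31 -- astronomically small, but not zero
```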
RS
  • Threads: 62
  • Posts: 8626
Joined: Feb 11, 2014
March 28th, 2014 at 4:08:13 PM permalink
There are actually 2 ways to look at EV.

The first is a percentage, or win/loss amount. If you look up the EV charts for blackjack playing decisions in Don's BJA, I believe this type of EV is used. For example, it might say +0.2581, which means you have a 25.81% edge (i.e., for every $100 bet, you expect to win $25.81). This is more common for games where you place a bet and then get it back after a win or push (table games). It is expressed as a positive or negative number; an EV of 0 is a break-even proposition.

The other way to look at EV is the amount you win back on average, not counting your losing wager. This is slot or video poker stuff, since in those games you bet some amount (say $1), that $1 is gone, and you now have a chance to win it back plus some. Say you're dealt a pair of jacks (in 9/6 JoB): your EV is $1.53, based on https://wizardofodds.com/games/video-poker/strategy/jacks-or-better/9-6/optimal/ . This type of EV is always positive, since it's only looking at how much you're getting back. Actually, I think this is usually referred to as ER, or Expected Return.

For the example you gave ($20 ticket, and the EV is $1)....using the first type of EV, you will win $1 and get your original $20 wager back. Using the second type means you'll lose your $20 wager and win $1 back (for a $19 net loss).
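The two conventions on that hypothetical $20 ticket, side by side (the numbers are the thread's made-up example):

```python
wager = 20.0

# 1) Edge-style EV (table-game convention): the net result, with the
#    wager treated as part of the calculation.
net_ev = 1.0                      # +$1 overall, on average

# 2) Return-style "EV" / ER (slot and video poker convention): only the
#    money coming back, the wager already being gone.
expected_return = wager + net_ev  # $21.00 back, on average

print(expected_return - wager)    # 21.0 - 20.0 = 1.0: same net either way
```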