May 16th, 2018 at 11:22:05 AM
Hi everyone,
I'd like to calculate the standard deviation per hour in dollars for a game, given the variance data. What I mean is a value that tells me how much money corresponds to 1 SD/h (+ or -). Any idea on how to do that?
Example can be $100 bet with 2.5 variance, 30 rounds/h.
Thanks
May 16th, 2018 at 11:47:07 AM
Quote: gunbj
Hi everyone,
I'd like to calculate the standard deviation per hour in dollars for a game, given the variance data. What I mean is a value that tells me how much money corresponds to 1 SD/h (+ or -). Any idea on how to do that?
Example can be $100 bet with 2.5 variance, 30 rounds/h.
Thanks
Variance is additive for uncorrelated events, so the variance of the sum of 30 bets, each with variance 2.5, is 30 * 2.5 = 75 (in squared bet units).
The standard deviation of the sum of those 30 $100 bets is therefore $100 times the square root of 75, which is about $866.
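The calculation above can be sketched as a small helper. This is a minimal sketch, assuming the quoted variance figure is per round and expressed in squared bet units; the function name is hypothetical, not from the thread.

```python
import math

def sd_per_hour(bet, variance_per_round, rounds_per_hour):
    """Dollar value of one standard deviation over an hour of play.

    Variance is additive across uncorrelated rounds, so the hourly
    variance (in squared bet units) is rounds * variance, and the
    hourly SD in dollars is bet * sqrt(rounds * variance).
    """
    return bet * math.sqrt(rounds_per_hour * variance_per_round)

# The example from the thread: $100 bet, variance 2.5, 30 rounds/h.
print(round(sd_per_hour(100, 2.5, 30), 2))  # about 866.03
```

Swapping in a different bet size or pace just scales the result: the hourly SD grows linearly with the bet but only with the square root of the number of rounds.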