dicemandingo
Joined: Jun 22, 2013
June 22nd, 2013 at 10:37:32 AM
I was wondering if someone could answer a confidence interval question regarding dice throws at the craps table.

In craps you are supposed to have a rolls-to-sevens ratio (RSR) of 6.0, since a random shooter throws a seven with probability 6/36 = 1/6.

I am curious what sample size would be necessary to have 90% (and/or 95%, 99%) confidence that I am rolling at better than a 6.0 RSR.
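For concreteness, here is the kind of sample-size calculation I have in mind: a rough one-sided normal-approximation sketch, assuming each roll is an independent trial with P(seven) = 1/6 under the null, and assuming the observed average lands right on the shooter's true rate (the `rolls_needed` helper is just my own illustration):

```python
# Rough sample-size estimate, assuming each roll is an independent
# Bernoulli trial with P(seven) = 1/6 under the null (random shooter),
# using a one-sided normal approximation. "Confidence" here is 1 - alpha,
# and the observed average is assumed to equal the true rate.
from math import sqrt
from statistics import NormalDist

P0 = 1 / 6  # null seven probability per roll, i.e. RSR = 6.0

def rolls_needed(true_rsr, confidence):
    """Rolls at which a shooter truly averaging `true_rsr` would, on an
    average run, show a result significant at the given confidence."""
    p1 = 1 / true_rsr                     # alternative seven probability
    z = NormalDist().inv_cdf(confidence)  # one-sided critical value
    return int((z * sqrt(P0 * (1 - P0)) / (P0 - p1)) ** 2) + 1

for conf in (0.90, 0.95, 0.99):
    print(f"{conf:.0%}: about {rolls_needed(6.4, conf):,} rolls at a true 6.4 RSR")
```

If I have that right, a true 6.4 RSR would take on the order of 2,100 / 3,500 / 6,900 rolls for 90% / 95% / 99% confidence.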

There are some (possibly) useful numbers for this question in this Wizard of Odds appendix:

Let us assume that I have a 5,000 roll sample averaging a 6.4 RSR. How statistically meaningful would that be? 10,000 rolls? 20,000?

How do those numbers change for the 5,000 (and 10,000 and 20,000) roll samples if the RSR is something higher, say 6.7?
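To score a sample I already have, I would turn the observed RSR back into a seven frequency and run a one-sided z-test against the null of 1/6. A sketch under the same independence assumption (`p_value` is again my own naming):

```python
# One-sided z-test for an observed RSR over n rolls, same assumptions:
# independent rolls, P(seven) = 1/6 under the null. The p-value is the
# chance a random shooter does at least this well.
from math import sqrt
from statistics import NormalDist

P0 = 1 / 6

def p_value(n_rolls, observed_rsr):
    p_hat = 1 / observed_rsr            # observed seven frequency
    se = sqrt(P0 * (1 - P0) / n_rolls)  # standard error under the null
    z = (p_hat - P0) / se               # negative when sevens are scarce
    return NormalDist().cdf(z)          # one-sided p-value

for n in (5_000, 10_000, 20_000):
    for rsr in (6.4, 6.7):
        print(f"{n:>6} rolls at {rsr} RSR: p = {p_value(n, rsr):.5f}")
```

By that approximation, 5,000 rolls at 6.4 RSR comes out around p ≈ 0.024, roughly two standard deviations below expectation, while 6.7 over the same sample would be past three, unless I have botched the algebra.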

Is the variance simply too high to have any sort of confidence in those numbers?
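To sanity-check the variance question directly, a quick Monte Carlo of a purely random shooter should show how often luck alone clears these bars (the 5,000-trial count is arbitrary):

```python
# Monte Carlo on the variance question: how often does a purely random
# shooter clear 6.4 RSR over 5,000 rolls by luck alone? A seven shows
# with probability 6/36 = 1/6, so each roll is a Bernoulli(1/6) trial.
import random

def random_shooter_rsr(n_rolls, rng):
    sevens = sum(rng.random() < 1 / 6 for _ in range(n_rolls))
    return n_rolls / sevens

rng = random.Random(1)  # fixed seed so the run is repeatable
trials = 5_000          # arbitrary trial count
hits = sum(random_shooter_rsr(5_000, rng) >= 6.4 for _ in range(trials))
print(f"random shooter hit 6.4+ RSR in {hits / trials:.1%} of trials")
```

I would expect that to land in the low single-digit percents, consistent with the z-test above.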