I have *n* *k*-sided dice that I would like to roll. Rather than rolling the *n* dice individually (which would take O(n) time), I'm going to assume that the distribution of the sum is approximately normal (it is, for large *n*) and randomly select an integer *z* between 0 and 1000, where *z*/10 is the percentile; this percentile then determines the simulated value, in O(1) time. In turn, this requires that I know the mean and the standard deviation of the sum. The mean is easy: it is *n*(*k* + 1)/2. The standard deviation, however, appears to be a bit harder.

Using the **multinomial distribution** appears to be a good start, but I've forgotten how to combine the standard deviations of the individual random variables into a single value for the whole sum. Can I simply multiply the standard deviation for each value by the value itself? Thanks.
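
For what it's worth, here is a minimal sketch of the O(1) scheme described above. It assumes the standard facts that a single *k*-sided die has mean (k + 1)/2 and variance (k² − 1)/12, so the sum of *n* independent dice has mean n(k + 1)/2 and standard deviation √(n(k² − 1)/12); the function name and clamping behavior are my own choices, not from the question:

```python
import random
from statistics import NormalDist

def roll_sum_approx(n: int, k: int) -> int:
    """O(1) approximation of the sum of n k-sided dice via a normal model.

    Mean of the sum: n*(k + 1)/2.
    Variance of one die: (k**2 - 1)/12, so the variance of the sum of
    n independent dice is n*(k**2 - 1)/12.
    """
    mu = n * (k + 1) / 2
    sigma = (n * (k * k - 1) / 12) ** 0.5
    # Pick an integer z; z/10 is the percentile. The endpoints 0 and
    # 1000 are excluded because the inverse normal CDF diverges there.
    z = random.randint(1, 999)
    value = NormalDist(mu, sigma).inv_cdf(z / 1000)
    # Clamp to the achievable range [n, n*k] and round to an integer.
    return max(n, min(n * k, round(value)))
```

Drawing *z* from only 1000 equally spaced percentiles discretizes the distribution slightly; drawing a uniform float in (0, 1) instead would remove that artifact at the same cost.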