Here's a question: the boss specifies that the margin of error should be 5 at 95% confidence. The standard deviation is known to be 10. What is the minimum sample size required to achieve that margin of error?
This question is important because sampling is expensive, and we don't want to take more samples than we need to reach a specific level of accuracy. There is no point. You are throwing away money.
Recall how we find the margin of error at 95% confidence by hand:
margin of error = 1.96 * SE, and the standard error is the standard deviation divided by the square root of the sample size: SE = sd / √n.
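Here is a minimal sketch of that formula in Python (the function name and the example numbers are just for illustration, not part of the problem statement):

import math

def margin_of_error(sd, n, z=1.96):
    # Margin of error = z * standard error = z * sd / sqrt(n)
    return z * sd / math.sqrt(n)

print(margin_of_error(sd=10, n=16))  # 4.9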
Here we are given the margin of error (the boss says make it 5) and we know the standard deviation is 10. Put in what you know:
5 = 1.96 * 10 / √n
Do the algebra and switch around the 5 and the √n,
so √n = 1.96 * 10 / 5
√n = 3.92
We want n, not √n. So SQUARE both sides to get
n = 3.92 * 3.92 = 15.37, BUT now you MUST round UP to 16. It is not possible to have an incomplete sample, and rounding down to 15 would leave the margin of error slightly larger than 5.
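For readers who prefer to let the computer do the algebra, here is a minimal sketch in Python that follows the same steps (the function name is my own, and 1.96 is hard-coded for 95% confidence):

import math

def min_sample_size(moe, sd, z=1.96):
    # Solve moe = z * sd / sqrt(n) for sqrt(n), then square and round UP,
    # since an incomplete sample is not possible.
    root_n = z * sd / moe          # sqrt(n) = 1.96 * 10 / 5 = 3.92
    return math.ceil(root_n ** 2)  # ceil(15.37) = 16

print(min_sample_size(moe=5, sd=10))  # 16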