A regional chain of fast-food restaurants would like to estimate the mean time required to serve its drive-thru customers
ID: 3379277 • Letter: A
Question
A regional chain of fast-food restaurants would like to estimate the mean time required to serve its drive-thru customers. Because speed of service is a critical factor in the success of its restaurants, the chain wants to have as accurate an estimate of the mean service time as possible. If the population standard deviation of service times is known to be 30 seconds, how large a sample must be used to estimate the mean service time for drive-thru customers if a margin of error of no more than ±10 seconds of the true mean with a 99% level of confidence is required? Suppose the manager believes that a margin of error of ±10 seconds is too high and has decided it should be cut in half to a margin of error of ±5 seconds. He is of the opinion that by cutting the margin of error in half, the required sample size will double over what was required for a margin of error of ±10. Is the manager correct concerning the sample-size requirement for the reduced margin of error? (Provide supporting calculations.)
Explanation / Answer
A)
Note that
n = z(alpha/2)^2 σ^2 / E^2
where
alpha/2 = (1 - confidence level)/2 = 0.005
Using a table/technology,
z(alpha/2) = 2.575829304
Also,
σ = population standard deviation = 30 seconds
E = margin of error = 10
Thus,
n = (2.575829 × 30 / 10)^2 ≈ 59.714
Rounding up,
n = 60 [ANSWER]
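The calculation above can be checked in a few lines of Python. This is an illustrative sketch, not part of the original answer; it uses the standard library's `statistics.NormalDist` to obtain the z-value for alpha/2 = 0.005 and rounds the result up, since a sample size must be a whole number at least as large as the computed value.

```python
from math import ceil
from statistics import NormalDist

def sample_size(sigma, E, confidence):
    """Minimum n so a confidence interval for the mean has margin of error E."""
    # z-value cutting off alpha/2 = (1 - confidence)/2 in the upper tail
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    # n = (z * sigma / E)^2, rounded up to the next whole observation
    return ceil((z * sigma / E) ** 2)

print(sample_size(sigma=30, E=10, confidence=0.99))  # 60
```

The raw value before rounding is about 59.714, so rounding up gives n = 60, matching the answer above.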
B)
Note that
n = z(alpha/2)^2 σ^2 / E^2
where
alpha/2 = (1 - confidence level)/2 = 0.005
Using a table/technology,
z(alpha/2) = 2.575829304
Also,
σ = population standard deviation = 30 seconds
E = margin of error = 5
Thus,
n = (2.575829 × 30 / 5)^2 ≈ 238.856
Rounding up,
n = 239 [ANSWER]
So the manager is incorrect: halving the margin of error quadruples the required sample size rather than doubling it. Because n is proportional to 1/E^2, cutting E in half multiplies n by exactly 4 (238.856 = 4 × 59.714).
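The quadrupling can be verified directly, since the ratio n(E/2) / n(E) = (E / (E/2))^2 = 4 regardless of σ or the confidence level. A short sketch (assumed values σ = 30 and 99% confidence, as in the problem):

```python
from statistics import NormalDist

z = NormalDist().inv_cdf(0.995)   # z for 99% confidence (alpha/2 = 0.005)
sigma = 30                        # population standard deviation (seconds)

def n_raw(E):
    # required sample size before rounding up
    return (z * sigma / E) ** 2

ratio = n_raw(5) / n_raw(10)
print(n_raw(10), n_raw(5), ratio)  # ratio is exactly 4, not 2
```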