jchay02 · 24.03.2020 · Mathematics

Suppose it is desired to estimate the average time a customer spends in Dollar Tree to within 5 minutes at 99% reliability. It is estimated that the standard deviation of the times is 15 minutes. How large a sample should be taken to get the desired interval?
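One standard way to work this out is the large-sample margin-of-error formula for estimating a mean, n ≥ (z·σ/E)². Here E = 5 minutes, σ = 15 minutes, and for 99% confidence the usual two-sided normal critical value is z ≈ 2.576 (an assumed textbook value; some texts round to 2.58, which gives the same answer). A sketch of the calculation, rounding up since n must be a whole number:

\[
n \;\ge\; \left(\frac{z_{\alpha/2}\,\sigma}{E}\right)^{2}
      \;=\; \left(\frac{2.576 \times 15}{5}\right)^{2}
      \;=\; (7.728)^{2}
      \;\approx\; 59.7
      \;\Longrightarrow\; n = 60.
\]

So, under these assumptions, a sample of about 60 customers should be enough to estimate the mean time to within 5 minutes at the 99% confidence level.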
