Computers from a particular company are found to last on average for three years
Question
Computers from a particular company are found to last on average for three years without any hardware malfunction, with a standard deviation of two months. At least what percent of the computers last between 31 months and 41 months?
What is the smallest number of standard deviations from the mean that we must go if we want to ensure that we capture at least 80% of the data of a distribution?
From past experience, it is known that the number of tickets purchased by a person standing in line at the ticket window for the football match follows a distribution with mean mu = 4.5 and standard deviation sigma = 2.0. Suppose that a few hours before the start of one of these matches there are 40 persons standing in line to purchase tickets. If only 190 tickets remain, what is the probability that all 40 persons will be able to purchase the tickets they desire?
Explanation / Answer
1) Mean = 3 years = 36 months, standard deviation = 2 months.
Both 31 and 41 months fall k = 5/2 = 2.5 standard deviations away from the mean.
Hence, by Chebyshev's inequality, P(31 < X < 41) >= 1 - 1/k^2 = 1 - 1/2.5^2 = 0.84, so at least 84% of the computers last between 31 and 41 months.
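A minimal Python sketch of this arithmetic (variable names are my own, values taken from the problem):

mean_months = 36.0                        # 3 years expressed in months
sd_months = 2.0
k = (41 - mean_months) / sd_months        # 2.5 standard deviations on each side
bound = 1 - 1 / k**2                      # Chebyshev lower bound on the fraction within k sd
print(k, bound)                           # 2.5 0.84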
2) To guarantee at least 80% of the data of any distribution, Chebyshev requires 1 - 1/k^2 >= 0.80, i.e. k^2 >= 5.
Hence the smallest number of standard deviations is k = sqrt(5), approximately 2.24.
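A quick Python check of this bound (a sketch, assuming the 80% coverage target stated in the question):

import math

coverage = 0.80
k_min = math.sqrt(1 / (1 - coverage))     # solve 1 - 1/k**2 = coverage for k
print(k_min)                              # 2.236..., i.e. sqrt(5) standard deviations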
3) Let S be the total number of tickets bought by the 40 people. By the Central Limit Theorem, S is approximately normal with mean = 40 * 4.5 = 180 and standard deviation = 2 * sqrt(40) = 12.649.
All 40 persons can buy their tickets when S <= 190, so with a continuity correction P(S <= 190) = P(Z < (190.5 - 180)/12.649) = P(Z < 0.83) = 0.797, approximately.
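A short Python sketch of this normal approximation (assumes scipy is available; 190.5 is the continuity-corrected cutoff for P(S <= 190)):

import math
from scipy.stats import norm

mu, sigma, n = 4.5, 2.0, 40
mean_total = n * mu                       # 180 tickets expected in total
sd_total = sigma * math.sqrt(n)           # about 12.649
z = (190.5 - mean_total) / sd_total       # about 0.83
print(z, norm.cdf(z))                     # about 0.83, probability about 0.797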