Question
A baseball player at second base throws a ball 90 feet to the player at first base. The ball is released at a point 5 feet above the ground with an initial velocity of 50 miles per hour and at an angle of 15 degrees above the horizontal. At what height does the player at first base catch the ball? Take the acceleration due to gravity to be a constant 32 feet per second squared and ignore air resistance. Hints: there are 3600 seconds in 1 hour, there are 5280 feet in 1 mile, and see Section 12.3, Example 6 for a similar problem.
Explanation / Answer
The position vector of the projectile is
r(t) = (v₀ cos θ) t i + (h + (v₀ sin θ) t − 16t²) j

Horizontal component: x(t) = (v₀ cos θ) t
Vertical component: y(t) = h + (v₀ sin θ) t − 16t²

Given data: θ = 15°, h = 5 ft, initial speed v₀ = 50 mph.

Convert the speed to feet per second:
v₀ = 50 mph = (50 × 5280)/3600 ≈ 73.33 ft/s

So r(t) = (73.33 cos 15°) t i + (5 + (73.33 sin 15°) t − 16t²) j

The ball reaches first base when the horizontal distance equals 90 ft:
90 = (73.33 cos 15°) t  ⇒  t ≈ 1.27 s

The height at that instant is
y(1.27) = 5 + (73.33 sin 15°)(1.27) − 16(1.27)² ≈ 3.29 ft

So the player at first base catches the ball about 3.29 feet above the ground.
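The calculation above (unit conversion, solving for the flight time, then evaluating the height) can be checked with a short script; variable names here are illustrative, not from the problem statement:

```python
import math

# Given data from the problem statement
v0_mph = 50.0               # initial speed in miles per hour
theta = math.radians(15)    # launch angle above the horizontal
h0 = 5.0                    # release height, ft
distance = 90.0             # horizontal distance to first base, ft

# Convert mph to ft/s: 5280 ft per mile, 3600 s per hour
v0 = v0_mph * 5280 / 3600   # ≈ 73.33 ft/s

# Time when the ball has traveled 90 ft horizontally: x(t) = (v0 cos θ) t
t = distance / (v0 * math.cos(theta))   # ≈ 1.27 s

# Height at that time: y(t) = h + (v0 sin θ) t − 16 t²  (½g = 16 ft/s²)
height = h0 + v0 * math.sin(theta) * t - 16 * t**2

print(round(height, 2))   # ≈ 3.29 ft
```

Running it confirms the catch height of about 3.29 feet.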