An object launched upward with an initial velocity of 9.8 m/s from a height of 73.5 m has a height given by s(t) = -4.9t^2 + 9.8t + 73.5, where s is in meters and t is in seconds. How long will it take the object to hit the ground?
I received this homework problem and I can't figure out how to start it. I was wondering if anyone could at least get me on the right track. Thank you.
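Hint: hitting the ground means s(t) = 0, so you're solving the quadratic -4.9t^2 + 9.8t + 73.5 = 0 and keeping the non-negative root. As a way to check whatever answer you get by hand, here's a minimal Python sketch of that (variable names are just illustrative):

```python
import math

# Coefficients of s(t) = -4.9*t**2 + 9.8*t + 73.5 from the problem
a, b, c = -4.9, 9.8, 73.5

# Quadratic formula: t = (-b +/- sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c
t1 = (-b + math.sqrt(disc)) / (2*a)
t2 = (-b - math.sqrt(disc)) / (2*a)

# Time must be non-negative, so keep the physical root
t_ground = max(t1, t2)
print(t_ground)  # about 5 seconds
```

You can also get there without the formula: dividing the equation by -4.9 gives t^2 - 2t - 15 = 0, which factors as (t - 5)(t + 3) = 0.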