An object is thrown in the air with an initial velocity of 5 m/s from a height of 9 m. The equation h(t) = –4.9t² + 5t + 9 models the height of the object in meters after t seconds. About how many seconds does it take for the object to hit the ground? Round your answer to the nearest hundredth of a second.

Question 11 options:

0.94 seconds


1.50 seconds


2.00 seconds


9.00 seconds

Answer:

Ematio
After being thrown, the object takes about 2 seconds to hit the ground. We can find this by looking at the second t-intercept of the graph of h(t) — the point on the right where the parabola crosses the t-axis.

Answer:

The time taken by the object to hit the ground is 2 seconds.

Step-by-step explanation:

It is given that,

Initial velocity of an object, u = 5 m/s

Height, h = 9 m

The equation that models the height of the object in meters after t seconds is :

[tex]h(t)=-4.9t^2+5t+9[/tex]

We have to find the time for the object to hit the ground.

i.e. [tex]-4.9t^2+5t+9=0[/tex]        

On solving the above quadratic equation, we get the value of time t is :
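The roots follow from the quadratic formula with a = –4.9, b = 5, c = 9:

[tex]t=\dfrac{-b\pm\sqrt{b^2-4ac}}{2a}=\dfrac{-5\pm\sqrt{25+176.4}}{-9.8}=\dfrac{-5\pm\sqrt{201.4}}{-9.8}[/tex]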

t ≈ 1.96 seconds

(The other root, t ≈ –0.94 seconds, is negative and therefore not physically meaningful — note that 0.94 seconds appears among the options as a trap.)
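As a quick check, the positive root can be computed numerically — a minimal sketch using only Python's standard library:

```python
import math

# Coefficients of h(t) = -4.9*t**2 + 5*t + 9
a, b, c = -4.9, 5.0, 9.0

# Quadratic formula: t = (-b ± sqrt(b**2 - 4ac)) / (2a)
disc = b * b - 4 * a * c                 # 25 + 176.4 = 201.4
t1 = (-b + math.sqrt(disc)) / (2 * a)
t2 = (-b - math.sqrt(disc)) / (2 * a)

# Only the positive root is physical (time cannot be negative)
t_ground = max(t1, t2)
print(round(t_ground, 2))  # 1.96
```

Rounded to the nearest hundredth this is 1.96 seconds, which matches the "2.00 seconds" option most closely.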

Hence, the correct option is the third one, "2.00 seconds".