A stone is dropped from a height of 9 meters above the ground. If the height function can be modeled by the equation h(t) = a − t², where t is time in seconds, h is height in meters, and a is the initial height, how many seconds does it take for the stone to hit the ground?
The height of the stone is given by the function h(t) = a − t².
At time t = 0, h = 9:
∴ 9 = a − 0²
a = 9
The stone hits the ground when h = 0:
0 = 9 − t²
t² = 9
t = ±3
Since time cannot be negative, t = 3 seconds. (Answer)
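The calculation above can be sketched in a few lines of Python. The function name `time_to_ground` is my own choice for illustration; it solves 0 = a − t² for the non-negative root t = √a.

```python
import math

def time_to_ground(initial_height):
    # h(t) = a - t^2 reaches 0 when t = sqrt(a); take the non-negative root
    return math.sqrt(initial_height)

print(time_to_ground(9))  # → 3.0
```

With an initial height of 9 meters, this returns 3.0 seconds, matching the algebra above.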