Need some help with this problem... I feel like I'm missing a variable or something.
A ball is thrown into the air and doesn't return to the ground for 6.25 seconds. Determine the initial velocity at which the ball was thrown.
In a lot of physics problems, you have to make some assumptions! In this problem, we'll assume the thrower is on Earth (so the acceleration due to gravity is -9.8 m/s^2). We'll also treat air resistance as negligible, since no data about the ball was given.
Given that, we also know the velocity of the ball at the peak of its arc is 0 (at least vertically, which is the only direction we care about). And since the ball rises and falls in the same amount of time, it reaches its max height at the halfway point: 6.25 / 2 = 3.125 seconds.
Now we have a = -9.8 m/s^2, Vf = 0 m/s, and t = 3.125 s.
You can use Vf = Vi + a*t.
Rearranging to solve for the initial velocity: Vi = Vf - a*t = 0 - (-9.8)(3.125) = 30.625 m/s, so the ball was thrown at about 30.6 m/s upward.
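If it helps to see the arithmetic spelled out, here's a quick Python sketch of the steps above (the variable names are just my own labels for the quantities in the problem):

```python
# Kinematics: v_f = v_i + a*t, rearranged to v_i = v_f - a*t
a = -9.8            # acceleration due to gravity, m/s^2 (up is positive)
total_time = 6.25   # total time in the air, s
t = total_time / 2  # time to reach the peak, s (rise and fall are symmetric)
v_f = 0.0           # vertical velocity at the peak, m/s

v_i = v_f - a * t   # initial vertical velocity, m/s
print(v_i)          # 30.625
```

So the ball leaves the hand at 30.625 m/s, which rounds to about 30.6 m/s.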