A commuter regularly drives 70 miles from home to work, and the amount of time required for the trip varies widely as a result of road and traffic conditions. The average speed for such a trip is a function of the time required. For example, if the trip takes 2 hours, then the average speed is

70 / 2 = 35 miles per hour.

(a) What is the average speed if the trip takes an hour and a half? (Round your answer to three decimal places.)


The average speed is distance divided by time. Here the distance is 70 miles and the time is 1.5 hours, so the speed is s = 70 / 1.5 ≈ 46.667 miles per hour.
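As a quick sanity check, the arithmetic (and the rounding to three decimal places) can be verified with a short Python snippet; the variable names are just illustrative:

```python
# Average speed = distance / time, rounded to three decimal places.
distance_miles = 70
time_hours = 1.5

speed_mph = distance_miles / time_hours
print(round(speed_mph, 3))  # 46.667
```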
