eybrj2 wrote:
A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)
A. 65
B. 68
C. 70
D. 75
E. 80
The total distance is 40 miles, and we want the average speed to be 60 miles per hour. Average speed = (total distance)/(total time)
So, we get: 60 = (40 miles)/(total time)
Solve the equation to get: total time = 2/3 hours
So, the TIME for the ENTIRE 40-mile trip needs to be 2/3 hours.

The driver completed the first 20 miles of the trip at an average speed of 50 miles per hour. How much time was spent on this FIRST PART of the trip?
time = distance/speed
So, time = 20/50 = 2/5 hours

The ENTIRE trip needs to take 2/3 hours, and the FIRST PART of the trip took 2/5 hours:
2/3 hours - 2/5 hours = 10/15 hours - 6/15 hours = 4/15 hours
So, the SECOND PART of the trip needs to take 4/15 hours.

The SECOND PART of the trip is 20 miles, and the time is 4/15 hours.
Speed = distance/time
So, speed = 20/(4/15) = (20)(15/4) = 75 mph
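If you want to double-check the arithmetic, here is a short Python sketch using exact fractions (the variable names are just illustrative):

```python
from fractions import Fraction

# Target: average 60 mph over 40 miles -> total time = 40/60 = 2/3 h
total_time = Fraction(40, 60)

# First 20 miles at 50 mph -> 20/50 = 2/5 h
first_part_time = Fraction(20, 50)

# Time left for the second 20 miles: 2/3 - 2/5 = 4/15 h
second_part_time = total_time - first_part_time

# Required speed = distance / time = 20 / (4/15) = 75 mph
required_speed = 20 / second_part_time
print(required_speed)  # 75
```

Working in fractions rather than decimals avoids any rounding and mirrors the hand calculation step for step.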
Answer: D
Cheers,
Brent
_________________
Brent Hanneson – Creator of gmatprepnow.com
I’ve spent the last 20 years helping students overcome their difficulties with GMAT math, and the biggest thing I’ve learned is…
Many students fail to maximize their quant score NOT because they lack the skills to solve certain questions, but because they don't understand what the GMAT is truly testing.