
A driver completed the first 20 miles of a 40-mile trip at an average

Updated on: 30 Oct 2018, 00:09

Difficulty: 35% (medium)

Question Stats: 76% (01:57) correct, 24% (02:37) wrong, based on 1241 sessions

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

A. 65 B. 68 C. 70 D. 75 E. 80

09 Mar 2014, 14:05

SOLUTION

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

\(average \ speed=\frac{total \ distance}{total \ time}=\frac{40}{total \ time}=60\). This implies that for the average time to be 60 miles per hour, the total time must be 40/60 = 2/3 hours.

Now, the first 20 miles were covered in (time) = (distance)/(speed) = 20/50 = 2/5 hours.

Thus, the remaining 20 miles should be covered in 2/3 - 2/5 = 4/15 hours, which means that the remaining 20 miles should be covered at an average speed (distance)/(time) = 20/(4/15) = 75 miles per hour.
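
As a quick sanity check (not part of the original solution), the total-time arithmetic above can be reproduced with exact fractions in Python:

```python
from fractions import Fraction

# Target: an average of 60 mph over 40 miles fixes the total time allowed
total_time = Fraction(40, 60)                  # 2/3 hours

# The first 20 miles at 50 mph already used up
first_leg_time = Fraction(20, 50)              # 2/5 hours

# Time left for the remaining 20 miles
remaining_time = total_time - first_leg_time   # 4/15 hours

# Required speed = distance / time
required_speed = Fraction(20) / remaining_time

print(remaining_time)    # 4/15
print(required_speed)    # 75
```

Using `Fraction` avoids any floating-point rounding, so the 4/15 and 75 fall out exactly as in the hand calculation.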

13 Apr 2012, 21:00

eybrj2 wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

a) 65

b) 68

c) 70

d) 75

e) 80

Why not 70? (50 mph + x mph)/2 = 60 mph, so x = 70, since the first 20 miles and the other 20 miles are the same distance. What's wrong with my reasoning?

Let x = 20 miles.

x/50 + x/y = 2x/60 ⟹ 1/y = 1/30 − 1/50 = 1/75 ⟹ y = 75

Hence D.

P.S.: You are taking a direct (unweighted) average of the speeds, which is wrong. Instead, the time it takes to cover the two individual 20-mile stretches must equal the total time it takes to cover 40 miles at an average speed of 60 mph.

Hope this helps...!!
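
To see concretely why the direct average fails (a sketch I'm adding, not from the original post): a leg driven at a lower speed takes more time, so it weighs more heavily in the average.

```python
from fractions import Fraction

def avg_speed(legs):
    """Average speed over legs given as (distance, speed) pairs."""
    total_dist = sum(Fraction(d) for d, s in legs)
    total_time = sum(Fraction(d, s) for d, s in legs)
    return total_dist / total_time

# The naive answer: the arithmetic mean of 50 and 70 is 60 ...
naive = (50 + 70) / 2            # 60.0 -- but this ignores time weighting

# ... yet driving 20 mi at 50 mph then 20 mi at 70 mph does NOT average 60 mph:
print(avg_speed([(20, 50), (20, 70)]))   # 175/3, about 58.3 mph

# Whereas 75 mph on the second leg gives exactly 60 mph:
print(avg_speed([(20, 50), (20, 75)]))   # 60
```

The slow first leg takes 24 minutes of driving versus about 17 for the fast leg, which is why the equal-distance average lands below the midpoint of the two speeds.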

13 Apr 2012, 23:27

eybrj2 wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

a) 65

b) 68

c) 70

d) 75

e) 80

Why not 70? (50 mph + x mph)/2 = 60 mph, so x = 70, since the first 20 miles and the other 20 miles are the same distance. What's wrong with my reasoning?

avg speed = total distance/total time

t1 = 20/50 h = 0.4 h,  t2 = 20/x h

60 = 40/(0.4 + 20/x)

x= 75
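
The equation above can be rearranged and solved mechanically (my own check, not part of the original post): since 60 = 40/(t1 + 20/x), we get 20/x = 40/60 − t1.

```python
from fractions import Fraction

# 60 = 40/(t1 + 20/x) with t1 = 20/50 hours; rearrange for x:
t1 = Fraction(20, 50)
x = Fraction(20) / (Fraction(40, 60) - t1)   # x = 20 / (40/60 - t1)
print(x)   # 75
```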
Regards, Harsha

21 Oct 2013, 01:14

eybrj2 wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

A. 65 B. 68 C. 70 D. 75 E. 80

Why not 70? (50 mph + x mph)/2 = 60 mph, so x = 70, since the first 20 miles and the other 20 miles are the same distance. What's wrong with my reasoning?

Since the distance is the same in both stretches of the journey, the overall average speed is the harmonic mean of the two speeds: 2·50·x/(50 + x) = 60, which gives 100x = 3000 + 60x, so x = 75.
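
The harmonic-mean fact can be checked with a short snippet (a sketch I'm adding; the equal-distance assumption is essential here):

```python
from fractions import Fraction

# For two legs of EQUAL distance at speeds s1 and s2, the overall average
# speed is the harmonic mean 2*s1*s2/(s1 + s2), not the arithmetic mean.
def harmonic_mean(s1, s2):
    return Fraction(2 * s1 * s2, s1 + s2)

# Setting 2*50*x/(50 + x) = 60 gives 100x = 3000 + 60x, so 40x = 3000:
x = Fraction(3000, 40)
print(x)                        # 75
print(harmonic_mean(50, 75))    # 60
```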

12 Jan 2014, 12:36

sem wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

A. 65 mph B. 68 mph C. 70 mph D. 75 mph E. 80 mph

Basically, the first 20 miles took 24 minutes, so the second 20 miles need to take 16 minutes in order for the average to be 60 miles/h.

We need to pick an option from A-E (call it X) such that 120/(16·X) = 1, where X is the option value divided by 10 before it is multiplied by 16. The only value that works is D (16 × 7.5 = 120), and thus D is the answer.

Don't ask me how I came to solve it with this convoluted approach, but my brain was working on full gear and at the time it made perfect sense, even though I'm not that good at explaining the whole process in hindsight.

20 Jan 2014, 03:59

You can use the "normal" distance/rate approach.

First, divide the trip: For the whole trip he has to take 40 minutes (2/3 h) because he is driving at an average speed of 60m/h. So first 20 miles at 50 m/h means that he takes 24 min (2/5 h) for half the distance. This means that he has to take 16 minutes = 16/60 = 4 / 15 h for the last 20 miles. This gives us the equation:

distance = rate (x) × time
20 = x × 4/15
x = 20 × 15/4 = 300/4 = 75

21 Jan 2014, 16:16

aeglorre wrote:

sem wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

A. 65 mph B. 68 mph C. 70 mph D. 75 mph E. 80 mph

Basically, the first 20 miles took 24 minutes, so the second 20 miles need to take 16 minutes in order for the average to be 60miles/h..

We need to pick an option from A-E (which we call X), that in the denominator makes 120/(16*x) = 1. Note that the value needs to be divided by 10 before it is multiplied by 16.. The only value that works is D (16 * 7.5 = 120), and thus D is the answer.

Don't even ask me how I came to solve it with this convoluted mumbo jumbo but my brain worked on full gear and at the time that I did this it made perfect sense, even though Im not that good at explaining the whole process in hindsight.

why are you setting your problem to 40 miles in 40 mins? Where did 120 come from?

If you can't explain ANYTHING, why write an explanation?

21 Jan 2014, 20:33

TroyfontaineMacon wrote:


why are you setting your problem to 40 miles in 40 mins? Where did 120 come from?

It's an instinctive method you often use once you learn to play with numbers in your head. You want the average speed to be 60 miles/hr, i.e. you need to cover 60 miles in 60 mins, which means you must cover 40 miles in 40 mins.

The first 20 miles were covered at an average speed of 50 mph, i.e. the time taken (in mins) to cover the first 20 miles is 20/50 × 60 = 24 mins.

You need to cover 40 miles in 40 mins total, and you have already taken 24 mins during the first 20 miles. This means you need to speed up now and cover the rest of the 20 miles in the leftover 16 mins. What will be your speed in mph if you cover 20 miles in 16/60 hrs? Speed = 20/(16/60) = 75 mph.
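
The mental method above translates directly into minute arithmetic (a sketch I'm adding as a check):

```python
# Working entirely in minutes and miles, as in the mental method above:
total_minutes = 40                    # 40 miles in 40 min <=> 60 mph average
first_leg_minutes = 20 * 60 // 50     # 24 min for the first 20 miles at 50 mph
minutes_left = total_minutes - first_leg_minutes   # 16 min for the last 20 miles

# Speed needed: 20 miles in 16 minutes, converted to miles per hour
speed = 20 * 60 / minutes_left        # 1200/16 = 75.0 mph
print(first_leg_minutes, minutes_left, speed)   # 24 16 75.0
```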
Karishma, Veritas Prep GMAT Instructor

23 Jan 2014, 04:37

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

A. 65 B. 68 C. 70 D. 75 E. 80

Let \(x\) be the average speed during the remaining 20 miles.

Total trip time = time to cover the first 20 miles + time to cover the remaining 20 miles. Since the whole trip must be covered at an average speed of 60 mph, the total time is 40/60 = 2/3 hours, so:

\(\frac{20}{50} + \frac{20}{x} = \frac{2}{3}\), which gives \(\frac{20}{x} = \frac{2}{3} - \frac{2}{5} = \frac{4}{15}\), so \(x = 75\).

17 Jun 2015, 07:15

eybrj2 wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

Why not 70? (50 mph + x mph)/2 = 60 mph, so x = 70, since the first 20 miles and the other 20 miles are the same distance. What's wrong with my reasoning?

Can we use this formula:

60 = 20/50 + 20/x? If yes, how do we solve for x?

17 Jun 2015, 07:29

LaxAvenger wrote:

eybrj2 wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

Why not 70? (50 mph + x mph)/2 = 60 mph, so x = 70, since the first 20 miles and the other 20 miles are the same distance. What's wrong with my reasoning?

Can we use this formula:

60 = 20/50 + 20/x? If yes, how do we solve for x?

No, you can't use that formula. The only valid principle for calculating average speed is

Average Speed = Total Distance / Total Time

Solving Equation for you:

Here, Average Speed = 60, Total Distance = 40, and Total Time = (time taken travelling the 1st 20 miles) + (time taken travelling the 2nd 20 miles) = 20/50 + 20/x [because Time = Distance/Speed]. So the equation is 60 = 40/(20/50 + 20/x).

GMATinsight (Bhoopendra Singh and Dr. Sushma Jha)

17 Jun 2015, 08:30

What about if we use this equation (from the first answer):

60 = 40/(0.4 + 20/x)

x = 75

?

17 Jun 2015, 08:38

LaxAvenger wrote:

What about if we use this equation (from the first answer):

60 = 40/(0.4 + 20/x)

x = 75

?

This equation, 60 = 40/(0.4 + 20/x), is the same as the equation I mentioned in my solution,

i.e. 60 = 40/(20/50 + 20/x).

GMATinsight

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

average rate = (distance 1 + distance 2)/(time 1 + time 2)

where average rate = 60, distance 1 = distance 2 = 20, time 1 = distance 1/rate 1 = 20/50 = 2/5, and time 2 = distance 2/rate 2 = 20/r (where r is the average speed of the remaining 20 miles).

Let’s now determine r:

60 = (20 + 20)/(2/5 + 20/r)

60 = 40/(2r/5r + 100/5r)

60 = 40/[(2r + 100)/5r]

60 = 200r/(2r + 100)

60(2r + 100) = 200r

120r + 6000 = 200r

6000 = 80r

r = 6000/80 = 600/8 = 75

Answer: D
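
The linear equation in the algebra above can be verified with exact fractions (a check I'm adding, not part of the original solution):

```python
from fractions import Fraction

# Solve 60 = 200r/(2r + 100), i.e. 60*(2r + 100) = 200r:
# 120r + 6000 = 200r  =>  80r = 6000  =>  r = 75
r = Fraction(6000, 80)
print(r)   # 75

# Check by plugging r back into average rate = total distance / total time:
avg = Fraction(40) / (Fraction(2, 5) + Fraction(20) / r)
print(avg)   # 60
```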
Jeffery Miller, Head of GMAT Instruction


22 Mar 2018, 08:05

eybrj2 wrote:

A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)

A. 65 B. 68 C. 70 D. 75 E. 80

The total distance is 40 miles, and we want the average speed to be 60 miles per hour.
Average speed = (total distance)/(total time)
So we get: 60 = (40 miles)/(total time)
Solving gives: total time = 2/3 hours
So the TIME for the ENTIRE 40-mile trip needs to be 2/3 hours.

The driver completed the first 20 miles of the 40-mile trip at an average speed of 50 miles per hour. How much time was spent on this FIRST PART of the trip?
time = distance/speed
So, time = 20/50 = 2/5 hours

The ENTIRE trip needs to be 2/3 hours, and the FIRST PART of the trip took 2/5 hours.

2/3 hours - 2/5 hours = 10/15 hours - 6/15 hours = 4/15 hours
So the SECOND PART of the trip needs to take 4/15 hours.

The SECOND PART of the trip is 20 miles, and the time is 4/15 hours.
Speed = distance/time
So, speed = 20/(4/15) = (20)(15/4) = 75 mph

Answer: D

Cheers, Brent
