How long did it take Carlos to drive nonstop from his home to Atlanta, Georgia?

(1) Carlos's average speed for the trip was 55 miles per hour.

(2) If Carlos's average speed for the trip had been 20 percent faster, the trip would've taken 3 hours.

How can the answer be B?

This is how I am trying to do it.

Statement 1

Average speed = Total Distance/Total Time

We don't know the distance and therefore cannot calculate the time.

Statement 2

Let the initial speed be s

New speed = s + 0.2(s) = 1.2 s

1.2s = D/T

Again, as we don't know the distance or the speed, we cannot calculate the time.

But by combining the two statements we can calculate the time. So where have I gone wrong?
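One way to probe my own setup is a quick numeric sketch (the distances below are made-up trial values, not given data): plug Statement 2's relation 1.2s = D/3 into the trial distance, back out the actual speed, and see whether the actual trip time depends on which distance was chosen.

```python
# Numeric check of the Statement 2 setup.
# Statement 2: at 20% faster speed, the trip takes 3 hours,
# so for a trial distance D: 1.2 * s = D / 3, i.e. s = D / 3.6.

for D in (100.0, 220.0, 550.0):   # arbitrary made-up distances
    s = D / 3.6                    # actual speed implied by Statement 2
    t = D / s                      # actual trip time at that speed
    print(D, t)                    # t comes out the same for every D
```

Running this shows the computed time is identical for every trial distance, which suggests the distance is not actually needed here.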

_________________

Best Regards,

E.

MGMAT 1 --> 530

MGMAT 2 --> 640

MGMAT 3 --> 610