Airplanes A and B traveled the same 360-mile route. If airplane A took 2 hours and airplane B traveled at an average speed that was \(\frac{1}{3}\) slower than the average speed of airplane A, how many hours did it take airplane B to travel the route?

(A) 2

(B) \(2\frac{1}{3}\)

(C) \(2\frac{1}{2}\)

(D) \(2\frac{2}{3}\)

(E) 3

I agree with the OA.

However, something that I don't understand is why I cannot analyze it in this way:

The question says that airplane B traveled at an average speed that was \(\frac{1}{3}\) slower than the average speed of airplane A, right?

The OE says that, based on this info, airplane A travels at \(360/2 = 180\) mph, so airplane B travels at 120 mph (\(\frac{1}{3}\) slower).
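Spelling out the OE's reading step by step, "\(\frac{1}{3}\) slower" is applied to A's speed:

\[
v_A = \frac{360 \text{ mi}}{2 \text{ h}} = 180 \text{ mph}, \qquad
v_B = 180 - \frac{1}{3}\cdot 180 = 120 \text{ mph}, \qquad
t_B = \frac{360}{120} = 3 \text{ h}
\]

which matches answer choice (E).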

Why can't "\(\frac{1}{3}\) slower" mean this?

A ---- 180 miles / 1 hour

B ---- 180 miles /[(4/3)*1hour]

The answer would be different.
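For concreteness, under that alternative reading B takes \(\frac{4}{3}\) as much time as A over any stretch of the route, so:

\[
t_B = \frac{4}{3}\cdot t_A = \frac{4}{3}\cdot 2 = \frac{8}{3} = 2\tfrac{2}{3} \text{ h}
\]

which would point to answer choice (D) instead of (E).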

Please share your comments.

Source:

http://www.gmathacks.com

Or recall that, since the distance is constant, B will need \(\frac{3}{2}\) as much time as A to cover the same distance.
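That ratio shortcut, written out: for a fixed distance, time is inversely proportional to speed, so

\[
\frac{t_B}{t_A} = \frac{v_A}{v_B} = \frac{180}{120} = \frac{3}{2}
\quad\Rightarrow\quad
t_B = \frac{3}{2}\cdot 2 = 3 \text{ h}
\]

No need to compute either speed explicitly once the \(\frac{2}{3}\) speed ratio is known.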