dpo28 wrote:
On a particular trip, Jane drove x miles at 30 miles per hour and y miles at 40 miles per hour. Was her average speed for the trip greater than 35 miles per hour?
(1) y > x + 60
(2) y/x > 4/3
\(average \ speed = \frac{total \ distance}{total \ time}=\frac{x+y}{\frac{x}{30}+\frac{y}{40}}=\frac{120(x+y)}{4x+3y}\)
Thus the question asks whether \(\frac{120(x+y)}{4x+3y}>35\).
After simplifying, the question becomes: is \(\frac{y}{x}>\frac{4}{3}\)?
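The algebra, step by step (since x and y are distances, both are positive, so multiplying by 4x + 3y keeps the inequality direction):

\(\frac{120(x+y)}{4x+3y}>35\)

\(120x+120y>140x+105y\)

\(15y>20x\)

\(\frac{y}{x}>\frac{4}{3}\)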
(1) y > x + 60.
If x = 1 and y = 100 (y > x + 60 holds), then y/x > 4/3, so the answer is YES.
If x = 1000 and y = 1200 (y > x + 60 also holds), then y/x = 1.2 < 4/3, so the answer is NO.
Not sufficient.
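If you want to verify the two cases numerically, here is a minimal Python sketch (the helper name avg_speed is just for illustration, not part of the original solution):

```python
def avg_speed(x, y):
    """Average speed in mph for x miles at 30 mph plus y miles at 40 mph."""
    return (x + y) / (x / 30 + y / 40)

# Both cases satisfy statement (1): y > x + 60.
print(avg_speed(1, 100))      # ~39.9 mph -> greater than 35 (YES)
print(avg_speed(1000, 1200))  # ~34.7 mph -> less than 35 (NO)
```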
(2) y/x > 4/3. This is exactly the rephrased question, so it directly gives an answer. Sufficient.
Answer: B.
Hope it's clear.
Alternatively, notice that 35 miles per hour is halfway between 30 and 40 miles per hour. For the average speed to be exactly 35 miles per hour, the time spent covering x miles (x/30 hours) must equal the time spent covering y miles (y/40 hours). For the average speed to be greater than 35 miles per hour, the time spent at the slower speed (x/30 hours) must be less than the time spent at the faster speed (y/40 hours):
x/30 < y/40.
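Cross-multiplying (all quantities are positive) shows this is the same condition as before:

\(\frac{x}{30}<\frac{y}{40} \ \Rightarrow \ 40x<30y \ \Rightarrow \ \frac{y}{x}>\frac{4}{3}\)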
Then continue as above.