When David drove from his home to his parents' home, was his average speed between 35 miles per hour and 50 miles per hour?
(1) To the nearest 100 miles, the distance that David drove from his home to his parents' home was 300 miles.
(2) To the nearest hour, it took David 8 hours to drive from his home to his parents' home.
This question hinges on rounding rules.
To get the average speed, we need both distance and time, so neither statement alone is sufficient.
Using both together, the distance was 250 <= d < 350 miles and the time taken was 7.5 <= t < 8.5 hours.
If the distance covered were 300 miles and the time taken were 8 hrs, the speed would be 300/8 = 37.5 mph. That gives a "yes", but it is very close to the lower limit of the range we are considering, 35 mph.
Hence, check the extreme: if the distance were 250 miles and the time were 8.5 hrs (strictly, it could be 8.499999 hrs), the speed would be much lower.
Speed = 250/8.5 = 500/17 ≈ 29.4 mph, i.e. twenty-something (because 17*30 = 510 > 500), so the speed is not even in the 30s. Even with the time a tiny bit less than 8.5 hrs, the speed stays well below 35 mph.
Hence a speed less than 35 mph is possible ("no"), and a speed between 35 and 50 mph is also possible ("yes"). The two statements together are not sufficient.
Answer (E)
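
For anyone who wants to double-check the boundary arithmetic, here is a minimal Python sketch (the helper name `speed` and the sample time 8.499999 are illustrative choices, not part of the problem):

```python
def speed(distance_miles, time_hours):
    """Average speed in miles per hour."""
    return distance_miles / time_hours

# Rounding to the nearest 100 miles: 250 <= d < 350
# Rounding to the nearest hour:      7.5 <= t < 8.5

# "Yes" case: the central values give a speed between 35 and 50.
print(speed(300, 8))          # 37.5 mph

# "No" case: minimum distance with the time just under 8.5 hours.
print(speed(250, 8.499999))   # ~29.41 mph, below 35

# Both answers are consistent with the statements, so even taken
# together they are not sufficient: answer (E).
```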