enigma123
One computer can upload 100 megabytes of data in 6 seconds. Two computers, including this one, working together can upload 1300 megabytes of data in 42 seconds. How long would it take the second computer, working on its own, to upload 100 megabytes of data?
(A) 6
(B) 7
(C) 9
(D) 11
(E) 13
This is how I am trying to solve this question:
Let's say A is computer 1 and B is computer 2.
Computer A's work in 1 second: 100/6
Computer A + Computer B's work in 1 second: 1300/42
1/A + 1/B = (A + B)/(AB)
We have to find 1/B, but I'm struggling to complete this question.
Well, you have to understand the fundamental difference between time and rate.
TIME is not an additive quantity, but RATE is.
Meaning you can add rates directly to get a valid new rate. (This never works with times; remember those tricky average-speed problems?)
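A quick toy illustration (numbers made up here, not from the problem): a machine that finishes a job in 2 seconds has rate 1/2 job per second, one that finishes in 3 seconds has rate 1/3 job per second, and together their rate is 1/2 + 1/3 = 5/6 job per second, so the combined time is 6/5 seconds. Adding or averaging the times 2 and 3 gives you nothing valid. A small sketch with exact fractions:

```python
from fractions import Fraction

# Toy example: rates add directly, times do not.
t1, t2 = Fraction(2), Fraction(3)    # seconds per job for each machine
r1, r2 = 1 / t1, 1 / t2              # rates in jobs per second
combined_rate = r1 + r2              # 1/2 + 1/3 = 5/6 job per second
combined_time = 1 / combined_rate    # 6/5 seconds, not (2 + 3) or (2 + 3)/2
print(combined_rate, combined_time)  # prints: 5/6 6/5
```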
So here the first computer's rate is R1 = 100/6 ------eq1
and the rate of both computers together is R1 + R2 = 1300/42 ------eq2
So if you subtract eq 1 from eq 2, you will get the rate of the second computer alone:
1300/42 - 100/6 = 1300/42 - 700/42 = 600/42 = 100/7
R2 is 100/7
R2*T = work
R2*T = 100
(100/7)*T = 100
T = 7
ANSWER IS B
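If you want to sanity-check the arithmetic, here is a small verification sketch of my own (not part of the solution above), using Python's exact fractions so nothing is lost to rounding:

```python
from fractions import Fraction

# Rates in megabytes per second, straight from the problem statement.
r1 = Fraction(100, 6)            # computer 1 alone: 100 MB in 6 s
combined = Fraction(1300, 42)    # both computers together: 1300 MB in 42 s

r2 = combined - r1               # 1300/42 - 700/42 = 600/42 = 100/7 MB/s
t = Fraction(100) / r2           # time for computer 2 to upload 100 MB alone
print(r2, t)                     # prints: 100/7 7  -> answer (B)
```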