r019h wrote:

The price of a microchip declines by 67% every 6 months. At this rate approx. how many years will it take for an $81 chip to reach $1?

what's the fastest way to do this problem?

this is how i did it-

81 - 0.67 x 81 ≈ 27

27 - 0.67 x 27 ≈ 9

9 - 0.67 x 9 ≈ 3

3 - 0.67 x 3 ≈ 1

hence, 4 drops x 6 months = 24 months = 2 yrs

is there a faster way??
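The four subtraction steps above can be sanity-checked with a short loop (a Python sketch; it uses the fact that a ~67% drop leaves about one third of the price):

```python
price = 81.0
months = 0
while price > 1:      # repeat until the chip reaches $1
    price /= 3        # a ~67% drop leaves about 1/3 of the price
    months += 6       # each drop takes 6 months
print(months)         # 24 months, i.e. 2 years
```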

not sure whether you like this:

81 - (81 x 67%) ≈ 81 - (81 x 2/3) = 81/3 = 27

1 = 81 / (3x3x3x3) = 81 / 3^4

so the time = 4 x 6 months = 24 months or 2 years.
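The same count of factor-of-3 drops falls out of a logarithm, which also works when the starting price isn't such a clean power of 3 (a Python sketch, not part of the original solution):

```python
import math

periods = math.log(81, 3)   # number of factor-of-3 drops to go from $81 to $1
years = periods * 6 / 12    # each drop takes 6 months
print(periods, years)       # about 4 periods, i.e. about 2 years
```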