Bunuel wrote:
Set J consists of the terms {a, b, c, d, e}, where e > d > c > b > a > 1. Which of the following operations would decrease the standard deviation of Set J?
A. Multiply each term by e/d
B. Divide each term by b/c
C. Multiply each term by −1
D. Divide each term by d/e
E. Multiply each term by c/e
Kudos for a correct solution.
VERITAS PREP OFFICIAL SOLUTION:
Standard deviation essentially measures how far the terms in a set deviate from the mean. Because of that, if you can tell whether the new terms are more spread out or more condensed, you can tell what will happen to the standard deviation in comparison with the "old" set. If the numbers become more spread out - for example, if they're multiplied by a number with an absolute value greater than 1 - the standard deviation will increase. And if the numbers move closer together - for example, if they're divided by a number with an absolute value greater than 1, or multiplied by one with an absolute value between 0 and 1 - the standard deviation will decrease.
For a quick demonstration, consider the set 1, 2, 3, 4, 5. If you multiply all those numbers by 2, the set becomes more spread out: 2, 4, 6, 8, 10. But if you divide each of them by 2, they get closer together: 0.5, 1, 1.5, 2, 2.5.
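You can check that doubling the terms exactly doubles the spread (and halving them halves it) with a quick sketch using Python's statistics module - the set 1 through 5 here is just the demonstration set above, not part of the problem:

```python
from statistics import pstdev  # population standard deviation

nums = [1, 2, 3, 4, 5]
print(pstdev(nums))                   # ≈ 1.414
print(pstdev([x * 2 for x in nums]))  # ≈ 2.828, spread doubles
print(pstdev([x / 2 for x in nums]))  # ≈ 0.707, spread halves
```

Multiplying every term by a constant k multiplies the standard deviation by |k|, which is the rule the whole solution relies on.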
So from these answer choices:
Multiply each term by e/d - since this fraction is greater than 1, this will increase the dispersion and the standard deviation.
Divide each term by b/c - since this fraction is less than 1 (but greater than 0), dividing by it will again expand the set.
Multiply each term by −1 - changing all the terms from positive to negative doesn't change the dispersion, so there will be no change.
Divide each term by d/e - like answer choice B, this will lead to an expanded set.
Multiply each term by c/e - because c is less than e, this fraction is between 0 and 1, meaning that multiplying by it has the effect of bringing the terms closer together. This reduces the standard deviation, so the answer is E.