smartass666 wrote:

Given that the mean of Set A is 10, what is the range of two standard deviations above and below the mean?

(1) One standard deviation above and below the mean ranges from 7 to 13.

(2) The median of set A is 11.

We are given that the mean of set A is 10, and we need to determine the range of two standard deviations above and below the mean. If we let s = the standard deviation, we need to calculate:

(10 + 2s) - (10 - 2s)

If we simplify this expression, we have 4s. That is, if we can determine the standard deviation s, we can determine the value of 4s.

Statement One Alone:

One standard deviation above and below the mean ranges from 7 to 13.

Using the information in statement one, we can say that 10 - s = 7 and 10 + s = 13. Solving for s in either equation, we have s = 3. Thus, 4s = 12.
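As a quick sanity check, the arithmetic in statement one can be verified with a short Python sketch (the variable names here are illustrative, not part of the problem):

```python
# Statement One: one standard deviation above and below the mean spans 7 to 13
mean = 10
lower, upper = 7, 13  # mean - s and mean + s

s_from_lower = mean - lower   # 10 - 7 = 3
s_from_upper = upper - mean   # 13 - 10 = 3
assert s_from_lower == s_from_upper  # both equations give the same s

s = s_from_lower
# Range of two standard deviations above and below the mean: (10 + 2s) - (10 - 2s) = 4s
range_2sd = (mean + 2 * s) - (mean - 2 * s)
print(range_2sd)  # 12
```

Both equations yield s = 3, so the requested range is 4s = 12.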

Statement one alone is sufficient to answer the question. We can eliminate answer choices B, C, and E.

Statement Two Alone:

The median of set A is 11.

Knowing the median tells us nothing about how spread out the values of the set are, so it is not enough information to determine the standard deviation. Statement two alone is not sufficient to answer the question.

Answer: A

_________________

Scott Woodbury-Stewart

Founder and CEO

GMAT Quant Self-Study Course

500+ lessons | 3,000+ practice problems | 800+ HD solutions