Bunuel
If x and y are the standard deviations of two different data sets, is x > y?
(1) x is the standard deviation of the data set 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50.
(2) y is the standard deviation of the data set 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60.
Hi,
Standard deviation (SD) is a measure of the dispersion of a data set from its mean. It is the square root of the variance, so the minimum possible value of an SD is 0 (which occurs when all the data points are equal).
For more, please refer to:
https://gmatclub.com/forum/math-standar ... 87905.html

Now, let's solve the question.
If x and y are the standard deviations of two different data sets, is x > y?
(1) x is the standard deviation of the data set 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50.
Since all the data points are the same (i.e. 50), x (the SD) = 0. A standard deviation can never be negative, so y ≥ 0 = x, which means x can never be greater than y. We get a definite "No" answer to the question. Sufficient.
(2) y is the standard deviation of the data set 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60.
Since these data points are not all equal, y is definitely greater than 0, but we have no information about the value of x. Hence, we can't determine whether x is greater than y. Insufficient.
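If you'd like to check the two statements numerically, here is a quick sketch using Python's standard `statistics` module (the variable names `set_1` and `set_2` are just labels chosen for this illustration):

```python
import statistics

# Statement (1): all eleven data points equal 50, so the SD is 0.
set_1 = [50] * 11
x = statistics.pstdev(set_1)  # population standard deviation

# Statement (2): evenly spaced values 40, 42, ..., 60 (mean 50),
# so the SD is strictly positive.
set_2 = list(range(40, 61, 2))
y = statistics.pstdev(set_2)

print(x)  # 0.0
print(y)  # ~6.32, i.e. sqrt(40)
```

For set (2), the squared deviations from the mean 50 are 2·(100 + 64 + 36 + 16 + 4) = 440, so the variance is 440/11 = 40 and the SD is √40 ≈ 6.32.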
Answer: (A).
Thanks.
Hi..
Here we seem to be assuming that the value of y is greater. Without assessing the information about y, how can we conclude anything about whether x > y?
Please help..