If the mean of a data set is 75 and the standard deviation is 10, what is the range of scores that fall within one standard deviation of the mean?
What I meant is that the range of scores within one standard deviation of the mean of 75 falls between 65 and 85.
Got the 85 by adding 10 to 75.
Got the 65 by subtracting 10 from 75.
So scores from 65 to 85 fall within one SD of the mean of 75.
If you want the range of scores that fall within 2 SD of the mean, you:
add 20 to 75, which results in 95;
subtract 20 from 75, which results in 55.
So the range of scores that falls within 2 SD of the mean is 55 to 95.
So, you just need to add the SD to the mean and subtract the SD from the mean to get the range of scores that falls within one standard deviation (and use 2×SD for two standard deviations, and so on).
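The arithmetic above can be sketched in a few lines of Python (the function name `sd_range` is my own, just for illustration):

```python
def sd_range(mean, sd, k=1):
    """Return the (low, high) range of scores within k standard deviations of the mean."""
    return mean - k * sd, mean + k * sd

print(sd_range(75, 10))     # within 1 SD -> (65, 85)
print(sd_range(75, 10, 2))  # within 2 SD -> (55, 95)
```

The same two operations (mean minus k·SD, mean plus k·SD) cover any number of standard deviations.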
Hope this helps,
Sky is the limit