A certain characteristic in a large population has a distribution that is symmetric about the mean m. If 68 percent of the distribution lies within one standard deviation d of the mean, what percent of the distribution is less than m + d ?
This is easiest to solve by picturing a bell-curve histogram. Here m corresponds to µ in the Gaussian (normal) distribution, so exactly 50% of the population lies below m.
So, if 68% lies within one standard deviation of the mean, then by symmetry half of that, 68/2 = 34%, lies on each side of m. The portion of the distribution less than m + d is therefore the 50% below the mean plus the 34% between m and m + d: 50 + 34 = 84%. This means that 84% of the values are below m + d.
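If you want to sanity-check the arithmetic, here is a small sketch that assumes the distribution is exactly normal (the question only says symmetric with 68% within one standard deviation, which is the normal approximation the GMAT has in mind). The `normal_cdf` helper is my own name, built from the standard error function:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Fraction of a normal distribution that lies below x."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# Percent of the distribution below m + d, i.e. one standard
# deviation above the mean (work in standard units: m = 0, d = 1).
pct_below = normal_cdf(1) * 100
print(round(pct_below))  # ≈ 84
```

The exact normal value is about 84.13%, which rounds to the 84% the 50 + 34 reasoning gives.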
Like I said, drawing it on a bell-curve histogram makes it much easier to fully "get" how this works; alternatively, you could apply GMAT percentile jargon/theory to it.