in a general sense, standard deviation is "on average" how far the numbers deviate from the mean. any time there is an "on average", the number of values plays a part.
let's say there is a set of 1,000,000 numbers. the mean is 5 and 999,998 of the numbers are in fact 5, but one of the numbers is -50 and another is 60. the range is gigantic (110), however, the SD will be a really tiny number way below 1 and near 0 (about 0.08) because 999,998 of the values don't deviate at all from the mean. (check out the equation for SD: sqrt(sum of squared deviations / N) — dividing by N is where the number of values comes in)
let's say there is another set of just 2 numbers, 0 and 10. the mean is 5, the range is much smaller (10), but the SD is a whopping 5.
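a quick sketch of the two sets above, using python's standard library (`statistics.pstdev` is the population SD, i.e. dividing by N):

```python
import statistics

# set 1: 999,998 copies of 5, plus one -50 and one 60 (mean stays exactly 5)
big_set = [5] * 999998 + [-50, 60]
print(statistics.mean(big_set))                # 5
print(max(big_set) - min(big_set))             # range: 110
print(statistics.pstdev(big_set))              # SD: ~0.078, near 0

# set 2: just 0 and 10
small_set = [0, 10]
print(statistics.mean(small_set))              # 5
print(max(small_set) - min(small_set))         # range: 10
print(statistics.pstdev(small_set))            # SD: 5.0
```

same mean both times, but the range and the SD tell opposite stories about spread.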
so when talking about range: it only tells you about two values, the min and the max. a set can have a billion numbers where only 2 are different, but very different.
--- always think extreme and limits ---