I posted an explanation of why the first statement is sufficient a while back on another forum, so I'll just paste that here:
There are quite a few ways to look at this problem, though it's actually based on an idea we encounter all the time in day-to-day life. I'll suggest two approaches:
If you know the concept of 'odds' from probability (and from real life), then the first statement is clearly sufficient here. If you say that "the odds are 21 to 5 in favor of picking a consonant when you pick a random letter from the alphabet", that is just a ratio of the number of ways to pick a consonant to the number of ways to not pick a consonant. From those odds, we can see that the probability of picking a consonant is 21/26, and the probability of picking a vowel is 5/26.
That's what statement 1 is telling us: the odds of getting a red marble are better than the odds of getting a white marble (the fraction on the left is the ratio of red marbles to non-red marbles, while the fraction on the right is the ratio of white marbles to non-white marbles). If the odds are higher, the probability must be higher.
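To make the odds-to-probability conversion concrete, here is a small Python sketch. The marble counts (r = 7, w = 3, b = 5) are made up purely for illustration; the problem itself never gives specific numbers.

```python
from fractions import Fraction

def odds_to_probability(favorable, unfavorable):
    """Convert odds of favorable:unfavorable into a probability."""
    return Fraction(favorable, favorable + unfavorable)

# The alphabet example: odds of 21 to 5 in favor of picking a consonant
print(odds_to_probability(21, 5))  # prints 21/26

# Hypothetical marble counts (not given in the problem): r=7, w=3, b=5
r, w, b = 7, 3, 5
odds_red = Fraction(r, b + w)    # ratio of red marbles to non-red marbles
odds_white = Fraction(w, b + r)  # ratio of white marbles to non-white marbles
assert odds_red > odds_white     # statement 1 holds for these counts
# ...and, as expected, the probability of red exceeds the probability of white:
assert Fraction(r, r + w + b) > Fraction(w, r + w + b)
```

Using `Fraction` rather than floats keeps the comparisons exact, which mirrors how we'd compare the ratios by hand.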
Alternatively, you could look at this abstractly. We don't need to rewrite the fractions at all. We know that:
r/(b + w) > w/(b + r)
and all of the letters represent positive quantities. We also know that if one fraction has *both* a larger numerator and a smaller denominator than another (all quantities positive), it must be larger in value. Looking at the above inequality, it's impossible for w to be greater than r: if it were, the fraction on the right would have both a larger numerator and a smaller denominator than the fraction on the left, so the right side would be larger, contradicting the inequality. Nor can r and w be equal, since then the two fractions would be identical. So r must be greater than w, which is all we need to know here.
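If you want to sanity-check the abstract argument, a brute-force sweep over small positive counts does the job. This is only an illustration over a finite grid, not a proof, but it confirms that the inequality forces r > w in every case checked:

```python
from fractions import Fraction
from itertools import product

# Check all positive counts r, w, b from 1 to 20: whenever
# r/(b+w) > w/(b+r) holds, r > w should hold as well.
for r, w, b in product(range(1, 21), repeat=3):
    if Fraction(r, b + w) > Fraction(w, b + r):
        assert r > w, (r, w, b)
print("statement 1 forces r > w for all counts checked")
```

Algebraically the same conclusion follows from cross-multiplying: r(b + r) > w(b + w) rearranges to (r - w)(b + r + w) > 0, and since b + r + w is positive, r - w must be positive too.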