Saurav Arora wrote:
I think this is a high-quality question, but the explanation isn't clear enough; please elaborate. Consider the numbers from 1 to 1000 written as follows:
1. 0001
2. 0002
3. 0003
...
1000. 1000
We still have 1000 numbers, but we used 4 digits per number, hence a total of 4*1000 = 4000 digits.
4000/10 = 400, so each digit should appear 400 times.
What's wrong with the above inference?
Thanks.
It doesn't work because with 4 digits the symmetric range is 0000 to 9999; by stopping at 1000 you aren't accounting for 1001 to 9999, so the digits are no longer evenly distributed (dividing by 10 only works when every digit string of that length appears exactly once). The concept applies as follows:
1 digit (0 to 9): 10 numbers, 1 digit per number = 10 digits --> 10/(10 distinct digits) = each digit used 1 time
2 digits (00 to 99): 10*10=100 numbers, 2 digits per number = 200 digits --> 200/(10 distinct digits) = each digit used 20 times
3 digits (000 to 999): 10*10*10=1000 numbers, 3 digits per number = 3000 digits --> 3000/(10 distinct digits) = each digit used 300 times
4 digits (0000 to 9999): 10*10*10*10=10000 numbers, 4 digits per number = 40000 digits --> 40000/(10 distinct digits) = each digit used 4000 times
...
n digits (00...0 to 99...9): 10^n numbers, n digits per number = n*10^n digits --> (n*10^n)/(10 distinct digits) = each digit used n*10^(n-1) times
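A quick brute-force check (a sketch in Python; `digit_counts` is just a helper name I made up) confirms both points: the full range 0000 to 9999 is uniform, while the question's range 0001 to 1000 is skewed by the padding zeros:

```python
from collections import Counter

def digit_counts(numbers, width):
    """Count each digit's occurrences when numbers are zero-padded to `width`."""
    counts = Counter()
    for k in numbers:
        counts.update(str(k).zfill(width))
    return counts

# Full 4-digit range 0000..9999: uniform, each digit appears 4 * 10**3 = 4000 times.
full = digit_counts(range(10**4), 4)
assert all(full[d] == 4000 for d in "0123456789")

# The question's range, 0001..1000: NOT uniform. The leading zeros
# added by padding inflate the count of 0 well past 400.
partial = digit_counts(range(1, 1001), 4)
print(partial["0"], partial["1"], partial["2"])  # 1299 301 300
```

So in 0001 to 1000 the digit 0 appears 1299 times, 1 appears 301 times, and 2 through 9 appear 300 times each (total 1299 + 301 + 8*300 = 4000 digits), which is why 4000/10 = 400 is the wrong conclusion.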