Imagine that you have two jobs. At one job you work two days a week at $60 an hour, and at the other you work three days a week at $40 an hour. Each week you'll earn a certain total, and you can also work out the average amount you make per hour over the whole week (somewhere between $40 and $60). After two weeks, that average will be the same, as long as your hours stay the same. After three weeks, same thing. You won't suddenly start making more or less money on average, no matter how many weeks you work in total, unless your hours change.
In this analogy, the $60/hour job corresponds to walking faster, and the $40/hour job corresponds to walking slower. As long as the amount of time you spend walking faster and the amount of time you spend walking slower stay in the same proportion to each other, your average speed will always stay the same. It doesn't matter whether you walk for a total of one minute, ten minutes, or a week. What matters is that out of every minute (or hour, or whatever), you spend a fixed part of it walking more slowly on average, and a fixed part of it walking more quickly.
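To see this concretely, here's a small sketch in plain Python (the function name is mine, and I'm assuming each work day has the same number of hours, so day counts work as weights). It shows that scaling both times by the same factor leaves the average unchanged:

```python
# Weighted average of two rates: it depends only on the proportion
# of time spent at each rate, not on the total time.

def average_rate(rate_a, time_a, rate_b, time_b):
    """Total earnings (or distance) divided by total time."""
    return (rate_a * time_a + rate_b * time_b) / (time_a + time_b)

# Two days a week at $60/hour, three days a week at $40/hour.
one_week = average_rate(60, 2, 40, 3)

# Ten weeks: both time weights scale by 10, so nothing changes.
ten_weeks = average_rate(60, 20, 40, 30)

print(one_week, ten_weeks)  # 48.0 48.0
```

Multiplying both times by the same factor multiplies the numerator and denominator alike, which is exactly why the number of weeks (or minutes of walking) drops out.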
A second aspect of this problem that's hard to grasp intuitively is why the result comes out closer to 4mph than to 6mph. It seems like it should be 5mph. The reason it isn't is that you actually spend significantly more time walking at 4mph, so the 4mph counts for more in the average. To take it to an extreme, imagine taking a 3,000 mile trip in an airplane going one way at 500mph, then getting off the plane and returning on foot at 3mph. You couldn't say that you averaged (500+3)/2 = 251.5 mph, because that would be a very fast average speed. In fact, your average speed would be very slow, which makes intuitive sense, since the entire trip would clearly take a very long time.
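The airplane example can be checked directly: average speed is total distance divided by total time, and for two legs of equal distance that works out to the harmonic mean of the two speeds. A quick sketch using the numbers above (variable names are mine):

```python
# Average speed = total distance / total time.
distance = 3000   # miles each way

speed_out = 500   # mph, by plane
speed_back = 3    # mph, on foot

total_distance = 2 * distance
total_time = distance / speed_out + distance / speed_back  # 6 + 1000 hours

plane_avg = total_distance / total_time
print(round(plane_avg, 2))  # about 5.96 mph, dominated by the slow leg

# Same formula for the original walk, 4mph out and 6mph back
# over equal distances (the harmonic mean of 4 and 6):
walk_avg = 2 / (1/4 + 1/6)
print(walk_avg)  # 4.8 mph, closer to 4 than to 6
```

The slow leg takes 1,000 of the 1,006 total hours, so the average lands barely above 3mph doubled, nowhere near 251.5. The same logic pulls the walking answer toward 4mph.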