The design of public transportation systems in large metropolitan areas illustrates how complex networks balance accessibility, cost, and speed. In densely populated cities such as New York and Tokyo, subway lines average 8–10 stations per mile, whereas newer, automobile-oriented cities like Phoenix may feature fewer than four per mile. Yet the growth in station count is sub-linear: when a city’s population doubles, the number of stops typically rises by only about 60 percent. Transit planners therefore concentrate new stations in dense commercial corridors, where ridership and fare revenue justify the expense, while leaving outlying zones with wider stop spacing. The result is a network that expands with population but avoids the financial and scheduling penalties of building a perfectly uniform grid.
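One way to make the sub-linear claim precise, as an illustrative reading of the figures above (the exponent β is introduced here for convenience and does not appear in the passage), is to suppose station count N scales with population P as a power law:

\[
N \propto P^{\beta}, \qquad 2^{\beta} \approx 1.6 \;\Longrightarrow\; \beta = \log_2 1.6 \approx 0.68,
\]

comfortably below the β = 1 that strictly proportional growth would require.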
A comparable trade-off shapes biological neural networks. Across mammals, total neuron count increases with brain volume, but the average number of synaptic connections per neuron declines. A mouse neuron might link to thousands of partners, whereas an elephant neuron connects to only a few hundred. This phenomenon, termed neural scaling, limits both the metabolic cost of maintaining long-range axons and the signal-propagation delays that would arise if every additional neuron established the same density of links. Computational studies show that such tapered connectivity lets big brains support greater absolute processing power without incurring a super-linear rise in wiring length and energy demand.
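A rough wiring-cost sketch, under simplifying assumptions not stated in the passage (N neurons, a fixed per-neuron connection count k, and connection lengths that grow with brain radius, roughly V^{1/3} for brain volume V), shows why uniform connectivity becomes expensive:

\[
L_{\text{total}} \;\propto\; k \, N \, V^{1/3},
\]

so if k were held constant as N and V grew, total wiring length (and the energy needed to maintain it) would rise faster than neuron count alone; letting k decline with brain size tempers that super-linear growth.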
Systems scientist Deborah Gordon argues that the two cases exemplify a universal principle of network optimization. Whether engineers map subway routes or neurons self-organize during development, each system must weigh the benefits of dense connectivity—short travel times, rapid information flow—against constraints of energy, space, and time. These trade-offs yield scaling laws that echo across domains: the most efficient large networks are neither maximally connected nor minimally built but instead occupy a mathematically predictable middle ground of “just-enough” links.
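A toy optimization, purely illustrative and not drawn from Gordon’s work, captures the “just-enough” idea. Suppose per-node degree k incurs a wiring-and-energy cost that rises with k, while travel or delay cost falls with k, with a and b as assumed cost coefficients:

\[
C(k) = a\,k + \frac{b}{k}, \qquad a, b > 0.
\]

Setting C'(k) = a - b/k^2 = 0 gives k^* = \sqrt{b/a}: an interior optimum, neither maximal nor minimal connectivity.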
Which of the following statements is most strongly supported by the information in the passage?
A. The metabolic cost per neuron rises in larger mammalian brains because longer axons demand more energy.
B. As brain volume increases across mammals, the absolute number of neurons grows while the average number of synaptic connections per neuron declines.
C. Neural scaling in large brains eliminates the need for any long-range connections between neurons.
D. The reduction in connections per neuron necessarily slows information processing in larger brains.
E. Elephant neurons possess more synaptic connections than mouse neurons but require less energy per connection.