Data flood/tsunami/avalanche: Whatever the name, the problem is real


Posted on December 30, 2011

We have been talking since 2010 about the qualitative increase in data volumes that will result from the conversion of mobile networks into carriers of data. Is it a flood, a tsunami or an avalanche? The name does not seem to matter (though tsunami is the term that seems to be catching on). Unless the problem is understood (operators understand it; some regulators and policy makers do too, as evidenced below) and addressed (both in terms of access networks, as below, and in terms of backhaul, as we have been advocating), the quality of the broadband experience will degrade radically.

The announcement comes as wireless companies are facing a spectrum crunch crisis that has already begun to reshape the industry.

As smartphone and tablet sales have soared over the past several years, consumers’ demand for data has grown exponentially. All that data is taking up a growing amount of spectrum, or radio waves, and carriers are simply running out of airwaves to cram data into. The FCC has said that a current spectrum surplus of 225 MHz will become a deficit of 275 MHz by 2014.

That’s why the FCC is committing to freeing up 500 MHz of spectrum over the next decade. But there’s a catch: that process includes voluntary auctions by a patchwork of television stations across the country that currently hold but aren’t using their spectrum. Many aren’t willing to give it up.
