Can macro level reports on broadband performance have policy impact?


Posted on April 5, 2017

Akamai has been publishing its State of the Internet report since 2008. Handling 15-30% of the world’s Internet traffic positions the company well to run diagnostics and offer a view of what goes on on the web. A recent review of all the reports produced to date charts the adoption of broadband at varying speeds, i.e., unique IPv4 addresses that connect to Akamai at 256 Kbps or less, 4 Mbps, 10 Mbps and 15 Mbps. There is tremendous growth in Asia, but that is thanks to countries like Korea, Japan and Taiwan, which have some of the fastest speeds in the world.

Broadband quality, by the nature of the medium used for transmission and the physics around it, is volatile and highly location specific (more here). So much so that quality can differ significantly from the ground floor to a higher elevation of the same building, thanks to the laws of signal propagation. Yet we pay for a service, and it is up to the service providers to optimise their networks accordingly. To get to the crux of the lack of good-quality broadband, we need to be able to measure it in the local context, using a replicable method. This is where a regulator would ideally step in. In Sri Lanka we have this to some extent with the TRC’s broadband measurement tool. In other countries in the region, the measurements have to be submitted quarterly by the operators, and in the Philippines, where performance was poor, interventions were made through the Senate’s office (Akamai, being a credible source, was quoted in the case of the Philippines). So while Akamai’s reports are useful at a macro level, it is really ground data (and transparent methodologies) that is needed to nudge the change that will impact consumers’ adoption.
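To illustrate what a replicable, ground-level measurement might look like, here is a minimal Python sketch that times repeated downloads of a test file and reports the spread of throughput results. The URL is a hypothetical placeholder, not a reference to any regulator’s tool; a real methodology would specify the test server location, file size, repetition schedule and reporting format so that results can be reproduced and compared.

```python
import time
import urllib.request

# Hypothetical test file URL (assumption): in practice this should be a file
# of known size hosted where it reflects the local network path being measured.
TEST_URL = "https://example.com/10MB.bin"


def measure_download_mbps(url: str) -> float:
    """Time a single HTTP download and return throughput in Mbps."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        data = response.read()
    elapsed = time.monotonic() - start
    return (len(data) * 8) / elapsed / 1_000_000


if __name__ == "__main__":
    # Repeat the measurement to capture the volatility described above,
    # rather than relying on a single reading.
    samples = [measure_download_mbps(TEST_URL) for _ in range(5)]
    print(f"min {min(samples):.2f} Mbps, "
          f"max {max(samples):.2f} Mbps, "
          f"mean {sum(samples) / len(samples):.2f} Mbps")
```

Even a simple script like this, run at defined times and locations with its method published, generates the kind of transparent ground data the post argues for.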
