Request For Comment: Broadband QoSE testing Phase 2


Posted on May 23, 2008  /  3 Comments

LIRNEasia is ready to start Phase 2 of its Broadband QoSE testing methodology development.

In Phase 1 we developed the AshokaTissa methodology and tested the broadband performance of key players in India, Sri Lanka and (partially) Singapore. We used tools such as Bandwidth Monitor, Ping and Tracert. The tests were done manually. Each link was tested for two weekdays and two weekend days. Since each data point had to be the average of three readings, one set of readings took 2-3 hours, so only one link could be tested per day.

Our tests measured six metrics: throughput (download and upload), RTT, jitter, packet loss and availability. We measured these while downloading from/uploading to (a) the ISP's server, (b) a local server other than the ISP's and (c) an international server. The parameters were recorded six times per day, from 08:00 hrs to 20:30 hrs. (More details of the methodology and test results)
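For illustration, here is a minimal Python sketch of how RTT, jitter and packet loss could be derived from repeated pings. The host, sample count and the jitter definition used (mean absolute difference between successive RTTs) are assumptions of the sketch, not part of the AshokaTissa specification.

```python
# Minimal sketch: deriving RTT, jitter and packet loss from repeated pings.
# Assumes a Unix-style `ping` binary (Windows uses `-n` instead of `-c`);
# the jitter definition here is an assumption made for illustration.
import re
import statistics
import subprocess

def ping_metrics(host: str, count: int = 10):
    """Ping `host` `count` times; return (avg_rtt_ms, jitter_ms, loss_pct)."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True, check=False,
    ).stdout

    # Individual round-trip times appear as e.g. "time=23.4 ms".
    rtts = [float(t) for t in re.findall(r"time[=<]([\d.]+)", out)]
    if not rtts:
        return None, None, 100.0

    avg_rtt = statistics.mean(rtts)
    jitter = (statistics.mean(abs(a - b) for a, b in zip(rtts, rtts[1:]))
              if len(rtts) > 1 else 0.0)
    loss_pct = 100.0 * (count - len(rtts)) / count
    return avg_rtt, jitter, loss_pct

if __name__ == "__main__":
    print(ping_metrics("example.com"))
```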

In Phase 2 we plan to avoid this tedious manual process by automating it: we want to develop a software tool that runs the same tests. This is how Rohan Samarajiva describes it:

In the first instance, we can install the software in the computers of a larger number of selected persons in multiple locations and collate the data.

What if, adapting the concept of public-resource computing, where complex computing tasks are broken up into small chunks that are then run in the background of large numbers of computers of volunteers that are simultaneously engaged in other tasks, this software is installed in thousands of computers that are connected to the Internet and run in the background while the host computers are doing other things?  And what if the results of these thousands, and even millions, of measurements are aggregated in real-time on a server, averaging out the various biases caused by computer idiosyncrasies and location-specific features?  This would take the quality of the results to a whole different level, averaging out anomalies and allowing continuous coverage.

And what if this real-time aggregate measurement of quality of service across a range of dimensions is available for all consumers to see on the web?  (Full document)
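As a rough illustration of the report-and-aggregate idea, a background client might post each measurement to a central server along the following lines. The endpoint URL and payload fields below are purely hypothetical, not part of LIRNEasia's actual design.

```python
# Rough sketch of the report-and-aggregate idea: a background client posts
# each measurement to a central server for real-time averaging. The endpoint
# URL and payload fields are illustrative assumptions only.
import json
import time
import urllib.request

AGGREGATOR_URL = "https://example.org/qose/report"  # hypothetical endpoint

def report_measurement(isp: str, download_kbps: float, rtt_ms: float) -> None:
    payload = {
        "timestamp": time.time(),
        "isp": isp,
        "download_kbps": download_kbps,
        "rtt_ms": rtt_ms,
    }
    req = urllib.request.Request(
        AGGREGATOR_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)  # fire-and-forget upload
```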

However, the ideal software tool we have in mind might not be the most practical. That is why we value your opinion on the following issues.

Active monitor vs. passive monitor: We can have only one. An active monitor actually downloads from predetermined sites and measures speeds, while a passive monitor measures the speeds of ongoing user downloads. Both have pros and cons. An active monitor eats precious MBs – an issue if the package has a download cap – but a passive monitor does not give us the breakdown, so we might not know where the bottleneck is. Our preference is the active monitor.
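For illustration, an active monitor could be as simple as timing the download of a file of known size from a predetermined test URL; the URL below is a hypothetical placeholder.

```python
# Minimal sketch of an "active monitor": time the download of a file of known
# size from a predetermined test URL. The URL is a hypothetical placeholder.
import time
import urllib.request

TEST_URL = "http://speedtest.example.net/1MB.bin"  # assumed test file

def measure_download_kbps(url: str = TEST_URL) -> float:
    start = time.monotonic()
    data = urllib.request.urlopen(url, timeout=30).read()
    elapsed = time.monotonic() - start
    return (len(data) * 8 / 1000.0) / elapsed  # kilobits per second
```

Note that each run consumes the full test-file size from the user's download quota, which is exactly the "precious MBs" trade-off mentioned above.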

Windows XP vs. Windows Vista versions: We can test the application on only one OS. We expect that if we develop for Windows XP it will, with luck, also work on Vista.

These are other key features (or constraints, depending on how one looks at them):

  • It measures all the above metrics except upload speed, which would be difficult to automate.
  • The user has to schedule the test times and make sure he/she does not run any other programs while the tests are being conducted (a minimal scheduling sketch follows this list).
  • The user also has to upload the test results to our specified website. This is easy, and we expect a single user to do no more than two days of testing, unless he/she volunteers to do more.
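As a sketch of the scheduling requirement, the client could simply sleep until each test slot and then run the measurement routine. The intermediate test times and the run_all_tests() stub below are assumptions made for illustration only.

```python
# Sketch of scheduling the six daily test runs between 08:00 and 20:30.
# The intermediate test times and the run_all_tests() stub are assumptions.
import datetime
import time

TEST_TIMES = ["08:00", "10:30", "13:00", "15:30", "18:00", "20:30"]

def run_all_tests() -> None:
    # Placeholder for the actual measurement routine (throughput, RTT, etc.).
    print("running measurements at", datetime.datetime.now())

def wait_until(hhmm: str) -> None:
    hour, minute = map(int, hhmm.split(":"))
    now = datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target > now:
        time.sleep((target - now).total_seconds())

if __name__ == "__main__":
    for slot in TEST_TIMES:
        wait_until(slot)
        run_all_tests()
```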

The important question: Will it be realistic for us to find a reasonably large group of volunteers who could do this?

We invite all techies and broadband users out there to discuss/criticise/agree/disagree. We value your input. Thanks in advance.

3 Comments


  1. You can design for Windows XP, Vista and Linux in one go by building it in a cross-platform runtime environment such as Adobe AIR and bundling the RE with the program. I personally prefer Adobe AIR because the GUI looks pretty.

    The other main thing is that it's possible to automate uploads and downloads from a mail server. And it's also possible to automate the uploading of the statistics to the website without requiring the user's attention.

    It would be ideal if the program could be designed as a “single-click” mechanism, where everything is automated and the user just has to watch the tests being done. Minimizing the hassle needed to run it from the user-end will obviously depend on the success of the crowdsourcing exercise.

    P.S. The new web design sucks. I'm running at a 1680*1050 resolution and everything looks stretched out and bulky.

  2. sorry it should be:

    the success of the crowdsourcing exercise will depend on minimizing the hassle needed to run it from the user-end

  3. FYI – http://www.bmighty.com/blog/antenna/archives/2008/06/google_developi.html?cid=antenna

    “Google plans to provide tools to let individual users of broadband services analyze how their providers are managing traffic, so they can object to traffic discrimination if they find it.

    Speaking at a panel discussion of net neutrality issues, Google senior policy director Richard Whitt said that neutral networks were vital to innovation and that individual network users should make their views known on the subject. “If the broadband providers aren’t going to tell you exactly what’s happening on their networks,” Whitt said, “we want to give users the power to find out for themselves….We’re trying to develop software tools…that allow people to detect what’s happening with their broadband connections, so they can let [ISPs] know that they’re not happy with what they’re getting — that they think certain services are being tampered with.”

    Whitt wouldn’t say when these tools would be available – or how they will operate.”