Proving Gig Speed

Mark Tinka mark.tinka at
Wed Jul 18 13:01:06 UTC 2018

On 18/Jul/18 14:40, K. Scott Helms wrote:

> Agreed, and it's one of the fundamental problems: a speed test can
> only measure the speed from point A to point B (often both inside the
> service provider's network), while the customer is concerned with
> traffic to and from point C, off in someone else's network altogether.

In our market, most customers that put all their faith and slippers in
Ookla have no qualms with choosing a random speed test server on the
Ookla network, with no regard for whether that server is on-net or
off-net for their ISP, how it is maintained, how much bandwidth
capacity it has, how it was deployed, its hardware sources, how busy it
is, how much of its bandwidth it can actually fill, how traffic routes
to/from it, etc.

Whatever the result, the speed test server or the Ookla network is NEVER
at fault. So an ISP in the African market ends up having to explain why
a speed test server on some unknown network in Feira de Santana is
claiming that the customer is not getting what they paid for.

Then again, we all need reasons to wake up in the morning :-)...

>   It's one of the reasons that I think we have to get more comfortable
> and more collaborative with the CDN providers as well as the large
> sources of traffic.  Netflix, YouTube, and I'm sure others have their
> own consumer facing performance testing that is _much_ more applicable
> to most consumers as compared to the "normal" technician test and
> measurement approach or even the service assurance that you get from
> normal performance monitoring.  What I'd really like to see is a way
> to measure network performance from the CO/head end/PoP and also get
> consumer level reporting from these kinds of services.  If
> Google/Netflix/Amazon Video/$others would get on board with this idea
> it would make all our lives simpler.
> Providing individual users' stats is nice, but if these guys really
> want to improve service it would be great to get aggregate reporting
> by ASN.  You can get a rough idea by looking at your overall graph
> from Google, but it's lacking a lot of detail and there's no simple
> way to compare that to a head end/CO test versus specific end users.

Personally, I don't think the content networks and CDNs should focus on
developing yet another speed test server, because that just pushes the
problem back to the ISP. I believe their time would be better spent:

  * Delivering as close to 100% of their services as they possibly can
    to every region, city and data centre.

  * Providing tools for network operators as well as their consumers
    that are biased toward the expected quality of experience, rather
    than raw bandwidth. A 5 Gbps link full of packet loss does not a
    service make; the question is what that loss translates into for
    the type of service the content network or CDN is delivering.
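
The second point is easy to quantify: for loss-sensitive transports
like TCP, achievable throughput is bounded by loss rate and RTT, not by
link speed. A rough sketch using the well-known Mathis et al.
approximation for TCP Reno (the 5 Gbps link, 1% loss and 80 ms RTT
below are illustrative assumptions, not measurements):

```python
import math

def mathis_throughput_bps(mss_bytes, rtt_s, loss_rate):
    """Upper bound on steady-state TCP Reno throughput (Mathis et al.):
    rate <= (MSS / RTT) * (C / sqrt(p)), with C = sqrt(3/2)."""
    C = math.sqrt(3.0 / 2.0)
    return (mss_bytes * 8 / rtt_s) * (C / math.sqrt(loss_rate))

# Hypothetical 5 Gbps link with 1% loss and 80 ms RTT:
rate = mathis_throughput_bps(mss_bytes=1460, rtt_s=0.080, loss_rate=0.01)
print(rate / 1e6)  # roughly 1.8 Mbit/s per flow, despite the 5 Gbps link
```

In other words, loss collapses per-flow throughput to a tiny fraction
of the link rate, which is exactly the sort of QoE signal a raw
bandwidth number never shows.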

