Proving Gig Speed

Mark Tinka mark.tinka at seacom.mu
Thu Jul 19 15:19:55 UTC 2018


On 19/Jul/18 14:57, joel jaeggli wrote:

> There is a point beyond which the network ceases to be a serious
> imposition on what you are trying to do.
>
> When it gets there, it fades into the background as a utility function.

I've seen this to be the case when customers are used to buying large
capacity, i.e., 10Gbps, 20Gbps, 50Gbps, 120Gbps, etc. Admittedly, these
tend to be service providers or very large enterprises, and there is no
practical way they are going to ask you to speed-test their 50Gbps
delivery - partly because it's physically onerous, and partly because
they have some clue that speed tests are not any form of scientific
measure.

The problem is with the customers that buy orders of magnitude less -
say, 1Gbps or lower. They will ask for speed tests as a matter of
course. We notice that as the purchased capacity goes up, customers tend
to be less interested in speed tests. If anything, concern shifts to
more important metrics such as packet loss and latency.


> The fact that multiple streaming audio / video applications in a
> household don't have to routinely cheese people off points to the
> threshold having been reached for those applications, at least in
> fixed networks.

One angle of attack is to educate less savvy customers: bandwidth is
more about supporting multiple users on the network at the same time,
all with smiles on their faces, than about making things go faster. I
recently had to tell a customer that more bandwidth increases
application speed only up to a certain point. After that, it's about
being able to add users without disadvantaging each individual user.
Y'know, a case of 2 highway lanes running at 80km/hr vs. 25 highway
lanes running at 80km/hr.
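The lane analogy can be put into rough numbers. A minimal sketch, with
made-up illustrative figures (a 1Gbps link, flows capped around 80 Mbps
by the far end - neither number is from this thread): once a single
flow's cap is met, a fatter pipe doesn't make any one user faster, it
just carries more of them before fair-sharing starts to bite.

```python
def per_user_throughput_mbps(link_mbps: float, users: int,
                             per_flow_cap_mbps: float) -> float:
    """Throughput each user sees on a fairly shared link, assuming
    each flow is capped (by the server, CDN, or application) at
    per_flow_cap_mbps."""
    fair_share = link_mbps / users
    return min(fair_share, per_flow_cap_mbps)

# On a 1000 Mbps link with flows capped at 80 Mbps, 2 users and 12
# users both see the full 80 Mbps ("same speed limit"); only at 25
# users does the fair share (40 Mbps) drop below the cap.
for users in (2, 12, 25):
    print(users, per_user_throughput_mbps(1000, users, 80))
```

In other words, past the per-flow cap, capacity buys lanes, not a
higher speed limit.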


>  For others it will still be a while. When that 5GB
> software update or a newly purchased 25GB game takes 20 minutes to
> deliver, that's a delay between intent and action that the user or
> service operator might seek to minimize.

That's where the CDNs and content operators need to chip in and do
their part. The physics is the physics, and while I can (could) install
DownThemAll on my Firefox install to accelerate downloads, I don't have
those options when waiting for my PS4 to download that 25GB purchase, or
for my iPhone to download that 5GB update.
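For a sense of scale, the implied throughput in the 20-minute figures
quoted above is easy to back out (a quick sketch; decimal gigabytes
assumed, i.e. 1 GB = 8000 megabits):

```python
def required_mbps(gigabytes: float, minutes: float) -> float:
    """Sustained throughput needed to move `gigabytes` of data in
    `minutes`, using decimal GB (1 GB = 8000 megabits)."""
    return gigabytes * 8000 / (minutes * 60)

# The 25GB game in 20 minutes needs a sustained ~167 Mbps;
# the 5GB update in the same window needs ~33 Mbps.
print(round(required_mbps(25, 20)))
print(round(required_mbps(5, 20)))
```

So even "gig speed" only cuts the 25GB wait to a few minutes; the rest
is on the CDN and the far end to actually fill the pipe.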


>  Likewise, latency or jitter
> associated with network resource contention impacts real-time
> applications. When the network is sufficiently scaled / well behaved
> that these applications can coexist without imposition, that stops
> being a point of contention.

All agreed there.

In our market, it's not a lack of backbone resources. It's that the
majority of online resources customers are trying to reach are
physically too far away.
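The distance point can be made concrete with the bandwidth-delay
relationship: with a fixed receive window, a single TCP flow tops out
at window / RTT no matter how fat the backbone is. A rough sketch with
illustrative figures (a 10,000 km one-way path and a 64 KB window -
neither is a measurement from our network):

```python
SPEED_IN_FIBER_KM_S = 200_000  # light in fiber travels at roughly 2/3 of c

def rtt_ms(km_one_way: float) -> float:
    """Minimum round-trip time over fiber, propagation delay only."""
    return 2 * km_one_way / SPEED_IN_FIBER_KM_S * 1000

def max_tcp_mbps(window_bytes: int, rtt_milliseconds: float) -> float:
    """Throughput ceiling of a single fixed-window TCP flow: window / RTT."""
    return window_bytes * 8 / (rtt_milliseconds / 1000) / 1e6

rtt = rtt_ms(10_000)          # a 10,000 km path: ~100 ms before any queuing
print(round(rtt))
print(round(max_tcp_mbps(64 * 1024, rtt)))  # ~5 Mbps ceiling on a 64 KB window
```

Which is why a customer 100ms away from the content can see a slow
speed test on a completely uncongested gigabit service.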

Mark.



More information about the NANOG mailing list