On Thu, 19 April 2001, John Hawkinson wrote:
> The 5-minute average is not being sampled every five minutes.
>
> The raw number of octets is being sampled every five minutes, and
> divided by the time since the previous sample (5 minutes). Then,
> 95th percentile is taken of that.
How do you game the system? Economists like to write a lot of
papers on this subject.
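The mechanics John describes can be sketched in a few lines of Python. The counter data below is hypothetical, and the "drop the top 5% of windows, bill the highest remaining rate" convention is only one common interpretation of the 95th percentile; providers differ on the exact method.

```python
# A minimal sketch of the billing calculation (hypothetical counter data):
# sample the raw octet counter every 300 seconds, turn each delta into a
# per-window average rate, then take the 95th percentile of those averages.

def rates_from_counter(samples, interval=300):
    """Convert cumulative octet-counter readings into per-window Bps averages."""
    return [(after - before) / interval
            for before, after in zip(samples, samples[1:])]

def percentile_95(rates):
    """One common convention: discard the top 5% of windows, bill the max left."""
    ordered = sorted(rates)
    index = max(int(len(ordered) * 0.95) - 1, 0)
    return ordered[index]

# Five windows of hypothetical counter readings; one window carries a burst.
counters = [0, 300_000, 900_000, 1_200_000, 9_200_000, 9_500_000]
rates = rates_from_counter(counters)
print(percentile_95(rates))  # the burst window is discarded; bills at 2000.0 Bps
```

Note that with so few samples the single burst window falls entirely inside the discarded 5%, which is exactly why bursty customers like this billing model.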
The problem with calculating the average is the fencepost error.
Warning, major simplifying assumptions ahead...
If you transfer 1,000,000 bytes over exactly 5 minutes, what is your 95% bill?
It depends on your timing.
00:00 Start transfer, Start measurement window
01:00
02:00
03:00
04:00
05:00 End transfer, End measurement window
Elapsed time 5 minutes, total bytes 1,000,000, or 3,333 Bps.
Bandwidth billed: 3,333 Bps (roughly 27 Kbps) peak
00:00 Start measurement window #1
01:00
02:00
02:30 Start transfer #1
03:00
04:00
05:00 End measurement window #1 (500,000 Bytes, 300 Seconds)
Start measurement window #2
06:00
07:00
07:30 End transfer #1 (1,000,000 Bytes, 300 Seconds)
08:00
09:00
10:00 End measurement window #2 (500,000 Bytes, 300 Seconds)
Bandwidth billed: 1,667 Bps (roughly 13 Kbps) peak, since each window
only saw half the bytes.
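Both timelines can be reproduced with a short simulation. The constant-rate transfer and the fixed measurement horizon are simplifying assumptions, and `peak_window_rate` is a hypothetical helper, not any provider's actual code.

```python
# Sketch of the alignment effect: the same 1,000,000-byte, 300-second
# transfer measured against two different 5-minute window boundaries.

WINDOW = 300                    # provider samples every 300 seconds
TOTAL_BYTES = 1_000_000
RATE = TOTAL_BYTES / 300        # constant ~3,333 Bps while transferring

def peak_window_rate(start, duration=300, window=WINDOW, horizon=1200):
    """Highest per-window average when a constant-rate transfer begins at `start`."""
    peak = 0.0
    for w in range(0, horizon, window):
        # Overlap of the transfer [start, start+duration) with window [w, w+window).
        overlap = max(0, min(start + duration, w + window) - max(start, w))
        peak = max(peak, overlap * RATE / window)
    return peak

aligned = peak_window_rate(0)    # transfer starts on a window boundary
split = peak_window_rate(150)    # transfer starts mid-window, as at 02:30 above
print(aligned, split)            # the split transfer peaks at half the rate
```

The split case peaks at exactly half the aligned case, matching the two timelines above: each window captures only 500,000 of the 1,000,000 bytes.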
By knowing your provider's measurement windows, you could cut your
usage bill in half while transferring the same amount of data.
A customer with relatively random usage (users surfing the
web) couldn't do this, but a user transferring batch files on a
set schedule may see dramatic differences in their bill.
You can make the measurements more complex by using three windows,
or choosing three out of five windows, and so forth. Unless providers
or users are seeing real billing problems, there is little benefit
to the added complexity.