Interface error rates question
dholmes at aprisma.com
Thu Jan 3 13:55:13 UTC 2002
I'm looking for some real user input from network operators:
To what degree would you want to be able to measure error rates on device interfaces?
Many tools currently report error rates only at 1% granularity and above (so a 0.9% rate is displayed as 0%), but it is quite possible users want to measure fractional percentages as well.
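For illustration, here is a minimal sketch of the kind of calculation in question: deriving a fractional error-rate percentage from deltas of interface counters (e.g. the SNMP ifInErrors and ifInUcastPkts objects) between two polls. The counter names and values here are hypothetical examples, not taken from any particular tool.

```python
def error_rate_pct(errors_delta, packets_delta):
    """Return the interface error rate as a fractional percentage.

    errors_delta  -- change in the error counter (e.g. ifInErrors) between polls
    packets_delta -- change in the packet counter (e.g. ifInUcastPkts) between polls
    """
    if packets_delta == 0:
        return 0.0  # no traffic in the interval; avoid division by zero
    return 100.0 * errors_delta / packets_delta

# A 0.9% error rate, which a whole-percent display would round down to 0%.
rate = error_rate_pct(errors_delta=9, packets_delta=1000)
print(f"{rate:.2f}%")  # prints "0.90%"
```

Keeping the value as a float and only rounding at display time is what preserves the sub-1% detail that whole-percent tools throw away.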
Does anyone have any opinions/preferences based on your current experience?
Does it matter the type of interface you are managing (Ethernet, serial, etc.)?
Thanks - Dan Holmes