dark fiber and SFP distance limitations

Kevin Hodle kevin.hodle at gmail.com
Sat Jan 2 17:36:12 CST 2010

On Fri, Jan 1, 2010 at 4:52 PM, Mike <mike-nanog at tiedyenetworks.com> wrote:
> I am looking at the possibility of leasing a ~70 mile run of fiber. I don't
> have access to any mid point section for regeneration purposes, and so I am
> wondering what the chances that a 120km rated SFP would be able to light the
> path and provide stable connectivity. There are a lot of unknowns including
> # of splices, condition of the cable, or the actual dispersion index or
> other properties (until we actually get closer to leasing it). It's spare
> telco fibers in the same cable binder they are using interoffice transport,
> but there are regen huts along the way so it works for them but may not for
> us, and 'finding out' is potentially expensive. How would someone
> experienced go about determining the feasibility of this concept and what
> options might there be? Replies online or off would be appreciated.

I second the recommendation that you request OTDR traces from whomever
you are leasing the glass from, and further request traces for each
strand in *both* directions (A end to Z end, Z end to A end) at
multiple wavelengths, say 1530nm-1640nm at a maximum of 200GHz
channel spacing, to properly identify potential problem locations
for the future, when you want to build out a 10GE metro DWDM solution
(you really do want to know about that old mechanical splice 20km into
your run, etc.). An OTDR will provide you with granular loss/gain event
details for your entire span, while a power meter/light source will
only tell you your overall span loss.

While your fiber provider may not pony up OTDR results until after
you've executed the contract, they should be able to give you a rough
estimate of the total loss (in dB for a 1550nm signal) for the span you
are looking at leasing, and you can build provisions into your contract
that enforce an absolute maximum loss on the span. In that case the
provider will be forced to take the necessary corrective actions:
replacing old, poorly executed mechanical splices with fusion splices,
isolating and correcting bends, etc.

As most have pointed out, an EDFA should not be required for a
single-channel 1GE solution, and probably would not be required for a
simple 1GE CWDM setup either. Once you graduate to an active 10GE DWDM
solution, EDFAs become more of a necessity (possibly along with
dispersion compensation; depending on your vendor this may be an
entirely separate shelf module or may be built into the amp card).

The addition of EDFAs in a multi-channel solution also adds complexity
if you want to build a scalable, cost-effective solution. Most EDFAs
have a maximum and minimum per-channel input power, and ideally you
want each channel near the same power level before it hits the EDFA.
Depending on your gear, topological complexity, etc., this may require
an optical spectrum analyzer to verify individual channel power levels
so the correct amount of attenuation can be added to each channel
before it reaches the EDFA; for a single point-to-point span, however,
this will probably not be a concern.
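As a rough sanity check on whether a 120km-rated optic can light a
~70-mile (~113 km) span, here is a back-of-the-envelope link-budget
sketch in Python. Every figure in it (attenuation per km, splice
interval and loss, connector loss, TX power, RX sensitivity, design
margin) is an illustrative assumption, not a datasheet value --
substitute the real numbers from the OTDR traces and your optic's
datasheet before drawing any conclusion.

```python
# Rough optical link-budget sanity check for a leased dark-fiber span.
# All constants below are illustrative assumptions; replace them with
# figures from your OTDR traces and the optic's datasheet.

SPAN_KM = 70 * 1.609           # ~70 miles expressed in km
FIBER_LOSS_DB_PER_KM = 0.22    # typical SMF attenuation near 1550 nm
SPLICE_LOSS_DB = 0.1           # assumed loss per fusion splice
SPLICE_INTERVAL_KM = 4.0       # assumed one splice per reel length
CONNECTOR_LOSS_DB = 0.5        # assumed loss per patch-panel connection
CONNECTOR_PAIRS = 2            # one at each end of the span

TX_POWER_DBM = 0.0             # low end for a hypothetical 120 km SFP
RX_SENSITIVITY_DBM = -30.0     # assumed receiver floor for that optic
DESIGN_MARGIN_DB = 3.0         # headroom for aging, repairs, temperature

def span_loss_db(km: float) -> float:
    """Estimated end-to-end loss: fiber + splices + connectors."""
    splices = int(km // SPLICE_INTERVAL_KM)
    return (km * FIBER_LOSS_DB_PER_KM
            + splices * SPLICE_LOSS_DB
            + CONNECTOR_PAIRS * CONNECTOR_LOSS_DB)

budget = TX_POWER_DBM - RX_SENSITIVITY_DBM
loss = span_loss_db(SPAN_KM)
margin = budget - loss - DESIGN_MARGIN_DB
print(f"span: {SPAN_KM:.0f} km, estimated loss: {loss:.1f} dB")
print(f"optic budget: {budget:.1f} dB, remaining margin: {margin:.1f} dB")
```

With these assumed numbers the estimated loss lands within a couple of
dB of the assumed 30 dB budget, i.e. the link is marginal once a design
margin is subtracted -- which is exactly why the OTDR traces and a
contractual maximum-loss clause matter before signing the lease.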

 Kevin Hodle | http://www.linkedin.com/in/kevinhodle

 PGP Key ID  | fingerprint
 0x803F24BE  | 1094 FB06 837F 2FAB C86B E4BE 4680 3679 803F 24BE

"Elegance is not a dispensable luxury but a factor that decides
between success and failure."
-Edsger Dijkstra
