bits/hz/second: we're barely more efficient than the telegraph (Re: TransAtlantic 40 Gig Waves
tkapela at gmail.com
Mon Aug 17 16:58:12 CDT 2009
I'll comment on both:
On Mon, Aug 17, 2009 at 12:14 PM, Rod Beck<Rod.Beck at hiberniaatlantic.com> wrote:
> Rod, do you know if the 40G waves increased the spectrum efficiency of
> your fiber? On land systems they pretty much break even, i.e. you can
[rod beck replies]
> The enabling technology is based on advanced encoding techniques allowing a greater rate of symbol transfer.
Looking back through Google and various IEEE papers, the past 20 years of
interfacing our abstract "bits" to the real world via photons haven't
yielded terribly high spectral efficiencies, though we've certainly
seen great progress in both transmitter (who could have imagined a
VCSEL in 1985?) and receiver technology, and of course a significant
improvement in usable bits/sec.
I can only wonder what the curve of optical spectral efficiency will
look like over the coming decades. Perhaps we'll have to
wait for a "Shannon of Optics" to stand up (or quit their day job at
whatever modern-day version of $bell_labs they're stuck working for)
and point out something obvious we're missing.
A bit of sobering reality I often consider: it took roughly a
century to progress from where Marconi began to the present day, where
we have cheap radios doing 12 bits/s/Hz and costing about $20k (a pair).
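For context, the Shannon limit makes it easy to sanity-check that 12 bits-per-hertz figure. A minimal sketch, assuming an ideal AWGN channel (which real radios only approach):

```python
import math

def snr_required_db(spectral_efficiency: float) -> float:
    """Minimum SNR (in dB) the Shannon limit demands for a given
    spectral efficiency in bit/s/Hz, on an ideal AWGN channel:
    C/B = log2(1 + SNR)  =>  SNR = 2**(C/B) - 1."""
    snr_linear = 2.0 ** spectral_efficiency - 1.0
    return 10.0 * math.log10(snr_linear)

# Hitting 12 bit/s/Hz requires on the order of 36 dB of SNR,
# which is why those radios aren't cheap to engineer.
print(round(snr_required_db(12.0), 1))  # ~36.1 dB
```

In other words, every additional bit/s/Hz costs roughly another 3 dB of SNR at the high end, which puts the slow historical climb in spectral efficiency in perspective.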
Clearly a key difference is that people are paying (a lot,
proportionately) to communicate $stuff and folks value networks more
than they did previously - so we're not in the same position folks
were in pre-1900, struggling to find a market for their crazy wires
across the sea.
For the experts out there: how long are we going to wait for something
more efficient than Morse code over twisted pairs?