bellman at nsc.liu.se
Mon Jan 23 19:29:05 UTC 2023
On 2023-01-23 19:08, I wrote:
> I get that for 1310 nm light, the Doppler shift would be just under
> 0.07 nm, or 12.2 GHz. In the ITU C band, I get the Doppler shift
> to be about 10.5 GHz (at channel 72, 197200 GHz or 1520.25 nm).
These shifts are noticeably less than typical grid widths used for
DWDM (±50 GHz around the channel centre for the standard 100 GHz
spacing), so it seems unlikely to me that the Doppler shift would
be a problem.
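The arithmetic above is easy to check with a few lines (a quick sketch; the 16 km/s relative velocity is the figure from my earlier mail):

```python
# Back-of-the-envelope Doppler shift for an optical carrier at a
# relative velocity of 16 km/s (non-relativistic approximation).
C = 299_792_458.0   # speed of light, m/s
V = 16_000.0        # relative velocity, m/s

def doppler_shift_hz(wavelength_nm):
    """Doppler shift in Hz for a carrier at the given wavelength."""
    carrier_hz = C / (wavelength_nm * 1e-9)
    return carrier_hz * V / C

for wl in (1310.0, 1520.25):
    print(f"{wl} nm: {doppler_shift_hz(wl) / 1e9:.1f} GHz "
          f"({wl * V / C:.3f} nm)")
```

This reproduces the 12.2 GHz (0.070 nm) and 10.5 GHz figures above.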
And as I was bicycling home, I of course thought of another aspect
of the Doppler shift: the timing between the symbols in the signal,
in other words the baud rate. There will be something like a
phase-locked loop (PLL) in the receiver in order to know when one
symbol ends and the next one starts, and that PLL can only tolerate
a certain amount of baud rate offset.
But we can use the same formula. In general, the Doppler shift at
16 km/s is about 53 parts per million, so e.g. a 112 Gbaud signal
would be received about 6 Mbaud faster or slower than it was sent.
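The fractional shift v/c is independent of the carrier, so the same factor applies directly to the symbol rate (a sketch, again assuming 16 km/s):

```python
# The fractional Doppler shift v/c applies to the symbol (baud) rate
# just as it does to the carrier frequency.
C = 299_792_458.0   # speed of light, m/s
V = 16_000.0        # relative velocity, m/s

ppm = V / C * 1e6             # fractional shift, parts per million
baud_offset = 112e9 * V / C   # offset for a 112 Gbaud signal, in baud

print(f"{ppm:.1f} ppm -> {baud_offset / 1e6:.1f} Mbaud at 112 Gbaud")
```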
And here I have to confess that I don't know how generous typical
receiver PLLs in network equipment are.
Another potential aspect might be the decoding of phase-shift keying,
i.e. when phase modulation is used for the signal. My *very* vague
understanding is that the typical way to decode a phase-modulated
signal is to mix the incoming signal with a reference carrier wave,
generated locally by the receiver, and the interference between the
two gives you the actual signal. But to do that, the reference must
have the same frequency as the received wave and, I guess, must
match very closely. Can they adapt to an incoming wave that is 53 ppm
offset from what it should be?
Or have I misunderstood this? Analogue signals are very much *NOT*
my area of expertise.