Hi,
I'm working on providing information and training for users installing fibre optic systems in broadcast facilities, and I've noticed an effect which I cannot explain. The systems use single mode fibres at 1330 nm, at a data rate of 2500 Mbit/s. The receivers used have built-in level metering, and I've observed the following:
- When a loss is introduced in a controlled way, using an attenuator, the receivers work perfectly down to about their specified level (about 30 uW; converted to dBm in the sketch after this list).
- When the loss comes from installation problems (dirt on lenses, incorrectly mated connectors, etc.), problems occur, meaning corrupted data, at much higher levels (100 uW and above).
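
For reference, here are those two power levels expressed in dBm, as a minimal Python sketch (the function name uw_to_dbm is my own, purely for illustration):

    import math

    def uw_to_dbm(power_uw: float) -> float:
        """Convert optical power in microwatts to dBm (0 dBm = 1 mW)."""
        return 10 * math.log10(power_uw / 1000.0)

    print(f"30 uW  = {uw_to_dbm(30):.1f} dBm")   # about -15.2 dBm, the specified limit
    print(f"100 uW = {uw_to_dbm(100):.1f} dBm")  # -10.0 dBm, where the faults appear

So the installation-related faults show up roughly 5 dB above the receivers' specified sensitivity.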
Obviously these losses can be eliminated by cleaning and correctly seating the connectors, but I'm at a loss to explain what is going on. I'm only an engineer; I'm hoping there are some physicists here who really understand fibre optic systems and can tell me what is happening.
W