The specifications for various optical interfaces show a maximum and a minimum transmitter power, differing by about 6-8 dB. Are these supposed to represent unit-to-unit variation in the transmitted power of new kit, or do they allow for degradation as the transmitter ages?
What I'm trying to work out is whether a working link, whose loss exceeds the budget computed from the minimum transmit power, is likely to stop working as its transmitters age.
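For what it's worth, the arithmetic I'm doing looks like this (a sketch only; the dBm/dB figures are illustrative placeholders, not taken from any datasheet, so substitute the real numbers from the transceiver specs):

```python
def worst_case_margin(tx_min_dbm, rx_sens_dbm, link_loss_db, extra_loss_db=0.0):
    """Margin left when the transmitter is at its spec-minimum power.

    budget = minimum launch power minus receiver sensitivity; the link
    survives worst case only if budget covers all losses.
    """
    budget_db = tx_min_dbm - rx_sens_dbm
    return budget_db - link_loss_db - extra_loss_db

# Assumed illustrative numbers: TX min -9.5 dBm, RX sensitivity -19 dBm,
# 4 dB of fibre/connector loss, plus an 8 dB diplexer.
margin = worst_case_margin(-9.5, -19.0, 4.0, 8.0)
print(margin)  # -2.5 dB: negative, so the link fails if the spec minimum
               # is what an aged transmitter actually emits
```

If the spec minimum already accounts for end-of-life output, a negative worst-case margin means the link will eventually fail even though it works today with a fresh, hotter transmitter.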
Background: I'm trying to work out whether devices such as the Allied Telesyn AT-WG103-13 or -15 diplexers, which introduce up to 8 dB of loss, can be used to economise on fibre with technologies such as
1000BASE-LX/LH GBICs, OC-3 intermediate-reach PA-POS-OC3SMI port adapters, and 10GBASE-LR Xenpaks. I know I can get a lot more loss budget by going to -ZX GBICs or -ER Xenpaks, but I'm looking at what I can do with existing kit.
Thanks,
Sam