We have been having an ongoing battle with several local telcos over DSL line quality. For some reason their test sets always show much higher line quality than the routers do.
For example, I have taken the exact same DSL POTS cable and plugged it into a Cisco 827, an 837, and an 877 router, and got essentially the same line-quality stats from each. When the local telco tests the line (usually with a SunSet MTT test set), they consistently see a good-quality line, whereas the routers see a marginal one -- one that keeps dropping. (And this is not just a single line at a single location -- we have the same problem at multiple locations, and at some locations on multiple lines.)
For example, here is what the router reports:

                    ATU-R (DS)       ATU-C (US)
Capacity Used:      98%              53%
Noise Margin:       5.0 dB           12.0 dB
Output Power:       17.0 dBm         8.0 dBm
Attenuation:        64.0 dB          31.5 dB
                    Interleave  Fast Interleave  Fast
Speed (kbps):       1216        0    256         0
and the test set reports:

                    DS               US
Capacity:           47%              40%
SNR:                8.5 dB           15 dB
Attenuation:        40 dB            28 dB
Speed (kbps):       1472             256
(noise profile)
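To put a number on how far apart those readings are: using the commonly quoted figure of roughly 13.8 dB/km of loop loss at 300 kHz for 26 AWG (0.4 mm) copper -- my assumption, not anything either device reports -- the two downstream attenuation readings imply very different loop lengths for the same pair:

# Back-of-envelope: translate each downstream attenuation reading into an
# implied loop length. The 13.8 dB/km figure is the commonly quoted loss
# at 300 kHz for 26 AWG copper -- an assumption, not a measurement.
DB_PER_KM = 13.8

for source, atten_db in (("Cisco router", 64.0), ("SunSet MTT", 40.0)):
    print("%-12s %4.1f dB  ->  ~%.1f km of 26 AWG loop"
          % (source, atten_db, atten_db / DB_PER_KM))

# Cisco router  64.0 dB  ->  ~4.6 km of 26 AWG loop
# SunSet MTT    40.0 dB  ->  ~2.9 km of 26 AWG loop

That is nearly 2 km of apparent loop-length difference measured at the same jack, which presumably also accounts for the gap in margin and rate.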
Why is there such a substantial disagreement between the telco test sets and the Cisco routers, especially when the wiring is identical right up to and including the cable plugged into the device?
This is getting to be a real pain. We have flaky connections and numerous drops, yet the telco says everything is fantastic. It just doesn't make sense.
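In case it helps anyone reproduce this, here is a minimal sketch of how we could log the router-side figures around each drop. It assumes SNMP is enabled on the router, that it exposes the standard ADSL-LINE-MIB (RFC 2662), and that net-snmp's snmpget is installed; the host, community string, and ifIndex below are placeholders:

#!/usr/bin/env python
# Sketch: poll the router's downstream noise margin and attenuation over
# SNMP and timestamp each sample, so drops can be lined up against margin
# dips afterwards.
import subprocess
import time

HOST = "192.168.1.1"   # placeholder: router address
COMMUNITY = "public"   # placeholder: read-only community string
IFINDEX = "15"         # placeholder: ifIndex of the ATM/DSL interface

# RFC 2662 ADSL-LINE-MIB, ATU-R physical-layer entries (tenths of a dB):
OIDS = {
    "ds_snr_margin":  "1.3.6.1.2.1.10.94.1.1.3.1.4." + IFINDEX,
    "ds_attenuation": "1.3.6.1.2.1.10.94.1.1.3.1.5." + IFINDEX,
}

def snmp_get(oid):
    out = subprocess.check_output(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Ovq", HOST, oid],
        text=True)
    return int(out.split()[0]) / 10.0   # MIB reports tenths of a dB

while True:
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    try:
        r = {name: snmp_get(oid) for name, oid in OIDS.items()}
        print("%s margin=%.1f dB atten=%.1f dB"
              % (stamp, r["ds_snr_margin"], r["ds_attenuation"]))
    except subprocess.CalledProcessError:
        print("%s snmpget failed (line down?)" % stamp)
    time.sleep(60)   # one sample per minute

One sample a minute is enough to see whether the downstream margin sags just before each drop.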
TIA for any insights into this problem.