First, that is *not* the Signal-to-Noise Ratio, but a raw number for "Noise". Second, the dropped connection doesn't prove the numbers are accurate, but it does suggest they are at least indicative.
In fact, I doubt the radio actually measures noise at all. I can't remember the details, but I've seen something suggesting that it derives the supposed "noise" figure from the signal strength by comparing that with the error rate. In theory, at a given signal strength the bit error rate is a function of the SNR, so if you measure the signal strength and the error rate, a "noise" figure can be derived. Of course they aren't using the bit error rate either, but the errored packet rate, so it isn't anything like accurate.
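To make that guess concrete, here's a back-of-the-envelope sketch of how a "noise" figure could be backed out of signal strength plus an observed packet error rate. Everything here is illustrative: it assumes the 1 Mbit/s DBPSK modulation (where, in theory, BER = 0.5 * exp(-Eb/N0)) and that any single bit error kills a packet. It is not what the chipset actually does.

```python
import math

def estimate_noise_dbm(signal_dbm, packet_error_rate, bits_per_packet=8000):
    """Back out an implied noise floor from signal strength and packet errors.

    Illustration only: assumes 1 Mbit/s DBPSK, where in theory
    BER = 0.5 * exp(-Eb/N0), and that one bit error ruins a packet.
    """
    # Per-bit error rate implied by the observed packet error rate.
    ber = 1.0 - (1.0 - packet_error_rate) ** (1.0 / bits_per_packet)
    # Invert the DBPSK BER formula to get the implied Eb/N0 (linear).
    ebn0 = -math.log(2.0 * ber)
    snr_db = 10.0 * math.log10(ebn0)
    # The "noise" is then just signal minus the implied SNR.
    return signal_dbm - snr_db

# e.g. a -60 dBm signal with 1% packet errors on ~1000-byte packets
print(round(estimate_noise_dbm(-60, 0.01), 1))  # about -71.1 dBm
```

Note how indirect this is: the packet error rate is a blunt instrument compared to a real bit error rate, which is exactly why such a derived "noise" number would be inaccurate.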
But it does provide a very good *indication* of the conditions which affect your connection.
Depends... on what your signal strength is being reported as! And probably also on what rate the connection is running at. I'd guess that a 20 dB or greater difference between signal and noise means the connection will be stable. I'd guess that as the difference drops below 20 dB things begin to deteriorate rapidly... I'd bet at 10 dB the connection is down as much as up???
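Those rules of thumb boil down to a simple threshold test on the signal/noise gap. The dB thresholds below are just my guesses from above, not measured figures:

```python
def link_quality(signal_dbm, noise_dbm):
    """Rough link-quality guess from the signal/noise gap in dB.

    Thresholds are the guesses from this post, not measured values.
    """
    margin = signal_dbm - noise_dbm
    if margin >= 20:
        return "stable"
    elif margin > 10:
        return "deteriorating"
    else:
        return "down as much as up"

print(link_quality(-60, -85))  # 25 dB margin -> stable
```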
Jeff Liebermann has a trademarked Guess(tm) for these things, so maybe he'll pitch in with some accurate details and corrections to my simple guesses at what it all means.
Regardless, keep in mind that "noise" is everything other than signal. If you point your high gain antenna at the sun, you'll see more noise... but if you are using channel 9 you will have a lot of "noise" when the neighbors use channels 6 and 11.
Well, -75 dBm seems way too high. A quick check of two laptops running Netstumbler shows about -90 dBm when I turn off all the local wireless junk. I've seen this before with laptops that generate considerable CPU and CCFL power supply noise. However, I've never investigated the exact cause or whether it's really a problem.
The Prism 2 chipset gets its signal strength numbers by running the RSSI value through a conversion table. The noise numbers are apparently (not sure) from the RSSI heard between symbols. The access point will adjust the speed of the connection based upon the bit error rate (BER). 1 bit error in 10^6 is common. The BER is totally dependent on the S/N ratio. Therefore, the access point adjusts the speed for a constant S/N ratio. The required S/N ratio varies radically with the speed. High speeds need a cleaner signal than slower speeds. Therefore, the S/N observed with Netstumbler should vary with connection speed. However, it doesn't.
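If the speed-for-constant-S/N idea is right, rate adaptation amounts to picking the fastest rate whose required S/N the link can meet. A sketch, where the dB requirements per 802.11b rate are rough textbook-style guesses of mine, not actual Prism 2 firmware values:

```python
# (rate in Mbit/s, required S/N in dB) -- illustrative guesses only.
REQUIRED_SNR_DB = [
    (11.0, 16),
    (5.5, 12),
    (2.0, 8),
    (1.0, 5),
]

def pick_rate(snr_db):
    """Return the fastest rate whose (guessed) S/N requirement is met."""
    for rate, needed in REQUIRED_SNR_DB:
        if snr_db >= needed:
            return rate
    return None  # below even the 1 Mbit/s threshold: link unusable

print(pick_rate(13))  # -> 5.5
```

This is also why the S/N shown by a sniffer *should* track the connection speed if the readings came from data frames at the negotiated rate.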
What happens is that with 802.11b and compatible 802.11g, all management frames are sent at the slowest 802.11 speed of 1Mbit/sec. The Prism chipset uses only these management frames to determine signal strength, noise, and S/N ratio. In fact, the S/N ratio is broadcast in some management frames. That's why these chips are so slow in showing changes in signal strength. There aren't that many frames to work with. I don't know how it's done with 802.11g, but I also suspect it's with management frames at the slowest configured speed. I'll have to risk turning my brain to mush by reading
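The sluggish updates follow directly from the small sample count: beacons arrive only about every 102.4 ms by default, so any smoothing over management-frame RSSI takes several beacon intervals to catch up with a real change. A toy illustration (the exponential-average smoothing and the alpha value are my assumptions, not the chipset's actual filter):

```python
def smooth_rssi(samples, alpha=0.1):
    """Exponential moving average over per-beacon RSSI readings (dBm)."""
    avg = samples[0]
    for s in samples[1:]:
        avg = (1 - alpha) * avg + alpha * s
    return avg

# Signal suddenly drops from -60 to -80 dBm; the displayed value lags.
readings = [-60] * 5 + [-80] * 5
print(round(smooth_rssi(readings), 1))  # about -68.2, not -80
```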
Disclaimer: I'm not very confident the above guesswork is correct and will post a correction if I find time to research how it really works.