I've just set up my first Wi-Fi network, or rather Wi-Fi to an 802.11g router for my cable Internet connection. The D-Link DWL-G120 USB device I'm using on this computer comes with a Wireless Utility that shows signal strength and link quality. I don't really understand what "link quality" means, and their very brief explanation isn't much help. Neither have I been able to find anything useful on Google, though I suppose there must be plenty of good information on the Internet somewhere.
Can someone explain link quality to me, or point me in the direction of that information?
Sure. Link quality is the Signal to Noise Ratio (SNR). The card measures the signal strength and the noise level and calculates SNR = 10 * log[(S + N) / N], or some similar formula. Actually it's a bit more complex, in that the signal and noise levels are derived from the RSSI (received signal strength indicator), which is a value from 0 to 255. The manufacturer works out a table of RSSI to signal level in -dBm (decibels relative to 1 milliwatt into 50 ohms). In this case, the SNR is roughly the difference between the signal level and the noise level in -dBm.
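As a quick sanity check, both forms of that calculation can be sketched in Python (the -60 and -90 dBm figures are just made-up example levels, not from your card):

```python
import math

def snr_db(signal_mw: float, noise_mw: float) -> float:
    """SNR in dB using the 10 * log[(S + N) / N] form above (powers in mW)."""
    return 10 * math.log10((signal_mw + noise_mw) / noise_mw)

def snr_from_dbm(signal_dbm: float, noise_dbm: float) -> float:
    """The rough shortcut: SNR is about the difference of the two dBm levels."""
    return signal_dbm - noise_dbm

# A -60 dBm signal (1e-6 mW) over a -90 dBm noise floor (1e-9 mW):
print(snr_from_dbm(-60, -90))           # 30 dB
print(round(snr_db(1e-6, 1e-9), 2))     # 30.0 -- same answer, since S >> N
```

When the signal is well above the noise, the (S + N)/N form and the simple dBm difference agree to within a small fraction of a dB.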
Most of the noise that a receiver sees is from other 2.4GHz transmitters, including other Wi-Fi networks. It is possible to have a rather strong signal but an unusable connection because of a high noise level. That's a poor SNR.
The receiver's detector and data decoder require a minimum SNR in order to maintain a reasonable BER (bit error rate). The access point adjusts the speed to keep the BER reasonable at the available SNR. The faster the connection speed, the higher the required SNR. For example, the minimum SNR needed to maintain a 10^-5 BER is:

    Speed (Mbit/s)   SNR (dB)
    54               24.6
    48               24.1
    36               18.8
    24               17.0
    18               10.8
    12                9.0
    11                6.99
     9                7.8
     6                6.0
     5.5              5.98
     2                1.59
     1               -2.92
If you look at the 1Mbit/sec line, it really does mean that at 1Mbit/sec you can theoretically have twice as much noise power as signal and still decode the data.
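The table above can be turned into a tiny rate-selection sketch in Python (the dict just transcribes the SNR figures from the table; real cards use their own vendor thresholds):

```python
from typing import Optional

# Required SNR (dB) for a 1e-5 BER at each 802.11b/g rate, from the table above.
REQUIRED_SNR = {
    54: 24.6, 48: 24.1, 36: 18.8, 24: 17.0, 18: 10.8, 12: 9.0,
    11: 6.99, 9: 7.8, 6: 6.0, 5.5: 5.98, 2: 1.59, 1: -2.92,
}

def best_rate(snr_db: float) -> Optional[float]:
    """Highest rate (Mbit/s) whose required SNR the link can meet."""
    usable = [rate for rate, need in REQUIRED_SNR.items() if snr_db >= need]
    return max(usable) if usable else None

print(best_rate(20))    # 36 -- enough SNR for 36 Mbit/s, but not for 48
print(best_rate(-1))    # 1  -- only 1 Mbit/s tolerates a negative SNR
```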
Jeff, thanks a million. You've also helped me understand the dBm numbers that show in the Available Network section of this Wireless Utility.
I am picking up (at this moment) four other networks besides my own. Three have an SSID of "2WIRE(something)" so I suppose "2WIRE" is a hardware name, since my default SSID when I set it up was "dlink." The fourth is a person's name and "house," and looking up that name on Google to get the address in my town, then checking the online map, I see that the house in question is a mile or so away from me. That seems a long way, as I understood the usual range for Wi-Fi was a few hundred feet. I suppose that implies he's using some sort of range extender and/or extra antenna?
Also, I notice he's using Channel 1. I'm still using the default Channel 6, as are the other three networks I'm getting a signal from. Are there pros or cons to switching to a different channel?
The 2WIREXXX networks are mostly preconfigured wireless routers distributed by various DSL ISPs as their "home networking" packages. They are delivered preconfigured with an SSID of 2wireXXX, where XXX are the last few digits of the MAC address.
You might want to change that.
Possibly. Netstumbler does not require many packets in order to identify a wireless access point. For example, I can see access points on the hillsides across the valley, perhaps 3-5 miles away. However, when I try to connect to these wireless routers, the connection fails, usually because of interference or low signal strength. The 300ft maximum outdoor range often quoted in the literature is for a "typical" installation: stock antennas, a typical laptop, no interference, line of sight, and no complications. It is often less. A mile is not unusual if you have a decent antenna on your sniffer, or if the access point has a high-gain antenna, a power amplifier, a perfectly clear line of sight, or plenty of altitude.
Yes. The Wi-Fi signal is about 22MHz wide. That's a bit more than 4 channels worth of occupied bandwidth. Therefore, the only non-overlapping channels available are 1, 6, and 11. If you select a channel in between, you get to hear the interference from TWO of these non-overlapping channels.
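The overlap arithmetic is easy to check in Python (channel centers follow the standard 2.4GHz plan; the 22MHz occupied width is the figure above):

```python
def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz channel: channel 1 = 2412 MHz, 5 MHz steps."""
    return 2412 + 5 * (channel - 1)

def overlaps(a: int, b: int, width_mhz: int = 22) -> bool:
    """Two channels overlap when their centers are closer than one signal width."""
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz

print(overlaps(1, 6))   # False -- 25 MHz apart, the classic non-overlapping pair
print(overlaps(3, 6))   # True  -- only 15 MHz apart, so they interfere
```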
I recently did some war walking with my PDA in the downtown Santa Cruz, CA area. Almost everything I heard was on either Ch 6 or Ch 11. Nobody was on Ch 1. I thought my PDA was broken, but subsequent testing with my laptop demonstrated that Ch 1 was unoccupied. I should have known better. When I attempted to use Ch 1, I discovered why everyone else had abandoned it: there was a VERY strong FM signal on Ch 1 in the area. I eventually tracked it down to some fool attaching an illegal power amplifier to their 2.4GHz cordless phone and using it like a cell phone in the downtown area. They're supposedly moving out shortly, so I haven't pursued the problem. The moral is that just because a sniffer says a channel is unoccupied doesn't mean it's useable.
How did you identify the source of the interference? In my neighborhood, there are APs on channels 6 and 11; therefore, I tried channel 1. I couldn't make a stable connection due to some source of interference. Channel 2 was better, but only when I tried 3 did I get stability. Kismet doesn't find anything on channel 1.
I have three spectrum analyzers available. I used the first two:
A modified MMDF downconverter feeding a borrowed Tek something spectrum analyzer. The analyzer only goes up to 1GHz; the MMDF downconverter mixes the 2.4GHz Wi-Fi band down to about 200MHz. I was doing a site survey on a local rooftop when the cordless phone signal magically appeared and almost overloaded the receiver. The hotel where it was located was almost directly across the mall from where I was working. I had some trouble convincing the desk clerk to let me cruise the hallways, but I found the floor and probable room with Wi-Spy. The couple knew exactly what they were doing, but were unaware that it was causing Wi-Fi interference. They agreed to limit its use until they move out at the end of Nov.
A big, heavy, ugly, but cheap HP141 mainframe with various spectrum analyzer plugins. I use this at my house for bench testing and when I can't borrow something lighter.
Kismet will only find 802.11 signals. For Non-802.11 transmitters, you need a spectrum analyzer. See the FAQ:
The FAQ includes a shopping list of interference sources. Note that most cordless phones congregate near the bottom of the 2.4GHz band. The algorithm seems to be to start at the low end and work up until they find a useable channel. I'm not sure about this, but that's how several 2.4GHz phones I tried seemed to work. Also try to note the timing of the interference; I can usually make a good guess from the usage pattern and hours of operation.
I'm not sure what you mean by "sniffer." My antenna at this computer is just a small rectangular frame on the USB device -- I presume it's a loop antenna. I know I don't have a perfectly clear line of sight: I'm in an apartment house with a steel frame, and there are at least a lot of trees between me and that address. It is only slightly hilly around here.
Ah, I see.
The shown signal strength from my own router/AP in the next room is -30 dBm. How does that compare with the usual sort of network inside a house? The utility generally shows my link quality at 100%, and 54 Mbps. I notice that when I move the antenna around, the link quality drops temporarily, then recovers and returns to 100%. Occasionally the Tx rate drops to 24 Mbps briefly -- I presume that makes no difference since I'm only using this to connect to my cable modem, which has less throughput than that anyway, right?
The four 2WIRExxx signals I'm picking up right now are -70, -86, -78 and -90. What I'm getting from the house a mile away is -94. I understand dBm is on a logarithmic scale, so those numbers aren't really as close to my own AP's as they look, correct? What would be a reasonable minimum signal strength for a reliable link? And do those numbers suggest that at *their* end they're getting about the same signal strength from me, or isn't that necessarily so?
I'm not really terribly concerned about being hacked, but I'm not carefree about it either. I do have WEP on, of course.
Anything that can be used to locate access points, clients, interference, noise sources, transmitters, etc. It's a generic term I borrowed from ham radio transmitter hunting.
Nope. Most USB devices have a 1/4-wave meandering-line monopole on a ceramic substrate. The ceramic has a high dielectric constant, which drastically shortens the required length (and size). There are limitations to what can be done with a corner reflector, but they work well enough for general-purpose transmitter hunting. I kinda prefer my salad bowl dish idea, though it's not a perfect parabola and also tends to attract too much attention from the authorities. For real site surveys, I use a 30ft telescoping fiberglass window-washing pole with a 15dBi dish and an MMDS downconverter on top.
The real problem with not having line of sight is reliability. You can usually find a spot that works; the problem is that it will not remain working as objects along the line of sight move around. There are also the problems of Fresnel zone diffraction and reflections off the ground and other objects.
-30dBm is a rather strong signal for wireless. That should work as long as you don't have a bunch of reflections.
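For perspective, dBm is a logarithmic scale, so a quick Python sketch shows how far apart a -30dBm and a -70dBm reading really are in linear power:

```python
def dbm_to_mw(dbm: float) -> float:
    """dBm is 10 * log10(power / 1 mW), so invert it to get milliwatts."""
    return 10 ** (dbm / 10)

mine, neighbor = -30.0, -70.0
ratio = dbm_to_mw(mine) / dbm_to_mw(neighbor)
print(round(ratio))    # 10000 -- every 10 dB is another factor of 10 in power
```

The -30dBm signal carries about ten thousand times the power of the -70dBm one, even though the numbers look close on the meter.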
The drop in link quality was caused by multipath (reflections). There will be places in the room where the incident (direct) signal cancels the reflected signal, thus reducing the SNR. There's also the problem of inter-symbol interference, where the reflected signal arrives somewhat after the incident signal and clobbers the next arriving symbol. 802.11g has a rather large inter-symbol guard delay to take care of this very real problem, but it's not large enough to handle all possible reflections.
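Here's a minimal sketch of that cancellation, assuming a single reflected path on channel 6 and an arbitrary 0.8 reflection gain (both numbers are illustrative, not measured):

```python
import cmath
import math

C = 3e8                  # speed of light, m/s
FREQ = 2.437e9           # Hz, Wi-Fi channel 6
WAVELENGTH = C / FREQ    # about 12.3 cm

def combined_amplitude(path_diff_m: float, reflect_gain: float = 0.8) -> float:
    """Magnitude of direct + reflected signal for a given extra path length."""
    phase = 2 * math.pi * path_diff_m / WAVELENGTH
    return abs(1 + reflect_gain * cmath.exp(1j * phase))

print(round(combined_amplitude(0.0), 2))             # 1.8 -- in phase, they add
print(round(combined_amplitude(WAVELENGTH / 2), 2))  # 0.2 -- half-wave offset nearly cancels
```

Moving the antenna a few centimeters shifts the path difference by a fraction of a wavelength, which is why link quality dips and recovers as you wave it around.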
That's going to be a problem, as there is no single answer. The signal levels in -dBm you mention have different effects at different speeds. The following chart shows the receiver sensitivity at various speeds.
So, if you have a -70dBm signal, you could theoretically go no faster than 36Mbits/sec (for a throughput of about half that, or 18Mbits/sec). Unfortunately, this is the *BEST* case approximation. Add a bit of interference, some noise, a few reflections, and some marginal hardware, and you'll probably end up at 24 or 18Mbits/sec. It's also not very useful running at a PER of 10%, as you will see substantial retransmissions. However, we'll pretend that everything is perfect so I won't complicate the estimates. Your -94dBm signal level will not even work at 1Mbit/sec. I suggest you read the following sections on fade margin and range/speed.
They may not make much sense at this point, but they're as simple as I can make them and still include the calculations.
Also, it might be useful to know that a 6dB increase in power or sensitivity yields a doubling in range; 12dB is 4 times, 18dB is 8 times. If you look at the above table, the difference between 54Mbits/sec and 1Mbit/sec is 21dB. By that rule, you can go about 11 times as far at 1Mbit/sec as at 54Mbits/sec. You can also use this to predict your range: if you can go 100ft at 1Mbit/sec, you'll only be able to go about 9ft at 54Mbits/sec.
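That rule of thumb falls out of free-space path loss; a short Python sketch, using the dB figures above:

```python
def range_multiplier(extra_db: float) -> float:
    """Free-space rule of thumb: every 6 dB of link margin doubles the range."""
    return 2 ** (extra_db / 6)      # equivalently 10 ** (extra_db / 20)

print(round(range_multiplier(6), 1))    # 2.0 -- one doubling
print(round(range_multiplier(18), 1))   # 8.0 -- three doublings
print(round(range_multiplier(21), 1))   # 11.3 -- a 21 dB delta buys about 11x range
```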
As for how fast you need to go to run your cable modem, I would need to know your cable modem speed. I'll assume 6Mbit/sec. You'll need at least a 12Mbit/sec wireless connection to equal this. However, you will have reflections, interference, and the attendant retransmissions, so something somewhat faster would be the required minimum speed. Methinks 18Mbits/sec would be a safe minimum. Looking at the table, your receive signal level will need to be above -82dBm or you will be slower than your cable modem.
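That sizing logic can be sketched in Python, assuming the roughly-half-the-link-rate throughput figure used above (the 6Mbit/sec modem speed is the same assumption):

```python
from typing import Optional

RATES = [54, 48, 36, 24, 18, 12, 11, 9, 6, 5.5, 2, 1]   # 802.11b/g link rates, Mbit/s

def min_link_rate(modem_mbps: float) -> Optional[float]:
    """Slowest link rate whose ~50% real-world throughput still matches the modem."""
    usable = [r for r in RATES if r / 2 >= modem_mbps]
    return min(usable) if usable else None

print(min_link_rate(6))    # 12 -- the bare minimum for a 6 Mbit/s cable modem
```

In practice you'd want a step or two of headroom above that minimum, per the retransmission caveat above.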
Nobody worries about security (and backups) until AFTER they have been hacked or had their computer trashed.
Jeff, thanks a million again for all your help and information. I've learned more in a few minutes from your posts than in many times longer reading Que's Absolute Beginner's Guide to Wi-Fi. I'm saving all of it to a file.
Yes, it's MMDS, not MMDF. MMDF is "Multichannel Memorandum Distribution Facility" which I use for a mailer on my antique SCO Unix box. Sorry about the confusion.
Run some bench tests on the MMDS downconverter in the 2400-2483.5MHz region. Some of them (e.g. Pacific Microwave) have notch filters in this region to help eliminate microwave ovens and other sources of interference.