Ok, let's start with the receiver only. If you have a receiver with about -85dBm sensitivity (number selected arbitrarily), and you have a long length of coax cable with 6dB of loss, your receiver sensitivity at the antenna end of this coax will be -79dBm. Your range will be half of what it would be if the receiver were located next to the antenna with no coax loss.
Now, if we install a perfect receive amplifier at the antenna, with a perfect 0dB NF (noise figure) and perhaps 15dB of gain (also selected arbitrarily), your receiver sensitivity will be back to the previous -85dBm and your range will be twice what it was with the coax alone. No matter what amplifier you add, the receive sensitivity can never be better than the basic sensitivity of the receiver. In effect, all the amplifier can do is compensate for the coax cable loss.
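The arithmetic above can be sketched in a few lines of Python. The function name and the ideal-amp clamp are my own simplification; a real calculation would cascade noise figures with the Friis formula, but the numbers here are the arbitrary ones from the text:

```python
def effective_sensitivity(rx_dbm, coax_loss_db, amp_gain_db=0.0):
    """Receiver sensitivity referred to the antenna end of the coax.

    Simplified model: coax loss degrades (raises) the effective
    sensitivity, an ideal 0dB-NF amp offsets it, and the result is
    clamped at the bare receiver sensitivity -- an amp can never
    do better than the receiver itself.
    """
    return max(rx_dbm, rx_dbm + coax_loss_db - amp_gain_db)

print(effective_sensitivity(-85, 6))      # -79 dBm with coax alone
print(effective_sensitivity(-85, 6, 15))  # -85 dBm with the amp at the antenna
```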
However, there's no such thing as a free lunch or free gain. In order to ensure that the receiver sensitivity is determined by the added RF amplifier, the amplifier gain has to be at least 10dB or more. 12 to 15dB is fairly typical for rx amplifiers. However, that also reduces the dynamic range of the receiver front end by: 15dB gain - 6dB coax loss = 9dB. The bottom of the dynamic range is the receiver noise floor, and that's not going to change much with an added amplifier. However, the top end is not that great on most commodity receivers. Making it 9dB worse is going to make the receiver much more susceptible to interference problems, overload from strong signals, and possibly blocking from adjacent services.
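The headroom arithmetic is trivial, but worth writing down (values are the arbitrary ones from the text; variable names are mine):

```python
amp_gain_db = 15.0    # typical rx amplifier gain
coax_loss_db = 6.0    # loss the amp is compensating for
# Net gain ahead of the receiver = overload headroom lost at the top end
headroom_lost_db = amp_gain_db - coax_loss_db
print(headroom_lost_db)  # 9.0 dB less dynamic range
```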
The right design for a receiver amplifier is to have only a little more gain than the losses. Losses include everything between the amplifier and the receiver front end: diversity switches, pigtails, coax adapters, the coax cable, and whatever I forgot. Add about 1dB for the NF of the amplifier. Most of the loss is in the coax cable, so the rest can just be estimated. So, for 6dB of coax, the rx amplifier should have no more than about 8-10dB of gain. Any more doesn't improve the sensitivity and just reduces the dynamic range.
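As a sketch of that rule of thumb (the function name and the 2dB margin are my assumptions, following the estimates above):

```python
def recommended_rx_amp_gain(total_loss_db, amp_nf_db=1.0, margin_db=2.0):
    # Gain should just cover the losses plus the amp's own NF and a
    # small margin -- anything beyond that only eats dynamic range.
    return total_loss_db + amp_nf_db + margin_db

print(recommended_rx_amp_gain(6.0))  # 9.0 dB, inside the 8-10dB guideline
```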
If you have a very short length of coax cable between the receiver and the antenna, an rx amplifier is a total waste of effort and is seriously detrimental to the dynamic range.
It's just as bad for the tx RF amplifier, but that can wait until after dinner. Stay tuned. (Quiz on this tomorrow).
On Fri, 30 Jun 2006 00:13:58 +0000 (UTC), JayJay wrote in :
A good receive amp (pre-amp) might help if the radio doesn't have enough sensitivity on its own, but that's rarely a problem in my experience. In general, an amp can add noise, but not remove it -- thus if S/N is too low (rather than just too weak a signal), amplifying it won't help.
I've not been able to find any good reviews of such products that demonstrate any real benefit on receiving. For example, the review at
mostly looks at the output side as measured by a Wi-Fi PC Card, and even those results were inconsistent. While the throughput tests showed some modest benefit at maximum range, that probably wasn't due to the receive amplification.
This is part of why I think a better antenna is much more likely to be helpful than a signal booster. It's also likely to be less expensive, and won't necessarily be less convenient; e.g., Linksys HGA7S, 7 dBi screw-on replacement for 2 dBi "rubber duck" with about 2-3x the range.
OK, on to the transmit power amplifier. I've previously demonstrated that the receiver sensitivity is NOT going to improve with the addition of an RX amplifier. At best, it will eliminate the effects of coax cable losses.
However, the transmitter power amplifier will certainly increase the signal level. This is commonly known as an alligator, which is an animal with a big mouth and small ears. The xmit amplified access point can be heard over a much larger area than it can hear the replies from the clients. Unless the client radios have a similar power amplifier, the system becomes asymmetrical, with more range in one direction than the other.
For example, the typical wireless access point delivers +15dBm to the antenna. Add-on power amplifiers output from 250mW (+24dBm) to 1 watt (+30dBm). Assuming 1 watt output, that's a gain of 15dB. Since range doubles for every 6dB of signal increase, this 15dB gain is good for: 10^(15/20) = 5.6 times the range of the unamplified xmitter. If plugged into an omni antenna, that's 5.6^2 = 31.4 times the coverage area where the transmitter can be heard, but it can't hear any of the laptops and low powered clients. In effect, this makes an amplified xmitter no better than a jammer.
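The range and coverage multipliers follow directly from the 6dB-per-doubling rule; a quick sketch (my function names, free-space assumption):

```python
def range_multiplier(gain_db):
    # Range doubles every 6 dB in free space: factor = 10^(dB/20)
    return 10 ** (gain_db / 20)

def area_multiplier(gain_db):
    # Omni coverage area scales as range squared: factor = 10^(dB/10)
    return range_multiplier(gain_db) ** 2

print(round(range_multiplier(15), 1))  # 5.6x the range
print(round(area_multiplier(15), 1))   # 31.6x the area
```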
There are also timing problems. The power amplifier has an input RF detector that senses when the access point transmits, and switches the power amplifier from receive to xmit (and back again). It takes a finite amount of time to detect and switch, during which time, the access point is simply not functioning. If it doesn't mangle the inter-symbol interference, it will trash the preamble. Truncating the preamble is only a problem with some diversity systems, which use the preamble to measure the S/N ratio.
There's also the power amp drive level problem. Only a few power amplifiers have AGC (automatic gain control), which makes them largely insensitive to input drive level. Having exactly the right drive, in the linear region of the power amplifier, is mandatory or there will be envelope (amplitude) distortion. The AGC takes care of adjusting the level, but those without AGC must have the exact specified length of lossy coax cable between the access point and the amplifier, or they simply won't work. Fortunately, most power amps have AGC these daze.
A power amplifier improves the signal in one direction only. An antenna improves the signal in both directions. Also, 15dB of xmit power amplifier gain costs $150 to $250. The same 15dB of antenna gain costs about $50.
Bottom line: always put your money into the antenna.
The overall link is limited by the laptop WiFi card. Poor receive sensitivity and low TX power. Upping the AP's TX power won't help the overall link, only the outbound direction. Adding a pre-amp to the AP's receiver input won't improve its noise figure significantly and could cause degraded intercept point, wrecking its performance. The single exception to the "no pre-amp" rule is when the pre-amp is mounted right at the antenna driving a long coax to the AP.
But a better AP antenna will increase effective radiated power outbound and improve receive sensitivity inbound.
Midway? No way. Both the rx and tx amplifiers work best at the antenna. If the rx amp is placed midspan, it cannot compensate for the loss of the coax cable between the antenna and the amplifier. When located at the antenna, all the coax loss is compensated for by the amplifier, not just the piece between the amp and the receiver.
Where did you get the idea midspan was best? The amp goes at the antenna. I just re-read my rants and I never even mentioned midspan installations.
What I've seen sometimes is putting the amplifier at the radio and then running a long length of lossy coax to the antenna. That's the worst possible location. The RF output of the tx power amplifier is largely lost in the coax cable losses. The receive amplifier, located at the wrong end of the same coax cable, can't compensate for the coax loss. Minimal benefit in xmit and none in receive. Usually, someone justifies this location on the basis of "too much weight topside", "lightning protection", or some such rot.