Good idea. That would be for an R-SMA connector. Methinks making an extension cable for the insipid stock 1/4 wave antenna supplied with the usual PCI wireless card is a waste of money. Indoor omnis have lots of reflection problems, and the very tiny coax usually involved in extensions is rather lossy.
Instead, I suggest building or buying a proper antenna (patch, panel, biquad, whatever), using some decent coax (LMR-240) and terminating with either an R-SMA or something else (N connector) with an adapter to R-SMA.
The problem is that for all the gain the antenna provides, the loss in the coax cable makes it less than useful. For example, this antenna claims 6dBi gain, which is about right for the antenna alone:
The 3m of RG-316 cable contributes -6.6dB of loss, resulting in a net gain of about -1dB. With heavier coax, it would probably be a tolerable antenna, but not with the tiny RG-316 stuff. Bad idea.
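The arithmetic above is just dB bookkeeping: advertised antenna gain minus total feedline loss. A minimal Python sketch using the figures from this example (6 dBi antenna, 6.6 dB of total cable loss; these are illustrative numbers from the post, not datasheet values):

```python
# Net gain of an antenna-plus-cable assembly: advertised antenna gain
# minus total feedline loss. Figures are the ones quoted in this thread.

def net_gain_dbi(antenna_gain_dbi: float, total_cable_loss_db: float) -> float:
    """Gain of the whole assembly, referenced to an isotropic radiator."""
    return antenna_gain_dbi - total_cable_loss_db

print(round(net_gain_dbi(6.0, 6.6), 1))  # -0.6 -> a net loss despite "6 dBi gain"
```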
[Insert generic rant about misleading antenna specifications.]
Does anyone know of a source to buy a linksys antenna extension cable. I'd like to use the cable with the PCI card to move the antenna away from the back of the computer. There is such a mass of wires at that location that it seems that I'd get better connectivity with the antenna several feet away. I've googled and found a few sources but the prices are in the $40 range for 6 or so feet of cable with ends.
Nicely done. 30ft is way too far for tiny coax. At 2.4GHz:

RG-174   (polyethylene)   1.2  dB/meter
RG-316   (teflon)         1.1  dB/meter
RG-58a/u (cheapernet)     1.1  dB/meter
RG-58c/u (polyethylene)   0.90 dB/meter
RG-58/u  (foam)           0.57 dB/meter
RG-6/u   (foam sat)       0.50 dB/meter (75 ohms)
LMR-400  (foam)           0.23 dB/meter
LDF4/50A (foam Heliax)    0.13 dB/meter
LDF5/50A (foam Heliax)    0.08 dB/meter
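For anyone who wants to play with these numbers, a small Python helper: loss for a given run, and the longest run that fits a loss budget. The dB/meter figures are the ones quoted above, not from any datasheet:

```python
# Attenuation at 2.4 GHz, dB per meter, as quoted in the post.
ATTEN_DB_PER_M = {
    "RG-174": 1.2,
    "RG-316": 1.1,
    "RG-58a/u": 1.1,
    "RG-58c/u": 0.90,
    "RG-58/u foam": 0.57,
    "RG-6/u": 0.50,       # 75 ohms
    "LMR-400": 0.23,
    "LDF4/50A": 0.13,
    "LDF5/50A": 0.08,
}

FEET_PER_METER = 3.2808

def run_loss_db(cable: str, meters: float) -> float:
    """Total attenuation of a run of the given cable."""
    return ATTEN_DB_PER_M[cable] * meters

def max_run_meters(cable: str, budget_db: float) -> float:
    """Longest run that stays within a given loss budget."""
    return budget_db / ATTEN_DB_PER_M[cable]

print(round(run_loss_db("LDF5/50A", 30 / FEET_PER_METER), 2))  # 30 ft run: ~0.73 dB
print(round(max_run_meters("RG-174", 3.0), 1))                 # 3 dB gone in only 2.5 m
```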
I've been experimenting with using 75 ohm RG-6/U for antennas. It's fairly easy to build a 1/4 wave matching section and coax adapter. The satellite grade RG-6/u coax is cheap and the connectors are cheap. So far, losses look really low. Maximum mismatch loss going from 75 to 50 ohms is 0.18dB at 1.5:1 VSWR, which is not even worth throwing into the loss calcs.
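The 0.18dB figure falls out of the standard mismatch-loss formula: the reflection coefficient is (S-1)/(S+1) and the lost power is -10*log10(1 - gamma^2). A quick Python check:

```python
import math

def mismatch_loss_db(vswr: float) -> float:
    """Power lost to the reflection at a given VSWR (textbook formula)."""
    gamma = (vswr - 1.0) / (vswr + 1.0)
    return -10.0 * math.log10(1.0 - gamma ** 2)

# Worst case for a 75 ohm line feeding a 50 ohm radio: VSWR = 75/50 = 1.5
print(round(mismatch_loss_db(1.5), 2))  # 0.18 dB -- negligible in any loss budget
```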
Did you see the photo of what's inside my formerly favorite antenna?
It's supposed to have 9dBi of gain. It's basically a full wave loop over a roughly half wave reflector: half a biquad, in effect. Frankly, I'm disappointed, but it does work well.
I refer to it as "garden hose". The Heliax is pricey, but the connectors are outrageous. I have a small collection of recycled connectors for 7/8" Heliax, but they aren't going to last forever. When it was raining last month, one of the FM translators at a mountain top radio site was showing ever increasing VSWR. Nothing horrible, but not very stable. So, I unscrewed the Heliax connector, poured out what I estimate to be about 200ml of water, blew out some more water with my air compressor, poured a handful of rice into the Heliax, reassembled the connector, and put it back in service. I'll be up there tomorrow to extract the rice. If it's green, the copper has oxidized and the Heliax will need replacement. If the rice is just dirty and wet, the Heliax will be fine.
I gave him a similar list, normalized for 3 dB of loss by the number of feet required, for several types of coax. And then basically suggested a suitable maximum length for each type, equating to roughly 1 dB of loss. Hence, RG-174 takes only 5 feet to lose 3 dB of signal, and I wouldn't want to use anything longer than maybe 15-20 inches or so.
For a 30 foot run, LDF5/50A would come in at just under 1 dB.
I haven't priced it, but I bet even one N connector for that stuff is expensive!
(Compared to 802.11 wireless radios, that is. All I've ever used it with was satellite earth station equipment, and for that a simple 1.5 Mbps T1 involves a $30K modem, so we looked at heliax connectors as cheap. It's all relative.)
Most commercial applications want to keep it lower, but in reality, it just doesn't add significantly to the loss for short runs.
Keep in mind though that it adds to the loss per meter, so the effect becomes significant with long runs.
Looks like pure FM to me...
We used to have problems with water leaking through the windows on earth station feed horns. (Now all replaced with a different type to get a narrower beam width.) We carried 2000 watt "hair dryer" heaters with seriously large blower fans. Open up the waveguide, blow air through it for a few hours, repair the leak, and all was fine.
It's amazing how much water can get through a pinhole leak, never mind when a raven rips the whole cover off.
I just found it via google. Oddly, I couldn't find this device myself on the
I took a trip to Best Buy and discovered that the device that would most meet my needs was not the one I linked above but one meant for 1 antenna and the correct connector. (same price) See here:
after looking around I ended up buying this guy:
gives me the extension away from the back of the computer and is a "gain" directional antenna ($12 more than the Linksys device). I did a bunch of testing before I installed it and after, and here is what my statistics are, using my Linksys WLAN monitor:

With the original antenna mounted directly to the back of the computer:
  Noise: -92 dBm    Signal: -74 dBm

With the D-Link antenna:
  Noise: -98/-99 dBm    Signal: -65/-64 dBm

So if I'm interpreting the numbers correctly, it looks like this device has improved my connection: less noise, more signal. The only conflict is that the WLAN monitor "site survey" reports my wireless network at 58%, whereas before it said 61%. I don't know what that means. Percent of what? The "bar" indicator is now floating between 6 and 7, and before it was 5.

When I watch the "noise" numbers, they do drift as high as -37 dBm for a short time but then return to -98/-99, so something is going on nearby. I do have 2 wireless 2.4 GHz phones in the house. Also, I can now see a neighbor's WiFi network when I could not before from this computer. My laptop could always see it, but not this desktop WinMe machine. Too bad Netstumbler won't run on WinMe or I could have gotten better and more fluid information.

Bottom line: more signal, less noise. I'm wondering which of those 2 improvements is more important?
That's almost exactly the same as the Hawkins antenna I was complaining about. However, D-Link did recognize the problem and shortened the coax cable: 6dBi gain for the antenna, 1.5 meters of RG-316 coax at 1.1dB/meter for 1.7dB loss. Net gain is 6 - 1.7 = 4.3dB, which would constitute an improvement. To put things in perspective,
6dB gain is equal to double the range (all other things being equal).
Sounds about right. -74dBm to -64dBm is 10dB of gain which is more than the expected 4.3dB. However, the location of the antenna is what really changed so the 10dB difference isn't all due to the antenna.
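The dB-to-range rule of thumb above assumes free-space (inverse square) propagation, where range scales as 10^(dB/20). A quick sketch:

```python
# Free-space range multiplier for a link-budget improvement in dB:
# power falls as 1/r^2, so range scales as 10^(dB/20).

def range_multiplier(improvement_db: float) -> float:
    return 10.0 ** (improvement_db / 20.0)

print(round(range_multiplier(6.0), 2))   # 2.0  -> "6 dB doubles the range"
print(round(range_multiplier(10.0), 2))  # 3.16 -> the observed 10 dB change
```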
Well, it's a bit messy. The %'s are the percent of the RSSI (receive signal strength) reading and the S/N (signal to noise) ratio readings. Usually, there are two scales presented. The internal 0-255 values are converted to 0-100 for human consumption. When only one value is given, it's often a composite conglomeration of signal strength and S/N ratio resulting in some kind of "signal quality" number. I'm not sure what your unspecified model Linksys PCI wireless card uses.
The S/N ratio or "noise level" numbers come in two flavors. One is the actual S/N ratio calculated as a function of BER (bit error rate). The more errors, the lower the S/N ratio. The "noise level" is just the number of corrupted packets that are decoded. The more interference, the larger the number of corrupted packets and the higher the "noise level". Please don't ask me how to convert from BER to "noise level". Note that the noise does not solely come from interference.
Netstumbler 0.3.30 runs fine on WinME but with a very limited selection of supported cards. Basically, it needs a Hermes chipset as found in the numerous Orinoco mutations. 0.4.0 will not run on WinME.
Methinks noise is more important. Sending data where all the packets have to be resent over and over due to high noise levels is terribly inefficient. What the access point does is reduce the connection speed to obtain a reasonable BER, usually one bit in 10^6 or 10^5. Therefore, what you'll see is the noise level or percentage moving around quite a bit as the access point changes the speed to match local conditions. Getting a stable number is almost impossible, but you can get a feel for whether it's staying around the low side, or climbing into unacceptable territory.
The signal strength will also vary considerably with the connection speed of the moment. Once above a minimum threshold, the actual signal strength matters little. The problem is where's the threshold. My guess(tm) is about 70% for reasonable performance, but that's at best a bad guess. The signal strength will also vary as the access point juggles the connection speed, making the indicator rather useless for aiming antennas. However, if you change the access point connection speed from "auto" to some fixed speed, the signal meter will magically stabilize and can be used for antenna aiming.
List price for 7/8" foam is $6.50/ft. The air dielectric type is about $6.00/ft. I can usually get either for about $4.20/ft (plus shipping).
N connectors are $40/ea. I prefer the 7/16" DIN connectors at $42/ea. I usually burn about $200 in hangers and grounding straps.
Yeah, but I'm just doing short runs (< 25ft). I've used RG-6/u for
802.11 in the past with good effect. AT&T cable (pre-Comcast) wanted to use the CATV wiring in the house to redistribute 802.11, but that never went anywhere. It worked; the losses were horrendous, but they were less than the losses of shooting through the walls.
Also, the end to end losses will be about the same as 50 ohm coax because:
The attenuation per ft of 75 ohm coax is slightly less than the equivalent 50 ohm coax.
A 1/4 wave matching section will take care of the mismatch loss at the access point end while simultaneously acting as an F to R-TNC adapter.
The antenna can be designed for 75 ohms resulting in zero mismatch loss.
There's a slight improvement in bandwidth for 75 ohms over 50 ohm antennas at the band edges.
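The matching-section numbers can be sanity checked in a few lines: the section impedance is the geometric mean of the two impedances being matched, and its physical length is a quarter wavelength scaled by the line's velocity factor. The 2.442 GHz center frequency and the VF of 0.95 (a mostly-air copper tubing line) are my assumptions, not figures from the post:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def transformer_impedance(z1: float, z2: float) -> float:
    """Characteristic impedance of a quarter-wave matching section."""
    return math.sqrt(z1 * z2)

def quarter_wave_mm(freq_hz: float, velocity_factor: float) -> float:
    """Physical quarter-wavelength in the line, in mm (VF is an assumption)."""
    return (C / freq_hz) / 4.0 * velocity_factor * 1000.0

print(round(transformer_impedance(50.0, 75.0), 1))  # 61.2 ohms -> build a ~62 ohm line
print(round(quarter_wave_mm(2.442e9, 0.95), 1))     # ~29.2 mm of physical line
```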
Incidentally, I've been using RG-6/u, RG-11/u, and various aluminum semi-rigid CATV coax cables for just about everything for quite a while. Long ago, I suggested using RG-6/u for tower top amplifiers in a mailing list. Hyperlink coincidentally had an amplifier product that did exactly that. It works.
Naw, it's easy. It's just a full wave loop with reflector. The impedance is adjusted by the height of the loop above the reflector. The capacitive stubs are to tune out the small inductance from the exposed center conductor of the coax cable, and possibly to reduce harmonic resonances, which would contribute to receiver noise. Crude, but effective.
No ravens here. What we have is an argumentative group of certified tower climbers and installers that cannot seem to agree on the right way to waterproof connectors. Of course, my way (Electrical tape over PTFE pipe wrap) is best.
Gone to the "mountain" (1700ft). Besides draining the Heliax, the major projects are resurrecting a crashed weather station and making sure the skunk doesn't move back in under the storage trailer. If time permits, some work on the emergency generator. Radio and wireless isn't all technology.
Yep, sounds right. Buy an $80 wireless radio. Then buy a $20 pigtail. Then buy a $49 external antenna. Then buy,
2 each N connectors      $ 80
30 feet LDF5/50A         $180
------------------------------
                         $260
Or, that 30' feedline costs almost twice as much as everything else, including the antenna... And that's without adding in $200 for the miscellaneous hardware.
Actually, every time I think about this I have to laugh. The local Borough government, maybe 3-4 years ago, hired a guy to set up some wireless links for them. All of it worked well downtown, but they have one building 2-3 miles out of town. That one has cost them thousands. It never worked right, and twice they've found someone who assured them they can fix it. So they get a contract, but in the end the money is spent and the damned thing still won't work.
I don't know exactly what equipment is on it, but the radio is in the building and the antenna is on a tower outside, with maybe as much as 100' of feed line.
I'll grant that throwing money at it could in fact make it work, but they keep hiring people who don't understand the basic problem, so the money gets thrown in the trash.
Sound engineering though.
If you are matching it at each end, there simply is no added loss from SWR, because there is no (increase in) SWR.
That's really cutting a fine edge there Jeff... ;-)
For years and years I had access to all of the double shielded
75 Ohm foam coax I'd ever want, so I used it for everything including ethernet. As long as you know what the significance is, there really aren't that many places where 50 Ohm cable is significant.
(I've also got several lengths of LDF4/50A in the porch, waiting for any day I happen to decide I need an outside antenna.)
Crude Hell, I thought that was the neat part!
We've agreed on your way before. You *are* right. If that's what works best in *this* environment, it certainly will survive what it gets down that way (and my guess is that some of your mountain top locations are indeed Arctic environments).
Well, a 1700 foot hill won't be an Arctic environment! But still is probably quite enjoyable.
Downloaded and tried ver 0.3.30 and it did not see my network card.
On the package of the WMP54GS it reads, "Attention: SpeedBooster Mode Available Under Windows 2000 & XP Only." In a PC Magazine article it referred to Linksys's SpeedBooster as an implementation of Broadcom's "Afterburner" technology. So I have 2 questions about that.

One: When I look at the advanced properties of the "Linksys Wireless-G PCI Adapter with SpeedBooster" on the WinME machine, Afterburner is "Enabled". I wonder if it should be. During the installation of the card and the WLAN monitor software there was no mention of this.

Two: My laptop computer has a Broadcom 802.11g adapter. It also has an "Afterburner" field and it is enabled. So the question... is my laptop taking advantage of the Linksys "SpeedBooster" technology? Also, both have something named "XPress (TM) Technology" enabled. Don't know what that is.

I will set my AP "Transmission Rate" from auto to 6Mb to play around with the new antenna aiming. That's a great tip. And all the information you supply is great. I thank you.
Broadcom chipset on both ends. I don't have data on the current crop of chipsets. However, the much earlier models simply used the error rate as a form of "noise and interference" measurement. Any packets that arrived, but could not be decoded, were considered "noise". Not great, but good enough.
Netstumbler 0.3.30 on WinME is not going to work with a non-Hermes based chipset on your WMP54GS. Sorry. However, if you upgrade to W2K or XP, version 0.4.0 will work with any card that has an NDIS 5.1 driver, which is just about every card with an up to date driver.
Time for a simple decision. Do you want speed or do you want distance? You can't have both. Turning on Afterburner will enable the system to go faster than the 54Mbits/sec connection speed. The box claims 108Mbits/sec. Broadcom currently calls it "125 High Speed Mode", which should elevate the technology to the next level of marketing hype.
I'm sure even faster speeds can be obtained.
Just one problem. For a given modulation method, with a given bandwidth, at a given error rate, speed and range are complementary. At 54Mbits/sec, you can maintain a *RELIABLE* connection up to about 15-20ft. Go any farther and the error rate creeps up sufficiently to cause the access point to slow down the connection speed. A fun test is to set the access point to a fixed 54Mbits/sec connection speed and see how far you can go before traffic just stops. Use some streaming media with a small buffer size for this fun test.
You can see the effects of speed versus distance in the receiver sensitivity numbers. These are from a DI-624 but should be close enough for the WRT54G. PER is "packet error rate", where 10% means that 10% of the packets arriving are trash.

* 54Mbps   OFDM, 10% PER, -68dBm
* 48Mbps   OFDM, 10% PER, -68dBm
* 36Mbps   OFDM, 10% PER, -75dBm
* 24Mbps   OFDM, 10% PER, -79dBm
* 18Mbps   OFDM, 10% PER, -82dBm
* 12Mbps   OFDM, 10% PER, -84dBm
* 11Mbps   CCK,   8% PER, -82dBm
* 9Mbps    OFDM, 10% PER, -87dBm
* 6Mbps    OFDM, 10% PER, -88dBm
* 5.5Mbps  CCK,   8% PER, -85dBm
* 2Mbps    QPSK,  8% PER, -86dBm
* 1Mbps    BPSK,  8% PER, -89dBm

Comparing 54Mbits/sec at -68dBm and 6Mbits/sec at -88dBm yields a 20dB difference. To put that into perspective, 6dB is double the range, 12dB is 4 times the range, and 20dB is 10 times the range. Therefore, if you can go 20ft at 54Mbits/sec, then you can go 200ft at 6Mbits/sec. I don't have numbers for 108Mbits/sec but my guess(tm) is that it will be about 2/3 the range of 54Mbits/sec.
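The range multiplier per speed falls straight out of that sensitivity table: the extra sensitivity margin in dB converts to range at 10^(dB/20), under the same free-space assumption. A sketch using the OFDM rates from the table:

```python
# Receiver sensitivity (dBm) per OFDM rate, from the DI-624 table above.
SENSITIVITY_DBM = {54: -68, 48: -68, 36: -75, 24: -79,
                   18: -82, 12: -84, 9: -87, 6: -88}

def relative_range(rate_mbps: int, ref_mbps: int = 54) -> float:
    """Range at rate_mbps relative to ref_mbps, free-space assumption."""
    margin_db = SENSITIVITY_DBM[ref_mbps] - SENSITIVITY_DBM[rate_mbps]
    return 10.0 ** (margin_db / 20.0)

print(relative_range(6))             # 10.0 -> 20 ft at 54Mbps becomes 200 ft at 6Mbps
print(round(relative_range(24), 2))  # ~3.55x at 24Mbps
```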
This might also be of interest.
To make matters worse, the stupid access points absolutely insist on favoring speed over all else. The slightest increase in S/N ratio will cause the access point to increase the speed. I've watched one (name withheld) literally spew management packets, constantly changing the speed (in the presence of interference). It might make sense if the access point firmware allowed some fine grain control over the speed juggling algorithm, but all we get is a choice of "fixed speed" or "auto", which can mean almost anything.
In my never humble opinion, speed boost is a waste of time and effort except for extremely close (desktop) connections.
"XPress" is the original Broadcom name for the technology. It was later changed to "Afterburner", probably because of trademark issues. Linksys marketing renamed it "SpeedBooster" in order to create "product differentiation" and customer confusion. They're all identical. They are not the same as "Super-G", Prism "Nitro", and AirPlus Xtreme G+.
If you're only sharing a DSL or cable modem connection, 6Mbits/sec might be enough. You'll get about half that in thruput. If your DSL or cable can do faster than 3Mbits/sec, then I would go to a higher speed such as 9 or 12Mbits/sec.
Some other suggestions:
If you only have 802.11g radios that need to connect, set the access point for "802.11g only" and it won't slow down to a crawl every time it hears an 802.11b signal.
I see some of that. I have a really bad habit of working out the numbers before I install the hardware. If it doesn't work on paper, it's not going to work in the field. We have a local university doing a 16 mile 5.8GHz shot through the water/land transition zone. The most optimistic fade margin I can calculate is 13dB, which is not very good. I'm trying to be helpful, but nobody involved wants to listen to the VoD (voice of doom) constantly predicting disaster.

Two weeks ago, the system was finally installed. It took all day to get the 3ft dish at one end aligned. I found an excuse not to get involved in the install. Measured fade margin was 10dB with an attenuator in the line. I'm currently monitoring one end and showing an 80% packet loss and about 2hrs of outage per day. It's not going to work. So, this morning, I get the inevitable phone call demanding that I "make it work". Sigh. I have a funny feeling that your local wireless contractors may be in a similar position.
I'm only matching at one end. The antenna is designed for 75 ohms. The 1/4 wave 62 ohm matching section is only at one end and doubles as an F to R-TNC adapter. However, I screwed up when I calculated the length of my home made copper tubing 62 ohm line. The problem is figuring out from where to measure the length. I may need to build a line stretcher and do this empirically (cut-n-try).
Not really. A basic rule of thumb is that every time I add anything that's resonant to the system (i.e. 1/4 wave matching section), the useable bandwidth becomes narrower. Every time I increase the gain, the system also gets narrower in bandwidth. It may not be an issue with a loop, biquad or cantenna, where the gain is under 10dBi. However, at higher gains, it will become a problem. I'll take every bit of bandwidth I can get. I'll post a comparison when I get the design and testing done.
Yep. I've done 3 LAN installs using 75ohm CATV coax and 10base2 transceivers. However, you must use 50 ohm terminators, not 75 ohm. That's because the xmit detection mechanism on 10base2 (cheapernet) is a DC voltage shift, where the 50 ohm terminators are part of the voltage divider.
I'm always right. (I'm also not into modesty, humility, tact, diplomacy, etc.)
The Peoples Republic of Santa Cruz County (Calif) has a rather mild climate. That's why I like it here. Weatherproofing isn't much of an issue here. It rarely gets below freezing. I'm not sure how well my PTFE wrap method will work in your environment. My guess is my choice of cheapo outer wrap tape will freeze, crack, fall off, or otherwise fail. I would probably have to switch to shrink tube or 3M cold shrink.
Incidentally, the skunk was gone and the Heliax was fine.
Oh boy, over-water shots are *bad news* without a *huge* fade margin.
If that is over tidal water they are going to have some serious outages. If it is not tidal, it may appear to work for a *long* time, and then just slip away to nothing... for weeks or months.
Then both ends are matched.
How do you adjust it... take a file to the end of the stub? :-)
But how much bandwidth do you gain from 50 to 75 Ohms?
Regardless, yes, do post the results when you're done! Sounds interesting.
Most folks don't understand that, and of course with one 75 Ohm term it doesn't work at all, zilch... nada to nothing. So they figure it's because it's the wrong coax.
Put 50 Ohm terms on it, and the only problem is just how long a run you can have, which will be slightly shorter than with 50 Ohm coax.
I noticed that. ;-)
The odd thing about you, is you don't seem to be a real asshole. Now, mind you I worked 34 years with guys just like me, "also not into modesty, humility, tact, diplomacy, etc", who are almost *all* a bunch of total assholes. Some of them were so right wing they had no left arm at all because it atrophied and fell off.
So I appreciate your sense of humor and modesty about lacking modesty.
The electrical tape outer wrap lasts very well here. It would not provide a good enough seal by itself, but as an outer cover it does very well. In fact, trying to take it off at cold temperatures can be a real problem. No flexibility at all! It can be hard to apply too, for that reason, but if done right it lasts.
We don't have skunks. And I prefer to deal with ravens any day.
Yep. I made a nifty graph showing the relationship between tidal heights and signal fade on a 10 mile over the water 420MHz link. Direct correlation for about two months. Well, it wasn't perfect, as the surface conditions had a big effect. Flat water was the worst. I also know of several 10-15 mile 2.4GHz links that are killed when the inversion layer appears. However, the transition zone between water and land is the worst. You have the multipath reflections off the water combined with the diffraction of the inversion layer. I don't think any fade margin will be sufficient for such a path. Instead, I try for spatial diversity, which is a fancy term for two pairs of antennas.
Nope. It's tidal. It's too soon to demonstrate the relationship between the tide height and the deep fades. I wanna let the experts burn out on their ideas before I do something disgusting, like install a diversity switch with multiple antennas at one end.
Nope. I got two ways that work:
I build a telescoping center conductor out of a chunk of copper wire and an outer copper or brass sleeve. There will be a slight impedance bump at the sleeve, but it's not serious. All that's important is the ratio of the O.D. of the center conductor to the I.D. of the outer tubing. Everything is copper pipe or brass tubing. It ends up looking like a fat telescoping whip antenna with connectors on both ends.
Same tubing as above, but no telescoping section. Instead, the F connector end is a female-to-female F connector (barrel) that conveniently slides in and out on the center conductor. There's only about 1/4" of adjustment range, but that's usually sufficient.
My test equipment is really a bad joke. I built a Wheatstone bridge out of 3ea 51 ohm chip resistors. In place of where the 4th resistor goes, I place the unknown impedance. In this case, that's the 50 ohm end of the matching section. Across one diagonal connection, I have my worthless sweep generator. The other diagonal goes to an RF detector, the amplifier in the sweeper, and finally to a scope. It displays VSWR versus frequency. I telescope the matching section until the sweep looks good. Then I try different lengths of 75 ohm coax between the matching section and the 75 ohm antenna or dummy load. When the length of the coax has no effect, I know I have a proper matching section.
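What the bridge is effectively measuring is the reflection coefficient of the unknown against its 50 ohm arms, and the VSWR follows directly. A sketch of the textbook formula (nothing here is specific to the sweeper setup described above):

```python
# VSWR of a load against a 50 ohm reference: gamma = (Z - Z0)/(Z + Z0),
# VSWR = (1 + |gamma|)/(1 - |gamma|).

def vswr(z_load: complex, z0: float = 50.0) -> float:
    gamma = abs((z_load - z0) / (z_load + z0))
    return (1.0 + gamma) / (1.0 - gamma)

print(vswr(50 + 0j))                # 1.0 -- matched: the bridge nulls out
print(round(vswr(75 + 0j), 2))      # 1.5 -- a bare 75 ohm load, no matching section
```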
Good question. I don't know yet because my graphs don't extend out far enough in frequency. The 50 ohm plot for VSWR shows about 1.3:1 at the band edges:
I haven't posted the 75 ohm plot, but it's about 1.25:1 at the band edges. The next plot will cover a wider frequency range so I can conjure better numbers.
I was building matching networks that looked like 50 ohms at DC and 75 ohms starting at about 1MHz. Eventually, I figured out it was unnecessary. 50 ohms was just fine for terminating the 75 ohm line. When the attenuation is sky high, the VSWR has no effect on the signal source, as the attenuation totally destroys the return signal. I could put a 75 ohm VSWR meter on the source end and get a good 1:1 match simply because there is no return signal. However, there are standing waves along the coax, so there can only be two transceivers, one at each end, with no taps or T-connectors along the coax line.
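The "attenuation hides the mismatch" effect is easy to quantify: the reflection makes a round trip through the line, so the return loss seen at the source improves by twice the line loss. A sketch:

```python
# Return loss seen looking into a lossy line: the reflected wave travels
# down and back, so the one-way line loss counts twice. With enough
# attenuation, even a dead short looks matched from the source end.

def vswr_from_return_loss(rl_db: float) -> float:
    gamma = 10.0 ** (-rl_db / 20.0)
    return (1.0 + gamma) / (1.0 - gamma)

def vswr_at_source(load_return_loss_db: float, line_loss_db: float) -> float:
    return vswr_from_return_loss(load_return_loss_db + 2.0 * line_loss_db)

# A short circuit (0 dB return loss) behind 15 dB of coax loss:
print(round(vswr_at_source(0.0, 15.0), 2))  # 1.07 -- reads as a decent match
```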
My current record for RG-6/u is about 950ft:
Ummm... thanks, I think. I try to be nice, but don't often succeed. The problem is that customers expect competent computer and engineering types to be technically arrogant, socially retarded, awkward, bungling nerds or geeks. There was even a recent TV show underscoring the stereotype.
I've tried to be less grouchy, more diplomatic, and socially clueful, and had my customers wonder if I was sick or something. So I gave up and decided to just be myself. That extends to answering questions in newsgroups and mailing lists. If anyone wants unnatural diplomacy, tact, or politeness from me, let them pay my exorbitant consulting rates.
Thanks. Good to know. Several companies offer "cold weather electrical tape". I know the Scotch 66 I normally use doesn't like cold weather and I have to switch to Scotch 88 instead.
You don't know what you're missing. You can order a sample here:
What type of "directional antenna"? A simple antenna (i.e. biquad, patch, coffee can) has a -3dB beamwidth of about 45-60 degrees. That's anything but critical. What are you moving 1/2 an inch that produces such directionality?
The constant changes are from reflections.
One wavelength at 2.4GHz is about 12.5cm, or about 5 inches. The typical short 2.4GHz antenna is a coaxial antenna, where 1/4 wavelength is exposed, 1/4 wavelength of the shield is folded back over the outer jacket, and the rest is just the mechanical hinge. These are typically about 9cm long (from the hinge) and are found on most PCI card radios (because they're small and will fit).
The longer antennas add an additional 1/4 wave decoupling section at the bottom to reduce VSWR which adds about 3cm. These are about 15cm long (from the hinge). The extra length also moves the radiating part of the antenna well above the top of the router box for improved antenna pattern (and clearance). These are much better than the shorter antennas.
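The lengths quoted above follow from the free-space wavelength; a quick check (2.44 GHz is an assumed mid-band frequency, not from the post):

```python
# Free-space wavelength in cm; c expressed in cm*GHz units.

def wavelength_cm(freq_ghz: float) -> float:
    return 29.9792458 / freq_ghz

lam = wavelength_cm(2.44)
print(round(lam, 1))          # ~12.3 cm full wave
print(round(lam / 4, 1))      # ~3.1 cm per quarter-wave element
print(round(3 * lam / 4, 1))  # ~9.2 cm -- close to the short coaxial antenna's length
```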
There is also a bunch of higher gain replacement antennas which use a vertical collinear arrangement. See:
Not exactly. It will pick the antenna on which the last successfully received packet arrived. It will stay with that antenna until either it times out and starts scanning, or some algorithm based upon successful (or corrupted) packet reception kicks in. There may be a substantial time delay before it starts scanning again. I hot wired my WRT54G to light up a red/green LED when the diversity switch does its thing. It's not very interesting because it rarely switches.
Officially, there is no interaction between the two antennas. They are totally independent. However, the real answer is that there is some, but it's not enough to be useful for creating a directional antenna.
Adding the 2nd antenna will not improve the signal. Since only one antenna is active at a time, there is no added gain from the 2nd antenna, so the signal strength will not improve. What does improve is resistance to the effects of multipath and reflections. If that's an issue, use two antennas. If you're trying for a distance record, only one antenna is necessary.
You would not believe how bad the VSWR of those antennas really is. Under ideal conditions, the antennas by themselves have fairly good VSWR. However, plant them anywhere near metal (PC case), or the usual blob of wires in back of a machine, and the VSWR becomes truly disgusting. Other than pounding pins into the antenna or coax, there's not much you can do to make it any worse.
Today I finally got around to testing the directional antenna by fixing my WRT54GS transmission rate to 6Mbps. I found that the antenna position is very critical if I want the maximum signal. Half an inch one way or the other resulted in a loss of 2-3dB or so on average. The reported signal strength is constantly changing, but there seems to be one exact place it wants to point.

But now I have another (related) question. I just noticed that the antenna that shipped with the WMP54GS (PCI card) and an older WMP11 card is physically longer than the antennas that are used on the WRT54GS and my old BEFW11S4. The PCI versions are 7 3/4 inches long (base to tip) and the antennas for the router/APs are 5 5/8 inches long. So the question is... why is that? Does the Linksys router/AP use diversity switching? (Which to me means that the router picks the best antenna to use at any given moment.) Or do the 2 antennas work together to appear to the router as one antenna, with the length of each somehow summed? (Pardon my weak antenna knowledge and how I am asking the question.)

I happened to have ONE adapter that allowed me to connect ONE of the longer antennas to the router (one long, one short). My signal strength "seems" to be a bit better. Signal fluctuates but on average is 2 dB better. Does this sound logical to you? If so, I'll shop for another adapter so I can use both of my leftover longer antennas on the router. Or am I messing up the SWR of the transmitter part of the radio and deceiving myself?