Free space loss

In the past I argued that the higher the frequency, the greater the attenuation, but I failed to note I was referring to terrestrial propagation...I'll get to that later.

Let's look at the traditional free space loss formula: Path loss in dB = 36.6 + 20 log(f) + 20 log(d), where f is frequency in MHz and d is distance in miles. (The constant 32.4 you sometimes see goes with distance in kilometers.)

It implies increasing the frequency will increase the path loss (greater attenuation). When you run a plot at 2.4 GHz and one at 5.8 GHz, you'll find there is 7.7 dB more loss at 5.8 GHz (20 log(5800/2400) ≈ 7.7).
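If you want to check the arithmetic yourself, here's a quick Python sketch of the formula above (a minimal check of my own, not anything from the original derivation). The distance term cancels in the difference, so any distance gives the same 7.7 dB:

import math

def fspl_db(freq_mhz, dist_miles):
    # Free space path loss: 36.6 + 20*log10(f in MHz) + 20*log10(d in miles)
    return 36.6 + 20 * math.log10(freq_mhz) + 20 * math.log10(dist_miles)

d = 5.0  # miles; the 2.4-vs-5.8 difference is the same at any distance
print(round(fspl_db(2400, d), 1))                     # ~118.2 dB
print(round(fspl_db(5800, d), 1))                     # ~125.8 dB
print(round(fspl_db(5800, d) - fspl_db(2400, d), 1))  # ~7.7 dB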

If you go backwards in the equation and see how it is derived, you'll find the capture area (antenna aperture) defined as wavelength squared divided by four times Pi: A = λ²/4π. (Strictly, that's the aperture of an isotropic antenna; a real dipole has an extra gain factor, but it's the same factor at both frequencies, so it cancels out of the comparison below.)

For example, let's take two simple half-wave dipoles for 2.4 GHz and 5.8 GHz, roughly 2.45 inches and 1 inch long tip to tip respectively. Using the aperture formula above, the capture areas work out to about 1.92 and 0.33 square inches.

Divide the 2.4 GHz antenna's area by the 5.8 GHz antenna's area and you get about 6 (exactly (5800/2400)² ≈ 5.84), meaning the higher-frequency antenna captures roughly one-sixth the power.

Convert that ratio to dB (10 log 5.84 ≈ 7.7) and you get the same 7.7 dB difference you got in the plots you ran above.
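Here's that aperture calculation as a short Python sketch (again my own check; the only constant is the speed of light converted to inches per second):

import math

C_IN_PER_S = 2.998e8 / 0.0254  # speed of light in inches per second

def capture_area_sq_in(freq_hz):
    # Isotropic capture area: A = wavelength^2 / (4 * pi), in square inches
    wavelength = C_IN_PER_S / freq_hz
    return wavelength ** 2 / (4 * math.pi)

a24 = capture_area_sq_in(2.4e9)              # ~1.92 sq in
a58 = capture_area_sq_in(5.8e9)              # ~0.33 sq in
print(round(a24 / a58, 2))                   # ~5.84, i.e. about 6
print(round(10 * math.log10(a24 / a58), 1))  # ~7.7 dB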

Therefore, there really isn't greater attenuation as you increase the frequency; rather, the antenna is "less sensitive" because its capture area shrinks.

---------------------------------------

Now...on to terrestrial losses.

I placed a 2.4 GHz and a 5.8 GHz transmitter, each with a simple vertical dipole antenna on the output connector, 500 feet up a tower and measured the signal levels from another 200-foot tower nearby, also with simple vertical dipole antennas. This is about as close as you'll get to true free space on the face of the earth. As expected, the 5.8 GHz signal was 7.7 dB lower. I had similar signal differences from several test points with antennas on a forty-foot mast and a clear line of sight.

I ran the tests again at several locations in the county with the transmitters on a forty-foot mast and the receiver antennas on a ten-foot mast. I saw some additional loss from trees, as expected, and a greater loss at 5.8 GHz. The additional loss varied with the terrain and foliage. The conclusion is that there are greater terrestrial losses at higher frequencies.

Eventually I'll get a system set up where I can measure hourly signal levels from several other WISPs' access points miles away over a year-long period. I suspect I'll see the same results my casual testing has shown: path fading is worst for a few weeks in the spring and a few weeks in the fall, and around an hour after sunup, once the sun has warmed the earth and a layer of warm air sits under the cooler morning air, and an hour before sundown, when the air starts to cool off and drift into low spots under the still-warm evening air.

