Wireless Internet service antenna, radiation

Hello,

I am considering getting Internet service through our local wireless provider. They apparently use either a parabolic antenna or sometimes a Yagi type of antenna. What kind of radiation do these antennas emit, and in what directions, with this type of service? I am not sure which receiver they use with these antennas. I have been trying to find more information and so far have come up empty. I am mainly concerned about this for health reasons. We would probably be mounting this antenna on the side of the house or on the roof, and I would like to determine the best place for it so there's not a lot of radiation being reflected into the house.

Thank you very much for all feedback!

-- Chris

Reply to
szilagyic

Hi Chris,

There is probably more radiation leaking from your microwave oven than you will find coming from any wireless ISP. Your problem is more likely to be getting enough of the signal, not too much of it.

For your end of the link (transmitting), if your provider gives you a dish or a Yagi, it is purposely there to keep power levels down while keeping an efficient link up.

73's Richard Clark, KB7QHC
Reply to
Richard Clark

Richard Clark hath wroth:

Ummmm... shall we do the math?

Microwave ovens are required to be below 5 mW/sq-cm at a distance of 5 cm, per the Food and Drug Administration/Center for Devices and Radiological Health (FDA/CDRH) performance requirements in Title 21, CFR, Part 1030.10. Ugh.

I'll assume the typical 50 mW wireless access point with the usual 2 dBi rubber ducky antenna. It's easy to calculate if you assume that the radiation pattern from the rubber ducky is a sphere and you ignore near-field effects. The surface area of the sphere is:

4 * Pi * radius^2 = 4 * 3.14 * (5 cm)^2 = 314 sq-cm

The 50 mW of RF is spread equally over that surface, so the power density is:

50 mW / 314 sq-cm = 0.16 mW/sq-cm

which is MUCH less than the 5 mW/sq-cm limit for used microwave ovens, or the 1 mW/sq-cm required for new microwave ovens.

The actual power density is slightly higher because the pattern is really a torus and NOT a sphere, but it's not going to change very much. My guess(tm) is double but I'm too lazy to grind the numbers exactly. Anyway, at 5cm test distance, you're safer with a wi-fi access point than with a microwave oven.
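If anyone wants to grind the numbers themselves, here's a quick Python sketch of the same estimate. The 50 mW, 2 dBi, and 5 cm figures are the assumptions from above; the x2 torus correction is only the guess mentioned, not a computed value:

import math

# Assumptions from the post: 50 mW access point, evaluated at 5 cm,
# radiation pattern approximated as an isotropic sphere.
tx_power_mw = 50.0
distance_cm = 5.0

sphere_area = 4 * math.pi * distance_cm ** 2        # ~314 sq-cm
density = tx_power_mw / sphere_area                 # mW/sq-cm

print(f"isotropic power density: {density:.2f} mW/sq-cm")    # ~0.16
print(f"toroidal-pattern guess:  {2 * density:.2f} mW/sq-cm")
print(f"fraction of 5 mW/sq-cm oven limit: {density / 5:.1%}")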

Reply to
Jeff Liebermann

Nice analysis! One point: the wireless signal is constant, but the household microwave signal is semi-discrete in time.

If the microwave is in use one hour per day [in my experience, 1 hour per day is far more than most people use for the odd cup of tea, soup, etc.], the equivalent energy exposures of the two sources of radiation (using your numbers) are:

Household microwave oven: 1 hour * 5 mW/sq-cm = 3600 sec * 5 mW/sq-cm = 18,000 mW-sec/sq-cm.

Wireless transmitter: 24 hr * 0.16 mW/sq-cm = 24 * 3600 sec/hr * 0.16 mW/sq-cm = 13,824 mW-sec/sq-cm.

The total energy exposure from the wireless is therefore about 0.77 times the microwave exposure on an integrated basis. This is admittedly a quick estimate, since almost no one sits directly next to either source in practice. Still, it is not obvious that the wireless exposure is trivial compared to the microwave.
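The same comparison as a short Python sketch (the exposure times here are assumptions, not measurements):

# Daily integrated exposure, using the figures from this thread.
oven_density = 5.0       # mW/sq-cm at 5 cm (the leakage limit, worst case)
wifi_density = 0.16      # mW/sq-cm at 5 cm (from the earlier estimate)

oven_seconds = 1 * 3600      # assume 1 hour of oven use per day
wifi_seconds = 24 * 3600     # assume the AP transmits all day

oven_dose = oven_density * oven_seconds   # 18,000 mW-sec/sq-cm
wifi_dose = wifi_density * wifi_seconds   # ~13,824 mW-sec/sq-cm

print(f"oven: {oven_dose:,.0f}  wifi: {wifi_dose:,.0f}  "
      f"ratio: {wifi_dose / oven_dose:.2f}")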

Q
Reply to
Quaoar

Ah, but don't forget that most users who have wireless access points also have a wireless client computer that is transmitting the same type of signal... and they sit within easy reach of those computers, often for long hours.

Reply to
Dave

It may be on all the time, but the radio only transmits for the moments it has to, no? You would have to be running a LOT of data through it to get constant transmission, if I understand correctly. So imagine: "click > burst > wait a long time > burst > wait > burst, burst." I think if you could time it, it would not add up to much time on the air.

And then a client adapter will typically be transmitting a lot less than the AP, as most Internet usage is asymmetrical.
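To put a rough number on that burstiness: the airtime duty cycle is roughly the traffic you actually push divided by the rate the radio sends it at. A Python sketch, with made-up traffic volumes and an assumed 802.11g rate:

# Rough airtime estimate: time on air = bytes sent / PHY rate.
# All figures are illustrative assumptions, not measurements.
phy_rate_bps = 24e6              # assumed effective 802.11g rate

downlink_bytes_per_day = 200e6   # what a web/mail user might pull
uplink_bytes_per_day = 20e6      # uploads: mostly requests and ACKs

def duty_cycle(bytes_per_day, rate_bps):
    seconds_on_air = bytes_per_day * 8 / rate_bps
    return seconds_on_air / 86400    # fraction of the whole day

print(f"AP (downlink):   {duty_cycle(downlink_bytes_per_day, phy_rate_bps):.4%}")
print(f"client (uplink): {duty_cycle(uplink_bytes_per_day, phy_rate_bps):.4%}")

Averaged over a whole day the fraction comes out tiny; during active use it will be higher.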

Finally, in the case mentioned, they are putting a Yagi or dish (very directional antennas) on the side of the house. It isn't even near anybody. You would practically have to get up on a ladder and stick your face in front of the antenna to catch any rays, and then it will only be a burst while your PC requests a page. And even then it will be weak, like a cell phone.

As far as the signal from the ISP goes, well, that's all around you whether you have an antenna or not. City folks are certainly bathed in a constant barrage of microwave and other radio signals. Good thing they are so weak.

Steve

Reply to
seaweedsteve

Quaoar hath wroth:

You have it backwards. When belching RF, the microwave oven power envelope looks like a half-wave rectified waveform at 120 Hz (twice the power line frequency) at "full power". The average power is about 70% of the peak power. However, you're correct that most people don't run a microwave oven continuously all day long. (Exception: microwave plastic injection molding pre-heaters and other industrial microwave ovens.) I think (guess) that FDA/CDRH 1030.10 specifies RMS (heating) or average power, not peak power.

802.11g has an even lower average duty cycle. FCC 15.247 specs are written to prevent any one spread spectrum wireless device from hogging all the air time. That means it not only doesn't sit on one frequency, but also turns off the power for some time to allow other devices on the same RF channel to function. I don't recall the exact numbers and am too burned out tonite to dig through the specs. However, I did find a reference:

which tries to analyze the typical transmit duty cycle for 802.11g. The author's guess is about 10% for typical traffic, which seems a reasonable typical value. Like the microwave oven, most users are not belching wireless data continuously. (Exception: streaming wireless video.)

OK, I'll use that. However, note that the 5 mW/sq-cm is probably average power.

Nope. You forgot to apply the 10% transmit duty cycle. That reduces it to about 1,382 mW-sec/sq-cm. However, you're also assuming that this user spends 24 hours in front of the wireless access point. That's possible for confirmed programmers, hackers, and fanatical gamers, but methinks the average user will see a much shorter exposure time. If we assume a workplace model, I would guess 8 hours of exposure, which reduces the dose to about 461 mW-sec/sq-cm, or roughly 1/40 that of the microwave oven.

Similarly, the 10% average duty cycle varies by the type of user. The fanatical file-sharing addict will probably approach 100% duty cycle, while the lightweight mail and web surfer will be close to 10%.
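Folding those two corrections into the earlier comparison (a Python sketch; the 10% duty cycle and the 8-hour day are the assumed values above):

wifi_density = 0.16      # mW/sq-cm at 5 cm, from the earlier estimate
duty_cycle = 0.10        # assumed average transmit duty cycle
hours_exposed = 8        # assumed workplace model, not 24 h/day

wifi_dose = wifi_density * duty_cycle * hours_exposed * 3600
oven_dose = 5.0 * 1 * 3600    # 1 hour/day at the 5 mW/sq-cm limit

print(f"wifi dose: {wifi_dose:.0f} mW-sec/sq-cm")             # ~461
print(f"fraction of oven dose: {wifi_dose / oven_dose:.3f}")  # ~0.026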

The problem with this type of analysis is that the various assumptions made in order to generate a single "typical" value have such a wide range of potential errors that the resulting calculated values are almost worthless. The main problem is that the typical wireless user may not also be a typical microwave oven user. In addition, we've ignored the distance from the RF emitter, which has a huge effect on exposure (inverse square law).
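For a sense of how much distance matters: under a free-space inverse-square model the power density falls off as 1/distance^2 from the 5 cm reference point. A sketch (the distances are arbitrary):

ref_density = 0.16    # mW/sq-cm at the 5 cm reference distance
ref_cm = 5.0

# Inverse square law: density scales as (ref_cm / d)^2.
for d_cm in (5, 50, 100, 300):
    density = ref_density * (ref_cm / d_cm) ** 2
    print(f"{d_cm:>4} cm: {density:.6f} mW/sq-cm")

At ordinary across-the-room distances the density is three to four orders of magnitude below the 5 cm figure.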

I suspect that I might be the worst case user. My Verizon XV6700 cell phone has an 802.11b radio inside. I often leave it on doing the WiFiFoFum data collection. Even at 10% or less duty cycle, the proximity of the RF emitter on my belt dramatically increases my exposure.

The correct way to do this would be some manner of dosimeter, similar to an ionizing radiation dosimeter. It would take into account duty cycle, signal strength, distance from the emitter, and so on, and simply accumulate the total RF power exposure. I could probably design and build one, but I don't think that selling to the paranoid market is going to be accepted as a winning business model.
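The bookkeeping side of such a dosimeter is trivial; the hard part would be the RF front end. A Python sketch of the accumulator, with made-up one-second readings:

# Software side of a hypothetical RF dosimeter: integrate measured
# power density over time into a cumulative dose.
class RfDosimeter:
    def __init__(self):
        self.dose = 0.0    # accumulated mW-sec/sq-cm

    def sample(self, density_mw_sqcm, interval_sec):
        """Add one measured power-density reading to the running dose."""
        self.dose += density_mw_sqcm * interval_sec

meter = RfDosimeter()
for reading in (0.0, 0.16, 0.16, 0.0):   # made-up 1-second samples
    meter.sample(reading, 1.0)
print(f"accumulated dose: {meter.dose:.2f} mW-sec/sq-cm")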

Reply to
Jeff Liebermann

I challenge your 10% duty cycle on wireless. My Belkin router is constantly transmitting, if I can believe the wireless activity LED. No matter, this is not a personal challenge, just a search for reality.

Q
Reply to
Quaoar

Quaoar hath wroth:

See:

The author measured peak and average power, using the ratio to determine the average duty cycle. Flooding the pipe with lots of traffic (i.e. streaming video) might result in perhaps 80-90% transmission time, depending on whether it has to wait for acknowledgements (TCP) or just spews data (UDP). However, typical mixed traffic seems to have a very low duty cycle, which is where I conjured my 10%. Duz this help, or do you want more detail?
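That measurement reduces to a single ratio. A sketch with made-up scope readings:

# Duty cycle from peak vs. average power: for an on/off (bursty)
# transmitter, average power = peak power * fraction of time on air.
peak_mw = 50.0       # power while actually transmitting
average_mw = 5.0     # time-averaged power reading (assumed)

duty = average_mw / peak_mw
print(f"estimated transmit duty cycle: {duty:.0%}")   # 10%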

Also, the LED is NOT a direct indication that the device is transmitting. The transmission times are so short that the LED would barely be visible if directly connected to the T/R switch; you would only see a faint flicker. In order to make the flashing visible, the designers implemented a pulse stretcher function that extends the time the LED is turned on, so that you can actually see the light.
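A pulse stretcher is basically a retriggerable one-shot: each brief transmit burst restarts a fixed hold time for the LED. A Python sketch of the behavior (the timings are illustrative):

# Simulate an LED pulse stretcher: each burst retriggers a one-shot
# that holds the LED on for hold_ms, so sub-millisecond transmit
# bursts become visible flashes. Timings are illustrative.
def stretch(burst_times_ms, hold_ms=100):
    """Return (on, off) LED intervals for the given burst start times."""
    intervals = []
    for t in sorted(burst_times_ms):
        if intervals and t <= intervals[-1][1]:
            start, _ = intervals[-1]
            intervals[-1] = (start, t + hold_ms)   # retrigger: extend
        else:
            intervals.append((t, t + hold_ms))
    return intervals

# Three brief bursts collapse into two ~100 ms visible flashes.
print(stretch([0.0, 30.0, 500.0]))   # [(0.0, 130.0), (500.0, 600.0)]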

I would be happy to duplicate the tests and post the resultant oscilloscope pictures and calculations. It's actually a very simple setup and test. However, I'm burned out and have a few things to do tonite. Give me a few days as I'm planning to take some time off to catch up on all my broken promises and chores.

Reply to
Jeff Liebermann
