802.11g @ 6M or 802.11b @ 5.5M

Which would perform better at poor signal levels, say -85dBm?

I can find plenty of sites comparing the two technologies, and OFDM is a better and more robust modulation. Anyone got any real-life examples of the benefits?

Reply to
TheDragon

Which what? Oh, it's in the subject line. Do me a favor and NOT put half your question in the subject line, and the other half in the body. I get easily confused.

OFDM is more resistant to multipath (frequency selective fading) than CCK. OFDM is also slightly more resistant to interference effects than CCK. You'll also get more range out of 6Mbit/sec OFDM because the theoretical receiver sensitivity is about 3dB better than 5.5Mbit/sec CCK. I can supply numbers (tomorrow) if you need them.
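To put rough numbers on that 3dB, here's a back-of-the-envelope sketch. The sensitivity figures and path-loss exponents below are typical assumed values, not measurements from any particular radio:

```python
# Rough range comparison from the ~3 dB receiver-sensitivity advantage of
# 6 Mbit/sec OFDM over 5.5 Mbit/sec CCK. Sensitivity values are assumed
# typical datasheet numbers, not from any specific card.
sens_ofdm_6m = -88.0   # dBm, assumed sensitivity at 6 Mbit/sec OFDM
sens_cck_5m5 = -85.0   # dBm, assumed sensitivity at 5.5 Mbit/sec CCK

def range_ratio(delta_db, path_loss_exponent=2.0):
    """Extra range bought by delta_db of link margin.

    Free space (exponent 2): every 6 dB doubles the range.
    Indoor/cluttered paths have higher exponents and gain less.
    """
    return 10 ** (delta_db / (10 * path_loss_exponent))

delta = sens_cck_5m5 - sens_ofdm_6m   # 3 dB in OFDM's favor
print(f"free space:        {range_ratio(delta):.2f}x range")
print(f"cluttered (n=3.5): {range_ratio(delta, 3.5):.2f}x range")
```

In free space that 3dB buys roughly 40% more range; over a cluttered path the gain is smaller but still real.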

I like to lock the speeds of my point-to-point links to some minimum speed, usually 12Mbits/sec OFDM. The idea is to keep the link from constantly changing speed up and down.

If you want real-life examples, it would be helpful if you would explain what you're trying to accomplish.

Reply to
Jeff Liebermann

Thanks for your response. Yes, a few real-life examples would be great.

I have built a cyber cafe and recently added WiFi access, both inside and outside (higher power, up a pole).

The outside users range from nearby laptops to distant fixed connections using external antennas. I noticed the speed was always changing, often dropping as low as 1Mbit/sec, so I fixed the rate. At first I thought 802.11b was better, figuring the speeds of g weren't really needed, so I fixed it at 5.5Mbit/sec. Then, reading about modulation for WiFi, I discovered that the OFDM used in 802.11g at 6Mbit/sec and up might be better for the weaker users. With the rate fixed, the weaker users don't drag the entire system down to 1Mbit/sec just so they can connect reliably; I would rather a user not be able to connect at all and be forced to move location.

I chose 6Mbit/sec as a trade-off between speed and SNR requirements.

The backhaul ADSL link is only 1Mbit/sec, so high-speed WiFi isn't needed.

Reply to
TheDragon

Do you really want long distance users?

In the presence of interference, the bit error rate climbs, which causes the access point to reduce speed.

Wrong. The advantage of using higher speeds is that they use less air time. That leaves more air time for the slower connections.
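The air-time argument is easy to see with a quick calculation. The frame size is an assumption, and preamble, ACK, and contention overhead are ignored, so real numbers are higher, but the ratios make the point:

```python
# Back-of-the-envelope air time: how long a 1500-byte frame occupies the
# shared channel at various 802.11 data rates. Overhead (preamble, ACKs,
# contention) is ignored, so these are lower bounds.
frame_bytes = 1500  # assumed full-size data frame

for rate_mbps in (1, 5.5, 6, 12, 54):
    airtime_ms = frame_bytes * 8 / (rate_mbps * 1000)  # bits / (kbit per ms)
    print(f"{rate_mbps:>4} Mbit/sec: {airtime_ms:6.2f} ms per frame")
```

One client at 1Mbit/sec burns twelve times the air time per frame of a client at 12Mbit/sec, and everyone else waits while it transmits.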

I use 12Mbits/sec. 6 and 9 are just too slow.

Well, one option that you should consider is to simply disable 802.11b speeds. That sets the minimum speed to 6Mbits/sec. Beacon and management frames are also sent at 6Mbits/sec instead of 1Mbit/sec, yielding more air time (and thruput).
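A rough sketch of the beacon savings. The beacon size and the 102.4 ms interval are assumed common defaults; check your AP's actual settings:

```python
# Hedged estimate of air time consumed by beacons alone, sent at the
# basic rate. Beacon size (~200 bytes) and the 102.4 ms beacon interval
# are assumptions based on common defaults.
beacon_bytes = 200               # assumed typical beacon frame size
beacons_per_sec = 1000 / 102.4   # assumed default beacon interval

def beacon_airtime_pct(rate_mbps):
    """Percent of total air time spent transmitting beacons."""
    per_beacon_s = beacon_bytes * 8 / (rate_mbps * 1e6)
    return per_beacon_s * beacons_per_sec * 100

print(f"beacons at 1 Mbit/sec: {beacon_airtime_pct(1):.2f}% of air time")
print(f"beacons at 6 Mbit/sec: {beacon_airtime_pct(6):.2f}% of air time")
```

The overhead drops by a factor of six, and the same scaling applies to every management frame sent at the basic rate.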

If you can see collisions on your AP, you'll probably find that higher speeds drastically reduce those.

Have you checked your traffic for abusers? Things like massive downloads, running servers, BitTorrent, and such tend to bring the system to a crawl. Good luck.

Reply to
Jeff Liebermann

I have already disabled 802.11b to deny access to these slow devices.

I don't have an option to set a minimum speed, only a fixed speed. I have options for Tx Rate and Default Rate; the choices are default, 1-2, or all. I have left the Default Rate at 1-2, and the Tx Rate is at 6Mbit/sec, forcing everyone to connect at 6 or nothing. I may look at increasing this to 12, at the expense of blocking access to some long-range users.

Reply to
TheDragon
