An Ofcom-commissioned report into Wi-Fi performance concludes that it's baby-listeners and TV-senders that are mucking with the signal, not to mention the "Free Public Wi-Fi" virus, without which we'd all be connecting faster.
Ofcom's remit is to ensure efficient use of radio spectrum, including the unlicensed 2.4GHz band used by Wi-Fi. To that end, it commissioned specialist consultancy Mass to examine how effectively the band is being exploited.
What Mass discovered is that while Wi-Fi users blame nearby networks for slowing down their connectivity, in reality the problem is people watching retransmitted TV in the bedroom while listening to their offspring sleeping, and there's not a lot the regulator can do about it.
Outside central London that is: in the middle of The Smoke there really are too many networks, with resends, beacons and housekeeping filling 90 per cent of the data frames sent over Wi-Fi. This leaves only 10 per cent for users' data. In fact, the study found that operating overheads for wireless Ethernet were much higher than anticipated, except in Bournemouth for some reason: down on the south coast 44 per cent of frames contain user data.
Yawn. Beacon frames are typically about 50 bytes. Management frames vary, but are typically about 100 bytes. ACKs are about 50 bytes. Meanwhile, data frames can be up to 2346 bytes. Yeah, one would expect a larger *NUMBER* of the smaller frames to be flying around. The report would be less alarmist if they had calculated the air time used by the various packet types. The only item I find significant is the number and ratio of retransmissions to data packets. In some areas, retransmissions were found to be 80% of all data packets. That sucks, but is not unusual in the presence of interference, reflections, and other networks.
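To make the point concrete, here's a back-of-the-envelope sketch using the rough frame sizes above and an invented frame-count mix (the counts are mine, for illustration only): counting frames and counting bytes (a crude proxy for air time) tell very different stories.

```python
# Hypothetical capture mix: frame sizes follow the rough figures above,
# frame counts are invented for illustration.
FRAME_BYTES = {"beacon": 50, "mgmt": 100, "ack": 50, "data": 1500}
counts = {"beacon": 500, "mgmt": 300, "ack": 150, "data": 50}

total_frames = sum(counts.values())
total_bytes = sum(counts[k] * FRAME_BYTES[k] for k in counts)

for kind in counts:
    frame_share = counts[kind] / total_frames
    byte_share = counts[kind] * FRAME_BYTES[kind] / total_bytes  # air-time proxy
    print(f"{kind:6s}  {frame_share:5.1%} of frames  {byte_share:5.1%} of bytes")
```

With this mix, data frames are only 5% of the frames flying around, yet carry over half of the bytes on the air. A report that only counts frames makes the overhead look far worse than it is.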
I find it somewhat amusing that they discovered similar problems in less densely populated areas, but didn't bother sniffing in sparsely populated areas, which might have led them to conclude that such traffic patterns can appear anywhere. A clue could have been obtained from the typical inefficiency of mesh networks:
Note that the average delivery probability for large 1500-byte payload packets is about 30% at 11 Mbits/sec, and about 50% at 1 Mbit/sec. Lots of retransmissions flying around that network. That was done in 2004, when one would expect less interference.
I guess they would be amused to sniff a municipal wireless network (e.g. Mountain View, CA) during high-usage hours, where perhaps 70% of the packets (not the air time) are broadcasts and management frames, and where perhaps 95% of the payload packets are ARP requests and replies. If one counted air time instead of packets, my guess is it would be about 50% broadcasts and management.
I always look for the punch line of such research. In the "conclusions" section: "In the long term this could be reduced by enforcing coexistence criteria via the standardization committees." Yep. Additional regulation and coordination will be added until performance improves.
The real problem is that Wi-Fi is sharing the channel with an increasing number of analog RF devices that never release any air time. Wi-Fi works because DCF (Distributed Coordination Function) will allow transmission only if it thinks the channel is clear. If the CSMA/CA mechanism detects garbage, there's no transmission. However, wireless TV extenders and video cameras have no such mechanism. They just spew RF 24x7, even when there's nothing to watch. In the presence of one of these (on the same channels), Wi-Fi has no air time available and therefore doesn't work at all.
For example, a local restaurant installed a non-802.11 wireless security camera to monitor their trash can. The camera also managed to trash Wi-Fi at two coffee shop hotspots. I had no clue as to the cause until someone stole the camera off the wall, and the interference miraculously ended. (No, it wasn't me that stole the camera.)
The moral is rather simple, but seems to be lost on the report authors. It's something the commercial land mobile radio and public safety people have known for years. You can't mix digital and analog radios on the same channels, period.
Also, I note that the authors didn't seem to find it necessary to actually drag out a spectrum analyzer and look at what they were documenting.
Another clue is that in 7.6.3 they incorrectly identified the "Free Public Wi-Fi" problem, which they claim is a virus. It's not. It's an MS bug which I still see on XP pre-SP2 boxes:
The report is about 150 pages long. I'll skim it more closely later.
Yep, if you count packets. However, since the air time of a payload frame is something like 10 times the air time of a management frame or beacon, I would have been more interested in the air time occupied than in simply counting variable-size packets. Also, it would have been interesting to get a tabulation of the raw data speeds of these assorted packets. The beacons and management frames are all sent at either 1 Mbit/sec for 802.11b or 6 Mbits/sec for 802.11g, even if the data rates are 11 and 54 Mbits/sec respectively. That makes for a rather complicated air time analysis.
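The rate difference matters more than the size difference. Here's a rough per-frame air-time calculation (my own approximation: PHY preamble plus payload bits at the data rate, ignoring DIFS/SIFS and backoff; the ~192 µs 802.11b long preamble and ~20 µs 802.11g OFDM preamble are approximate figures):

```python
def airtime_us(payload_bytes, rate_mbps, preamble_us):
    """Rough on-air duration of one frame in microseconds: PHY preamble
    plus payload bits at the data rate. Ignores DIFS/SIFS and backoff."""
    return preamble_us + payload_bytes * 8 / rate_mbps

beacon = airtime_us(50, 1, 192)     # 50-byte beacon at 1 Mbit/sec (802.11b)
data_b = airtime_us(1500, 11, 192)  # 1500-byte data at 11 Mbits/sec
data_g = airtime_us(1500, 54, 20)   # 1500-byte data at 54 Mbits/sec

print(f"beacon @  1 Mbit/s: {beacon:7.1f} us")
print(f"1500B  @ 11 Mbit/s: {data_b:7.1f} us")
print(f"1500B  @ 54 Mbit/s: {data_g:7.1f} us")
```

The counterintuitive result: a 50-byte beacon at 1 Mbit/sec occupies more than twice the air time of a 1500-byte data frame at 54 Mbits/sec. Counting packets, or even bytes, completely misses this.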
Well, the consultancy that did the study could have done better, especially if they had bothered to rent a spectrum analyzer. What bothers me is why Ofcom bothered to commission the report. More specifically, what problem are they trying to solve? My conspiracy theory of the moment is that Ofcom wants "more control", "expanded powers" or something like that. A report that finds disorder and suggests additional regulation and control is exactly what they would want.
My guess(tm) is that whatever happens, there will be zero mention or consideration of separating digital and analog modulation methods onto different parts of the 2.4GHz band. That would eliminate most of the interference problem, but would do nothing for Ofcom.
" There is anecdotal information and discussion that WiFi systems are highly congested in cities and are failing to work in certain locations. We commissioned the design of a novel monitoring system based on smartphone devices with GPS and WiFi capabilities. These were then carried around city centres, including into a number of buildings such as railway stations, where they monitored key beacon signals transmitted from WiFi networks and measured performance parameters. A test network was also built in the laboratory to try to replicate the measurements made."
It seems to be part of their effort to find different methods of obtaining information about spectrum usage. They have also been using equipment installed in a company's vehicle fleet to obtain information about the 10MHz to 6GHz bands.
Nice of them to find an illegal transmitter. "Further analysis showed that the source is probably a business in the centre of the village which has installed a wireless CCTV system, probably imported into the UK without the purchaser being aware that it is not licensed for use in the UK"
The less busy the 802.11 channel, the greater the percentage (whether reckoned by packet count, byte count or duration) of channel utilization by beacons.
Assuming that you don't consider idle time.
In other words: if no one's using the channel, then you won't see anything but beacons.
That said ... it's certainly possible for beacons to become a non-negligible percentage of actual total channel capacity. Consider that APs may be configured to beacon at 1Mbps. Consider that beacons may be as large as 300 bytes. Consider that there may be 10 APs at -80dBm or hotter on the same channel. Consider that (with our lightweight APs, at any rate), each AP may beacon 8 different BSSIDs.
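Running the numbers on that worst case (my arithmetic, using the figures above plus the common ~102.4 ms default beacon interval and the 802.11b long preamble):

```python
aps = 10
bssids_per_ap = 8
beacon_bytes = 300
beacon_rate_mbps = 1
preamble_us = 192             # 802.11b long preamble
beacons_per_sec = 1 / 0.1024  # common default: one beacon every 102.4 ms

one_beacon_us = preamble_us + beacon_bytes * 8 / beacon_rate_mbps
total_us = aps * bssids_per_ap * beacons_per_sec * one_beacon_us
print(f"beacon air time: {total_us / 1e6:.0%} of one second of channel time")
```

Each beacon takes roughly 2.6 ms on the air, and 80 BSSIDs beaconing ten times a second works out to over 200 per cent of the channel. In that worst case the beacons alone would oversubscribe the medium; in practice they collide and defer rather than literally exceeding capacity, but the overhead is clearly non-negligible.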