How about multiple cheap routers spread out with the *same* SSID to allow roaming? That would allow the clients to pick the one with the strongest signal at the time, and should naturally load balance by location. The main difference would be the need to run wire to each of the router locations, whereas with a single unit you only have to run the one wire.
I can't seem to find the reference I used that goes over the pros and cons of the two methods right now, but I know I found it through Google.
Rule of thumb for access point loading (per AP):
- 100 casual email and web surfers, or
- 10 business-type users, or
- 1 file-sharing or BitTorrent user
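As a back-of-the-envelope sketch of that rule of thumb (assuming the loads scale roughly linearly, so one BitTorrent user costs about as much as 10 business users or 100 casual surfers; the weights below are my own hypothetical reading of those figures, not an official formula):

```python
import math

# Assumed relative load per user type, where 1.0 = one fully loaded AP.
# Derived from the 100 / 10 / 1 per-AP rule of thumb above.
LOAD = {"casual": 1 / 100, "business": 1 / 10, "bittorrent": 1.0}

def access_points_needed(casual=0, business=0, bittorrent=0):
    """Return a rough number of APs needed for a given user mix."""
    total = (casual * LOAD["casual"]
             + business * LOAD["business"]
             + bittorrent * LOAD["bittorrent"])
    return max(1, math.ceil(total))

# 200 casual surfers plus 5 business users is about 2.5 AP-loads:
print(access_points_needed(casual=200, business=5))  # -> 3
```

This ignores geography entirely; in practice you also need enough APs for coverage, not just capacity.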
Is this an economy project where you want to repeat all the mistakes of other skools, or would it be better if you simply copied or adapted a system that actually works?
In general, commodity wireless access points and routers do not scale well into large systems. The wireless part of the puzzle is the least of your problems. Pretend there was no wireless and that everyone was wired into the system with CAT5. You still need traffic management, backhaul load balancing, authentication, security, SNMP, VPN termination, traffic monitoring, intrusion detection, and abuse control. In many cases, the administration wants the wireless LAN admins to play policeman for student servers, illegal file sharing, and non-academic downloading. Once you have all that organized, then you can hang the wireless boxes in place of the CAT5 cable.

It's possible that your skool may already have all that in place, but my experience is that it's often a different group or faction that wants to implement wireless and lacks the control or cooperation of computing services. One college network was planned, funded, and the parts purchased, but it stalled for over a year because of insurance issues with rooftop antennas and shared conduit for the backhaul. As always, politics is the 8th layer of the ISO protocol stack.
UCSC Residential wireless net (200 Cisco nodes).
Try searching Google for "college wireless" and see what others have done. Find one that's local and looks functional and ask some questions. Make friends, etc.
I was at a server farm run by an ISP, and one of the more clueful techy types fired off a BitTorrent download of a series of Linux CD-ROM images he needed. With one desktop, he managed to saturate all 45 Mbits of dedicated bandwidth of a fractional OC-3 connection. When the phones started ringing with complaints about sluggish performance, we knew we had screwed up. Looking at the traffic graphs showed 100% utilization for about 10 minutes, which was enough to stop the entire ISP in its tracks. It seems the bandwidth management system only applied to customer connections, not the ISP's local machines (since then fixed).
My guess is that you were using all 512 Kbit/sec continuously and possibly in both directions. It's not just BitTorrent, but any peer-to-peer software that turns your workstation into a server. You can configure all of these to limit the outgoing bandwidth, but the default configuration is often "use all the bandwidth you can". Note that there are many implementations of the BitTorrent protocol.
Downloading is usually not the problem. It's your outgoing bandwidth, where BitTorrent turns your machine into a server. Wireless is a symmetrical system; DSL and cable modems are asymmetrical, and such systems use little outgoing bandwidth. Think of it in terms of the ISP. The incoming bandwidth is used by customers for downloading. The ISP's outgoing bandwidth is used by its server farm for web hosting and servers. Usually, it's an equal balance between the two. However, when you turn your machine into a server, you're adding to the outgoing load.
Wireless is also unique in that the airtime (the time you spend transmitting) is shared among everyone within range of the wireless access point. That means every millisecond of your airtime is subtracted from someone else's airtime. It's one big shared system (per channel) with only one radio transmitting at a time. It doesn't care if you're transmitting or the access point is transmitting (i.e. simplex operation); both use airtime.
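The airtime arithmetic is easy to sketch. In this toy model (my own illustration, ignoring protocol overhead, ACKs, and retries), each station's transmit time comes out of the same shared budget, so a station stuck at a low data rate eats a disproportionate share:

```python
# Toy airtime model: one shared channel, one radio transmitting at a time.
# Overhead, ACKs, and retransmissions are ignored for simplicity.

def airtime_fraction(bytes_per_sec, phy_mbps):
    """Fraction of each second a station spends transmitting to move
    `bytes_per_sec` of payload at its PHY rate of `phy_mbps` Mbit/s."""
    return (bytes_per_sec * 8) / (phy_mbps * 1_000_000)

# A nearby laptop at 54 Mbit/s and a distant one at 1 Mbit/s,
# each moving the same 100 kB/sec of payload:
near = airtime_fraction(100_000, 54)   # about 1.5% of the airtime
far  = airtime_fraction(100_000, 1)    # 80% of the airtime
print(f"near: {near:.3f}, far: {far:.3f}")
```

The point: the distant station moving the *same* amount of data hogs most of the channel, which is why one heavy or poorly connected user degrades everyone's throughput.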
Bad assumption, and possibly a badly implemented system. What seems to be lacking (not sure) at your wireless ISP is some form of bandwidth management to control your use of the bandwidth and limit your ability to saturate the system. In this case, the bandwidth management needs to control your outgoing traffic, which must be done at your wireless CPE router or bridge. It makes no sense to just drop packets at the ISP router. The idea is to prevent your radio from transmitting too much or too often. Your ISP may already have some bandwidth management in place. The problem is that BitTorrent can trick some bandwidth managers by opening a large number of simultaneous streams.
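To make "control your outgoing traffic" concrete: the usual building block is a token bucket, which caps sustained output at a fill rate no matter how many simultaneous streams BitTorrent opens, because all streams share one bucket. A minimal sketch (my own illustration, not any particular vendor's implementation):

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper: caps sustained output at `rate_bps`
    while allowing short bursts up to `burst_bytes`. A CPE router would
    apply this to the radio's transmit queue; since every stream draws
    from the same bucket, opening many TCP connections buys nothing."""

    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps / 8          # refill rate in bytes/sec
        self.capacity = burst_bytes
        self.tokens = burst_bytes         # start with a full bucket
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        """True if the packet may be sent now; else queue or drop it."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False

bucket = TokenBucket(rate_bps=512_000, burst_bytes=16_000)
print(bucket.allow(1500))   # early packets pass from the initial burst
```

A stream-counting or per-connection limiter is exactly what BitTorrent defeats; an aggregate bucket like this is not fooled by extra connections.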
Some rhetorical questions:
Does your ISP allow servers? Many do not. Check the terms of service.
Does your ISP have a data transfer size limit? XX GB per day/week/whatever.
Is there a clear definition of abuse in terms of traffic and allowable time to utilize the full bandwidth of the system? This is tricky and is almost a must for WISP service.
Any idea of your ISP's bandwidth loading factor? That's the number of kbits/sec sold per T1, or whatever they use as a backhaul. 10:1 is good; 20:1 is about the limit of sanity. For example, if all the users were 512/512 kbit/sec and the backhaul was a single T1, a 10:1 ratio would allow about 30 users per T1: 10 * 1.5 Mbit/sec / 512 kbit/sec ≈ 15 / 0.5 = 30.
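The same arithmetic in code form, using the figures above:

```python
def users_per_backhaul(backhaul_bps, user_bps, oversub_ratio):
    """How many subscribers a backhaul supports at a given
    oversubscription ratio (e.g. 10 means 10:1)."""
    return int(oversub_ratio * backhaul_bps / user_bps)

# One T1 (taken as 1.5 Mbit/s, as in the text), 512 kbit/s subscribers,
# 10:1 oversubscription. The text rounds 512 kbit to 0.5 Mbit to get 30;
# the exact figure is 29.
print(users_per_backhaul(1_500_000, 512_000, 10))  # -> 29
```

At 20:1 the same T1 would carry about 58 such users, which is where the system starts falling apart whenever a few of them run servers.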
The bottom line is that WISP service requires some restraint and cooperation on your part to work. While it is technically possible for a single user to hog almost all the available bandwidth, it is not considered a good idea to do so.
Not meaning to change the thread direction here, but I read the above with interest.
Recently I was running Azureus (BitTorrent downloading software) for 1.5 days solid, and finally my ISP contacted me and asked me to turn it off, as it was dragging the system down. The system is a wireless one which I have been using for a few years now. My access speed is capped at 512 kbit/s up and down, and the upload was set to its minimum. How can this have such a huge effect on the system? It shouldn't be any worse than downloading large files continuously, should it? I can see how it would use up all the resources on a smaller system with limited bandwidth, but a large commercial system should be able to handle it, shouldn't it?
Jeff Liebermann wrote in news: firstname.lastname@example.org:
I must say that was a great post: technical enough that I actually read the entire thing, and at the same time not overly so. I just thought it was well explained, in a way that even people who don't know much more than the power button can kind of get the concept.