If I were to set up an ad-hoc network of 3 computers, 2 with wireless G and 1 with wireless B, would the 2 computers with wireless G be able to communicate with each other at 54 Mbps, or would the whole network be slowed down to wireless B speed when the Wireless B computer was present?
I'm not an expert on this, but my understanding is that any 802.11b devices sharing a channel with 802.11g devices in mixed mode will force ALL devices to communicate at 802.11b speeds.
Close, but not exactly. The "b" stations transmit roughly 5 times slower than the "g" stations (11 Mbps vs. 54 Mbps). If any "b" station is constantly streaming data, it will hog the available airtime: for every unit of time spent transmitting "g" data, roughly 5 units are spent transmitting the same amount of "b" data. The "g" device may be sending at 54 Mbps when it gets the chance, but it doesn't get that chance very often. The net effect is to drag the overall throughput down to the range of 11 Mbps. It only takes a few legacy devices to have this effect.
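The airtime-sharing arithmetic above can be sketched with a toy model. This is an illustration of the general principle, not a simulation of 802.11 (it ignores protocol overhead, contention, and protection frames); the function name and the round-robin assumption are mine:

```python
def aggregate_throughput(rates_mbps, frame_mb=1.0):
    """Aggregate throughput (Mbps) when each station sends one
    equal-sized frame per round. A frame's airtime is its size
    divided by the station's link rate, so slow stations consume
    a disproportionate share of the air."""
    total_data = frame_mb * len(rates_mbps)          # data moved per round
    total_time = sum(frame_mb / r for r in rates_mbps)  # airtime per round
    return total_data / total_time

# Two "g" stations alone: every frame goes out at 54 Mbps.
print(aggregate_throughput([54, 54]))      # roughly 54

# Add one streaming "b" station at 11 Mbps: its frames take ~5x
# the airtime, and the aggregate drops sharply even though the
# "g" stations still transmit at 54 Mbps when they get a turn.
print(aggregate_throughput([54, 54, 11]))  # roughly 23
```

If the "b" station sends more than one frame per round (i.e., streams constantly), the aggregate falls even further toward "b"-only speeds.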
Even if the "b" stations are mostly idle, the "g" stations precede every data transmission with a CTS-to-self message sent at a "b"-compatible bitrate to warn the legacy stations to stay off the air (those stations cannot decode "g" signals). These protection frames slow throughput even when the "b" stations aren't doing anything.