I found this link trying to solve a problem for a class I am taking. My classmates and I have been more than a little confused. The responses given back in 2002 indicated there were some really intelligent people floating around who know this stuff back and forth. Here is the question:
"a. Let's assume that the smallest possible message is 64 bytes (including the 33-byte overhead). If we use lOBase-T, how long (in meters) is a 64-byte message? While electricity in the cable travels a bit slower than the speed of light, once you include delays in the electrical equipment in transmitting and receiving the signal, the effective speed is only about 40 million meters per second. (Hint: First calculate the number of seconds it would take to transmit the message then calculate the number of meters the signal would travel in that time, and you have the total length of the message.) b. If we use 10 GbE, how long (in meters) is a 64-byte message? c. The answer in part b is the maximum distance any single cable could run from a switch to one computer in a switched Ethernet LAN. How would you overcome the problem implied by this?"
I know that cable length for Ethernet is limited to 100 meters. I asked my local cable guy for assistance and he told me throughput wasn't calculated this way. Can someone please explain this question in a way that makes sense? If I know that 10Base-T runs at 10 Mbps, and I know that a byte is 8 bits, do I just calculate how long it takes to transmit 512 bits and then figure out how far the signal travels in that time? How do I get from bits per second to meters? Sorry, but my head hurts.
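Here is my attempt at following the hint literally, written out as a little script so you can check my arithmetic. I am assuming "how long is the message" means: the time it takes to push all 512 bits onto the wire, multiplied by the 40 million m/s signal speed the question gives. Please tell me if I have the idea wrong.

```python
# My attempt at the hint: transmit time = bits / data rate,
# then length = time * signal speed (40 million m/s, per the question).

message_bits = 64 * 8          # 64 bytes = 512 bits
signal_speed = 40_000_000      # effective speed in meters per second (given)

for name, rate_bps in [("10Base-T", 10_000_000), ("10GbE", 10_000_000_000)]:
    transmit_time = message_bits / rate_bps       # seconds to put the message on the wire
    length_m = transmit_time * signal_speed       # meters the first bit has traveled
    print(f"{name}: {transmit_time:.2e} s -> {length_m:.3f} m")
```

If I did this right, 10Base-T gives 51.2 microseconds and about 2048 meters, while 10GbE gives 51.2 nanoseconds and only about 2 meters, which I guess is what part c is getting at. Does that sound right?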
Thanks in advance.