10base-T & POTS in same Cat-5 cable?

Yes, the point is they did *not* recommend CAT3 in the 1980's.

Of course. But "telephone lines" are CAT1 grade cable, by definition. And most certainly 10baseT was *not* designed to use telephone cable. It was designed specifically to use the new CAT3 *data network* cable.

T1/DSX-1 equipment, on the other hand, was designed to use "telephone lines"... However, the only "telephone" grade cable it is normally put on is outside plant cable. Inside plant will be on ABAM, not CAT3.

Reply to
Floyd L. Davidson

Actually, it is up to 120 V RMS. Which is about 340 V P-P.

Typical ring generators actually produce 90 to 105 V RMS.

Reply to
Floyd L. Davidson

The parameter that is modulated or encoded is the only parameter which suffers interference.

Hence if you have an FM system, changes in the frequency will be "noise", and changes in the amplitude will not have an effect. But if you are encoding (as is typical on a wire line circuit) the amplitude of the voltage, then its frequency is not significant but any undesired change in amplitude is "noise".

(Note that I am using a rather broad definition of "noise", which is why I've quoted it. Technically noise is an external influence that is not inherent to the channel, and undesired internal change is "distortion". The difference is that distortion is a quantifiable parameter of the channel and with enough bandwidth it can be corrected, which cannot be done with noise.)

That is because there are frequency sensitive components in the transmission path that reduce the *voltage* of the noise.

Essentially, voltage noise can be filtered out *if* it has some other parameter that allows you to selectively reduce its level while not affecting the desired signal level.

And the same applies in systems which encode or modulate some parameter other than voltage. For example in an FM system the voltage differences are commonly filtered out by limiting amplifiers, which removes most of the frequency interference if the voltage levels are sufficiently distinct! Same as we are discussing with 10baseT, except "frequency" and "voltage" are reversed as to which is signal and noise and which is the "other" parameter that can be filtered.

See above... same effects, same differences.

Isolation for *what*? Breakdown, where arcing occurs is one thing, but blocking the receiver is another. The isolation rating has no relationship to the dynamic range of the receiver input. And regardless of the dynamic range, it simply makes *no difference* what the frequency of the voltage is at the input to the receiver.

Ring voltage is commonly 90-105 *RMS*. It is allowed to be as high as 120 VAC RMS.

Current causes induction. The field induced results in a *voltage* being detected at the receiver (otherwise, it doesn't exist!).

That is why 10baseT, and similar protocols, are designed for roughly 100 ohm cable, rather than say 2000 or 20000 ohm cable.

Which is the reason it uses twisted pair, which provides high common mode rejection of induced noise.

Inside a shared sheath that is properly wired, yes, and that is *exactly* what the specifications are intended to provide. The point here was originally stated as what the effect will be if there are kinks or other damage to the cable, or miswired connectors that split a pair.

When you talk about using a facility that has greater than 60 dB of common mode rejection, and use those functional parameters to suggest that a directly connected burst of 250-300 V P-P ring voltage at 20 Hz therefore won't be a problem, that is not logically valid.

The 10baseT receiver can handle 20 Hz voltage when it is first reduced more than 60 dB by common mode rejection and then an additional 20-40 dB by the high pass nature of the transformers used. The resulting voltage reaching the receiver is significantly lower than the desired signal. But if the "more than 60 dB" from common mode rejection is not there, that voltage is serious competition for the desired signal, regardless of what frequency it is.
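The attenuation chain described above can be sketched as quick back-of-the-envelope arithmetic. This is only an illustration of the dB figures under discussion in the thread (60 dB common mode rejection, 20-40 dB transformer high-pass), not measured values:

```python
# Sketch: chain the attenuation figures mentioned above and compare the
# residual ring voltage against the 2.8 V peak 10BASE-T signal.

def db_to_ratio(db):
    """Convert a voltage attenuation in dB to a linear voltage ratio."""
    return 10 ** (-db / 20)

ring_pp = 300.0      # ring burst, volts peak-to-peak
signal_pp = 5.6      # 10BASE-T maximum signal, 2.8 V peak -> 5.6 V P-P

# Healthy facility: >60 dB common mode rejection, then 20-40 dB more from
# the transformer's high-pass response (use the low end, 20 dB).
residual_ok = ring_pp * db_to_ratio(60) * db_to_ratio(20)

# Impaired facility (e.g. split pair): the 60 dB of CMR is gone.
residual_bad = ring_pp * db_to_ratio(20)

print(f"healthy:  {residual_ok * 1000:.0f} mV P-P vs {signal_pp} V signal")
print(f"impaired: {residual_bad:.0f} V P-P vs {signal_pp} V signal")
```

With both mechanisms intact the residual is around 30 mV, far below the signal; lose the common mode rejection and the residual is an order of magnitude *above* the signal, which is the whole point being argued.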

Reply to
Floyd L. Davidson

And not a word about harmonics...

Everything from switching transients to power supply noise.

Then you'll be very surprised.

Don't bet the farm on that...

I'd say that a 300 volt P-P signal is in fact "an enormously powerful signal", compared to the desired signal, wouldn't you?

So, you can't cite any spec to support what you've said.

Off hand I don't know what the dynamic range of the receiver input is, but the desired signal is a maximum of 2.8 V P-P, so I can't imagine that the dynamic range is required to be much more than twice that. However, it should also be clear that *any voltage* greater than 1.4 V at the receiver input is going to absolutely interfere with the ability of the receiver to detect a valid crossover in the desired signal.

Obviously a 300 V P-P burst of noise has to be greatly attenuated before it reaches the receiver, or it will cause errors. The 60+ dB of common mode rejection resulting from properly balanced UTP transmission facilities is the major protection against ring current. An impaired facility is not immune to interference from 20 Hz ringing voltage from telephone equipment just because it is not in the same frequency range as the baseband for 10baseT.

Wanna bet the farm on that?

The ring voltage is also superimposed on the "-48 VDC" battery, which can actually be anything from 24 VDC to as much as 70-80 VDC. So add that to the 300 V P-P signal. Obviously the *normal* range of voltage can commonly approach 200 V absolute, and typically spikes occur that go *much* higher.

....

Nice drawing. Let me show you how to accomplish that:

This circuit is of course so commonly used in telephone signaling equipment that everyone has seen it a million times. It appears to be very simple, and yet making it work properly is not at all simple. (And no I'm not going to go into details.)

The "off-normal" contacts short circuit the telset. That has two advantages, one of which is to remove the extremely annoying pulse interference from the receiver. But the other is to remove the DC resistance of the telset from the loop to improve the pulse characteristics.

Reply to
Floyd L. Davidson

Exactly. Your original statement was "Every other bit pattern would fall between those extremes.", which is not correct. It will be one or the other of those extremes, period. No in between.

Under normal circumstances, that is correct. We are talking about what happens when there are impairments, though. The idea that 20 Hz ring voltage will not interfere just because it is a different frequency is simply not true. If it gets to the 10baseT receiver it makes *no difference* what the frequency is... only the voltage level counts.

With normally functional equipment, virtually no 20 Hz voltage gets to the receiver. With kinks, split pairs, and other cable impairments, it *can* get to the receiver, and when it does the fact that it is 20 Hz simply has no significance at all.
Reply to
Floyd L. Davidson

Who said *anything* about it referencing the number of pairs?

Just as I said above, the extra pairs are intended to be used for voice pairs. We've been discussing the use of RJ-45 connectors, which have 8 pins for 4 pairs, and the above comment of mine simply refers to that configuration, and has *nothing* to do with what you are responding to. Regardless, most of your response is incorrect anyway...

What has any of that got to do with anything?

The 25 pair cable used for inside plant is *not* CAT3 (and does not meet CAT3 specs in several respects). 50 pair cable is fairly rare, and is *not* "also often used". The 25 pair cables used to wire channel banks are of course *only* used for the analog Voice Frequency side. ABAM is used on the digital side.

Anyone who uses 25 pair telephone cable for 10baseT data, or even for T1/DS1 data, should be on the layoff list. T1/DS1, if it spans more than 5 racks in the same row or to a different row, is supposed to be individually shielded pairs (ABAM). Can you imagine the effect of wiring an office with CAT3 instead? (There is a fundamental difference between a CO and a customer location, and the sheer volume of circuits is what makes it significant. A few runs of CAT3 is one thing, and thousands of them is entirely different.)

CAT3 dates to 1991. Sounds like someone was carried away with the emerging standards, and not paying attention to either the required specs or the price tag.

Reply to
Floyd L. Davidson

Exactly. And you can perhaps remember that it didn't say that CAT3 was the typical standard prior to about 1990. And for that matter, it might even have mentioned what the ISDN acronym stands for? "It Still Does Nothing"

Unfortunately the telecom industry as a whole simply ignored ISDN, mostly because as a unit the industry was simply ignorant of data networking and data communications. The typical response to questions about implementation of ISDN was "Who's going to pay for it? There simply is no market."

Of course for anyone who had half a thimbleful of exposure to computer networking at the time, that was the sound of pure idiocy! And it was *exceedingly difficult* to deal with that attitude throughout the industry while watching the market they claimed did not exist go to others.

The telecom industry did not respond to the need for data communications; instead the modem industry went from 1200 bps modems to 2400 bps, then to V.32, V.34, and finally V.90, and reaped millions, while the telephone companies all told themselves that nobody needed or would buy a data line faster than 1200 bps!

Not that it was equally bad everywhere, but there was no momentum even where it was tried (Europe, for example), and it was almost universally as bad as it gets virtually everywhere in the US.

Sure. It will work at reduced capability. But if you want to hit the limits (on things like distance and BER numbers), you can't do that. (I used 75 ohm coax for ethernet for years, because I had access to lots of it, and did not have any needs that stressed the specs. I certainly would not recommend that to anyone with critical requirements though!)

Europe, and particularly in your part of Europe, has virtually always had somewhat higher standards for local loops than those used in the US. But of course all of us have seen poor implementations... Someday I'll describe the telephone system I saw, in a guy's damp basement, in Eagle Alaska. ;-)

But that has nothing to do with the "essential spectral components"!

All that is required are two sine waves, one at 5 MHz and one at 10 MHz, and the switching transients produced by keying from one to the other. That is essentially a 180 degree phase change at the start of a bit, so it could be looked at as phase modulation too. (I'm not sure what the modulation products would look like; does 5 MHz and 10 MHz with harmonics sound right??? :-)
*That* is the point which is essential to the previous discussion, not what the transmitted signal looks like.
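The 5 MHz / 10 MHz claim above is easy to check numerically. The sketch below Manchester-encodes 10 Mb/s data (as 10BASE-T does) and measures the DFT component at each frequency; the sample rate and encoding polarity are illustrative assumptions, not taken from the spec:

```python
# Illustrative check: Manchester-coded 10 Mb/s data concentrates energy at
# 10 MHz (repeated bits) and 5 MHz (alternating bits).
import cmath
import math

BIT_RATE = 10e6
SAMPLES_PER_BIT = 20
FS = BIT_RATE * SAMPLES_PER_BIT   # 200 MHz sample rate for the simulation

def manchester(bits):
    """Manchester encode: a '1' is low->high within the bit cell, '0' high->low."""
    half = SAMPLES_PER_BIT // 2
    out = []
    for b in bits:
        first, second = (-1.0, 1.0) if b else (1.0, -1.0)
        out += [first] * half + [second] * half
    return out

def tone_magnitude(signal, freq):
    """Magnitude of the DFT component of `signal` at `freq` (crude single-bin DFT)."""
    return abs(sum(x * cmath.exp(-2j * math.pi * freq * i / FS)
                   for i, x in enumerate(signal)))

all_ones = manchester([1] * 64)        # repeated bits -> square wave at 10 MHz
alternating = manchester([1, 0] * 32)  # alternating bits -> square wave at 5 MHz

print(tone_magnitude(all_ones, 10e6) > tone_magnitude(all_ones, 5e6))
print(tone_magnitude(alternating, 5e6) > tone_magnitude(alternating, 10e6))
```

Repeated bits produce a square wave at the bit rate (10 MHz fundamental); alternating bits produce one at half the bit rate (5 MHz), matching the two "sine waves plus keying transients" picture.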
Reply to
Floyd L. Davidson

Where'd the 120V come from? Phone ring generators are nominally 90V, as you then say in the last line.

Reply to
James Knott

Who was asking about damaged cable? The original question was about running phone and ethernet over the same cable.

Reply to
James Knott

The specifications.

Reply to
Floyd L. Davidson

I stand by my original (1st responder) reply :-)

Reply to
Rick Merrill

10BASE-T contains a minimum electrical *isolation* requirement of 1500 V rms, per IEC 60950. However, this only means that the isolation transformers (present on all 10BASE-T signal lines) must not break down under this stress. It does NOT mean that 10BASE-T devices must survive 1500 V presented differentially across a given pair.

There is no specification for the maximum differential voltage that must be sustainable across a 10BASE-T receiver input, although clearly:

-It must be able to sustain (and decode) a maximum-amplitude transmitted signal, which is 2.8 V peak. (IEEE 802.3 section 14.3.1.2.1.)

-It must be able to reject sinusoidal signals with frequencies under 2 MHz and amplitude up to 6.2 V peak-to-peak. (IEEE 802.3 section 14.3.1.3.2(b).)

As to tolerance of traditional telephony signals inadvertently presented to a 10BASE-T receiver, the standard notes the following:

-Battery voltage is generally 56 Vdc applied through 400 ohms.

-"Although 10BASE-T equipment is not required to survive [much less operate during] such wiring hazards without damage, application of any of the above voltages shall not result in any safety hazard." (IEEE 802.3 section 14.2.7.4.)

That is, putting a ring signal across a 10BASE-T receiver may cause the receiver to be permanently destroyed, but it should not be a safety hazard, i.e., cause personal injury.

Most 10BASE-T receivers put a pair of back-to-back zener diodes across the input, which clamp voltages far in excess of the expected signal levels. However, if the source impedance of the high input voltage is low enough (i.e., there is enough *energy* available), the zener diodes will be destroyed by the clamping current, followed by the destruction of most of the rest of the input circuitry. Puffs of smoke and faint smells of burning carbon resistors may follow.

--
Rich Seifert               Networks and Communications Consulting
21885 Bear Creek Way       (408) 395-5700
Los Gatos, CA 95033        (408) 228-0803 FAX

Send replies to: usenet at richseifert dot com

Reply to
Rich Seifert

I think this statement of mine is particularly interesting in light of the actual specification that Rich has so kindly contributed.

None of it addresses that point, and I'm not sure that James, or perhaps others, yet understand the significance of it.

Regardless, here are comments as to how the 10baseT specifications relate to telcom specifications and general practice.

That fits *precisely* what I expected, as stated originally. It appears they gave it 4 dB of head room, and I suggested that it wouldn't be much over 3 dB.

"Generally" it would be 52-54 VDC, and "generally" would not exceed 56 VDC (a 48 volt plant on boost charge). However, I've seen subscriber line carrier systems that used 75 volts! And of course that odd circumstance has to be designed for.

Note that the 175V is peak with respect to ground, and what David Lesher and I referred to was P to P. They are saying P-P is as much as 350 V, which is even higher than we stated. (In fact, the maximum is 120 V RMS, which is 170 V peak, or 340 V P-P. But typically ring voltage is actually no more than 105 V RMS, or 149 V peak and 297 V P-P.) They are citing the maximum possible, and we cited the maximum commonly to be seen.
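The RMS/peak/peak-to-peak conversions used throughout this thread are just factors of sqrt(2). A small sketch of the arithmetic:

```python
# Convert a sinusoidal ring voltage between RMS, peak, and peak-to-peak.
import math

def rms_to_peak(v_rms):
    return v_rms * math.sqrt(2)

def rms_to_pp(v_rms):
    return 2 * rms_to_peak(v_rms)

# The ring voltages discussed in the thread:
for v in (90, 105, 120):
    print(f"{v} V RMS -> {rms_to_peak(v):.0f} V peak, {rms_to_pp(v):.0f} V P-P")
```

120 V RMS works out to about 170 V peak and 340 V P-P, and the typical 105 V RMS to about 297 V P-P, matching the figures above.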

And under any circumstance the receiver is unlikely to function while that voltage is present, even if it doesn't destroy the device (which it may in fact do).

I doubt that most telephone lines would have a low enough distributed resistance to allow that with interrupted ring current, though it might well be possible if the line has a very short loop (in the next room or the same closet with a PBX, or in the CO equipment room). But if 60 Hz power line voltage was the source, that would probably happen very quickly.

Thanks for posting the 802.3 specs, Rich! I gave a half-hearted attempt at searching google, but of course there are too many hits to sort through and I never had the time to look at enough of them to see if any had the interesting part.

Reply to
Floyd L. Davidson

The two are (necessarily) the same.

Current is correct... to the degree that it translates to voltage! E=IR. Which is to say that current, with the right impedance, will produce a voltage that does in fact interfere.
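The E = IR point above can be made concrete with made-up illustrative numbers: the same induced current develops a very different voltage depending on the termination impedance, which is why the cable impedance matters:

```python
# Ohm's law sketch: an induced current only matters to the receiver once it
# develops a voltage across the input impedance (E = I * R).
# The current and impedance values below are assumptions for illustration.

def induced_voltage(current_a, impedance_ohms):
    return current_a * impedance_ohms

# 10 mA of induced current into different terminations:
print(induced_voltage(0.010, 100))     # ~100-ohm UTP termination
print(induced_voltage(0.010, 20000))   # hypothetical 20 kohm line
```

Into 100 ohms that is 1 V; into 20 kohms it would be 200 V, which is the reason 10baseT was designed around roughly 100 ohm cable, as noted earlier.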

That just isn't true. The receiver is a voltage sensitive device. End of story.

Arcing? From a ringer? I've never heard of such.

Ohhh! You are thinking of DC buzzers! But telset ringers don't work that way. No contacts, no arcs, no sparks, and no signal generation at all.

Exactly.

Stop playing word games. What you are saying is merely repeating *exactly* what I just said, with one small added detail that is insignificant in this context. What's the point?

I assumed we are all aware of just exactly how current causes induction (with a varying magnetic field), and there is no need to recite AC theory, as opposed to just saying "current" when clearly most of the discussion is about current that varies at 20 Hz.

I don't need to detail all of this down to a gnat's-ass level just to see if you can try to nitpick something I've left out. We don't need to write the definitive textbook with each article.

Trust me, splitting voice is *precisely* the same as splitting data: a total disaster.

I agree. One of the problems with this type of thread is people who have hands on experience with pushing the limits but little understanding of the theory involved, and thus have no perception of why something worked in one case but might not in another. So they say, "Hey... we did it. It works fine." and someone believes that and spends many dollars and/or hours finding out just exactly what the differences are!

However, for anyone who is perceptive, there is a lot in this thread to learn from...

Reply to
Floyd L. Davidson

For example, in Finland the telephone wiring recommendation book "Puhelinsisävekkokirja 1989" recommended that residential building internal telephone networks be wired with MHS 3-pair wiring or similar. The cable MHS 1 x 4 x 0,5 was also used in the 1980s. Those cables are approximately CAT3 in their rating. The VMOHBU cable used in buried installations is also sufficient for Ethernet use, according to what I have read on the topic.

Maybe the official CAT3 definition did not come long before 10Base-T Ethernet, but cables meeting this specification had been made and installed years earlier than that, at least in some countries.

Reply to
Tomi Holger Engdahl

What the typical telephone cable inside a house is depends on the country and the year the building was built. Some countries recommended better cabling than CAT1 in the 1980s, for example to be prepared for future technologies (ISDN at that time). Some of the cable installed in offices and residential buildings from the 1980s on has been of pretty high quality (CAT3 rating, or almost like it). I know the situation in Finland best. I wrote a paper for a course at Helsinki University of Technology on ISDN technology (around 1995), and when writing it I read quite carefully a publication from the Helsinki telephone company that described the suitability of existing wiring for ISDN use. It gave an idea of what kind of cable was used in buildings in the previous decade or two.

In Finland some people are using the old telephone wiring to run 10Base-T or 100Base-TX Ethernet for networking purposes. There is information on that at
formatting link
In some countries there has been a practice of installing higher quality cable, and some used the cheapest you can get (I have seen some installations in Russia, and I doubt that they would even qualify as CAT1).

True.

True.

The transmitted signal on 10Base-T is essentially a square wave with some pre-distortion added to it, then filtered through a low pass filter that removes all the highest frequency components (so that the system passes the noise limits above 30 MHz). So the signal from a 10Base-T transmitter can be viewed as a slightly distorted square wave signal.

Reply to
Tomi Holger Engdahl

10Base-T Ethernet transceivers and Ethernet cards are typically designed with around 1500V isolation.

That 500V isolation level was used on Ethernet that used coaxial cable.

I have not found any ring signal related problems for Ethernet in tests where I have had 10Base-T and a PSTN line signal in a shared sheath. I have not made any extensive tests on this, though.

I have even tested an application where you put normal telephone signals and Ethernet signals on the same wire pair: just two small capacitors to block DC and attenuate low frequencies for the Ethernet input/output, and then a suitable low pass filter for the telephone line signal input/output. It worked, at least in a laboratory setup, without problems for 10Base-T Ethernet. No packet loss because of ring current... On-hook/off-hook transitions and pulse dialling were more challenging signals for this setup, but did not cause great problems either.
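The coupling-capacitor trick described above works because a series capacitor into the line impedance forms a single-pole high-pass. A sketch of the corner-frequency arithmetic (the capacitor value here is an assumption for illustration, not taken from the poster's setup):

```python
# Corner frequency of a series DC-blocking capacitor into a resistive load
# (single-pole high-pass): f_c = 1 / (2 * pi * R * C).
import math

def corner_freq(c_farads, r_ohms):
    return 1 / (2 * math.pi * r_ohms * c_farads)

c = 10e-9    # 10 nF blocking capacitor (assumed value)
r = 100.0    # approximate UTP / transceiver impedance

print(f"corner at {corner_freq(c, r) / 1e3:.0f} kHz")
```

The corner lands in the hundreds of kHz: well below the 5-10 MHz Ethernet band (so the data passes), and far above 20 Hz ringing and voiceband (so those are heavily attenuated).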

The setup was like this (idea from Petri Krohn):

[ASCII diagram garbled in the archive. The idea: at each end of the shared pair, the telephone equipment (and CO) connects through a low pass filter, while the computer's and switch's Ethernet transmit and receive lines couple onto the same pair through DC-blocking capacitors.]

Reply to
Tomi Holger Engdahl

Yes, but I didn't ask what was interfered with. I asked what determines interference.

My answer is current. Most particularly HF changes in current. Especially with sharp edges that have high freq components.

Voltage has nothing to do with it except as it _might_ increase current. But with modern semiconductor telephone sets, I doubt even a piezoelectric ringer generates the varying current (and arcing!) that a mechanical contact bellringer does.

Which is what balanced signals and twisted pair media provide.

Not quite. Current produces a magnetic field. A _changing_ magnetic field induces voltage in a separate conductor that leads to secondary current. So only changes in current cause induction.

I didn't catch any reference to split pairs. They are horrible! (Although splitting voice isn't as bad as splitting data). Anyone pushing to the limit with shared sheath had best know what they're doing.

-- Robert

Reply to
Robert Redelmeier

Floyd L. Davidson wrote:

Most such noise tends to fall off well below about 30 MHz. It's still a long way to 1 GHz from there.

No, I don't have such specs handy, but all such equipment must be built so that a safety hazard is not created. That applies not just to computer equipment, but also to communications equipment, appliances, etc. However, that has nothing to do with interference, only safety.

As far as interference goes, when using the same cable to carry voice and ethernet, the concern is interference from voice into ethernet. Voice circuits will be limited to a few kHz, and ringing is 20 Hz. The circuitry on the NIC will be built to pass the ethernet signal, which is in the range of 5-10 MHz. It is not built to pass voice frequencies or lower. So in order for interference to occur, you have to have enough of that voice energy coupled into the ethernet pair, and then get it through a transformer and other circuitry that's designed to pass signals 1000x higher.

Now, IIRC, a single pole filter (for example a simple RC network) has a rolloff of 6 dB per octave. Going from 4 kHz to 5 MHz is roughly 10 octaves, or about 60 dB attenuation of the highest audio frequency, when passed through that simple one capacitor, one resistor high pass filter. A network card would likely have more filtering than that, even without the transformer, which simply won't work very well at audio frequencies. Care to explain how an audio signal, nominally in the range of a few dBm, after passing through so much attenuation, will interfere with a signal that's several dB higher than the original audio signal? Incidentally, for the 20 Hz ringing, you'd be looking at well over 100 dB of attenuation through a single pole high pass filter. That's more than a 100,000 to 1 voltage ratio. How much of that ringing current will appear at the receiver???
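The single-pole filter arithmetic above can be checked directly. This sketch assumes a corner frequency of 5 MHz purely for illustration (real NIC filtering differs):

```python
# Attenuation of a single-pole high-pass filter below its corner frequency.
# Rolls off at 6 dB/octave (20 dB/decade).
import math

def highpass_attenuation_db(f, f_corner):
    """Attenuation (dB, positive) at frequency f for a single-pole high-pass."""
    ratio = f / f_corner
    gain = ratio / math.sqrt(1 + ratio ** 2)
    return -20 * math.log10(gain)

f_corner = 5e6   # assumed corner at 5 MHz
print(f"4 kHz voice: {highpass_attenuation_db(4e3, f_corner):.0f} dB down")
print(f"20 Hz ring:  {highpass_attenuation_db(20, f_corner):.0f} dB down")
```

A 4 kHz voice signal ends up roughly 62 dB down and 20 Hz ringing over 100 dB down, consistent with the octave-counting above.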

A voice signal is a few dBm, or a few dB above one milliwatt, which would be on the order of 1 V at 600 ohms impedance. Ethernet cable is roughly 1/4 that impedance, so the voice at that level will be about 1/2 V. After you've managed to couple the signal from one balanced pair to another, the induced signal will be much lower. It will not be the original 1/2 V. Ringing current will be attenuated even more. So, please explain how you're going to get crosstalk, from one pair to the next, that's powerful enough to cause interference.
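The dBm-to-volts arithmetic behind the figures above, as a quick sketch:

```python
# Convert a power level in dBm to RMS voltage across a given impedance:
# V = sqrt(P * R), with P = 10^(dBm/10) milliwatts.
import math

def dbm_to_vrms(dbm, r_ohms):
    power_w = 10 ** (dbm / 10) / 1000
    return math.sqrt(power_w * r_ohms)

print(f"0 dBm into 600 ohms: {dbm_to_vrms(0, 600):.3f} V rms")
print(f"0 dBm into 100 ohms: {dbm_to_vrms(0, 100):.3f} V rms")
```

0 dBm into 600 ohms is about 0.77 V RMS, i.e. "on the order of 1 V"; the same power into the lower impedance of Ethernet cable gives roughly 0.3 V, which is the rough halving referred to above.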

Also, given all the claims you've made about noise and interference, I can only conclude you've got some incredibly horrible phone lines in Alaska. In fact, I'd have to consider them useless.

However, the original question was about ethernet sharing a cable with phones. According to Rich, one of the ethernet designers, that sort of service was part of the design for 10baseT. No amount of arguing from you is going to change that fact.

Reply to
James Knott

I fully understand that point. My question has been how does a voice signal typically around 0 dBm or 1 milliwatt or a 20 Hz ringing current, in another pair in the cable, manage to induce such voltages in the detector? You've got to consider that as a result of the twisted pair, there is very little crosstalk to begin with, followed by a transformer that's designed to pass ethernet signals and not voice, followed by whatever filtering is in the receiver. How does a standard voice signal manage to get through all that and still produce a signal of sufficient amplitude, to cause the interference???

Reply to
James Knott
