Splitting a coax signal

I would like to connect 4 TVs and one cable modem to the cable signal coming into my home. The cable company split the signal just outside my house and brought one run to my computer and one to my TV. I would like to add three additional TVs and would like to know the best way to add them. Will I need to boost the signal, and if so, what type of device would be best to use? Thank you for any suggestions. Andy

Reply to
AndyLash

Leave the leg to the modem unsplit. Split the TV leg with a 4-way splitter for the TV sets. The cable TV company should give you enough signal strength to handle a 4-way TV split on top of the split for the Internet. If you see a deterioration in signal quality, have the cable company send out a technician to boost (or unpad) the signal at the pole, at the entrance point, or inside your house. If the tech boosts the signal at the pole, you may be able to do a 2-way split in the modem run between Internet and TV.

*TimDaniels*
Reply to
Timothy Daniels

Personally, I prefer to install a drop amp if there are more than 3 outlets. Typically, signal strength at the groundblock is 10-15dBmV. Say it is 10. Lose 3.5 of that to the split for the cable modem and you are left with 6.5 to feed the TV outlets. A 4-way splitter drops that another 7dB, and you are at -0.5dBmV just leaving the splitter. Lose another couple of dB through the cable and you are at -2.5dBmV at the outlet (depending on distance from the splitter). If the level at the groundblock is 15dBmV, you are instead at +2.5dBmV at the same outlet, but I think it is better to be above +5dBmV at the outlet. The FCC says no less than 0dBmV at the outlet. The splitter for the cable modem should be placed before any amplifier; cable modems should not be amplified.
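To make that arithmetic easy to rerun with your own numbers, here is a rough Python sketch of the same level budget. The loss figures (3.5dB for the 2-way split, 7dB for the 4-way, a couple of dB for the outlet run) are just the typical values quoted above, not measurements of any particular plant.

# Rough downstream level budget using the typical losses quoted above.
# All values in dBmV / dB; real splitters and cable will vary a bit.

TWO_WAY_SPLIT_LOSS = 3.5    # 2-way split between the modem leg and the TV leg
FOUR_WAY_SPLIT_LOSS = 7.0   # 4-way splitter feeding the TV outlets
OUTLET_RUN_LOSS = 2.0       # "another couple of dB" of coax to the outlet

def outlet_level(groundblock_dbmv):
    """Estimated level at a TV outlet fed through the 2-way and 4-way splits."""
    return groundblock_dbmv - TWO_WAY_SPLIT_LOSS - FOUR_WAY_SPLIT_LOSS - OUTLET_RUN_LOSS

for gb in (10.0, 15.0):
    print(f"{gb:+.1f} dBmV at the groundblock -> {outlet_level(gb):+.1f} dBmV at the outlet")
    # prints -2.5 dBmV for a 10 dBmV groundblock, +2.5 dBmV for 15 dBmV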

CIAO!

Ed N.


Reply to
Ed Nielsen

I've got a 1-4 splitter at the ground block with the modem on one output and 3 TV lines on the others. One of the TV lines is split 1-8 farther down (7 in use) with no problems.

Reply to
$Bill

With ~23dB of loss through the splitters alone (not to mention the attenuation in the cables from the splitters to the outlets), it sounds like you sweet-talked someone into running your tap extremely hot just to keep your furthest outlet within FCC specs (or even acceptable). The cable modem is seeing a rather high input level as well, I would imagine (unless it has been padded down).

CIAO!

Ed N.


Reply to
Ed Nielsen

My impression from talking with several techs in the So. Calif. area as they did their installations for customers in our condo building is that they'll give you enough signal strength to accommodate a 4-way split for TV. If they determine where the splitters go in the residence, they'll do what's necessary to *still* give you enough signal strength for a 4-way split on the TV legs and assure a proper signal level for the modem. As long as they know what the setup will be inside the home, they'll do that, because it's in their company's financial interest to:

1) assure a good picture on the TV sets, 2) assure a good signal level for the modem, and 3) avoid future callbacks.

My impression from conversations with the field techs' supervisors is that this policy may stem more from the supervisors' interpretation of the company's "culture" than from written company policy. Given that, I think the more you can tell the field tech about what your ultimate configuration will be, the more likely he'll be to accommodate your needs in signal strength.

*TimDaniels*
Reply to
Timothy Daniels

My situation is similar to $Bill's. There's an 8-way splitter on the outside of the house.
- One leg goes to a 2-way splitter for the front of the basement
- One leg goes to a 3-way splitter for the back half of the basement
- One leg goes to the office, where it meets a 2-way splitter
- Off of that, one leg goes to the modem and the other leg goes to a 3-way splitter
- All other legs go straight to TVs

I don't even want to calculate the total losses, but the last cable tech who was here to upgrade my modem raised an eyebrow and mumbled something about a lot of outlets. :)

Modem power levels are good, at +8dBmV downstream, 42dBmV upstream, and 38dB SNR. All TVs are clear as well. I assume the signal is fairly hot where it arrives at the house.

Reply to
Bill M.

Not at all - there has never been an installer in the house. The only visit was to add my modem when they became available. At that time, they cleaned everything up and ran new cable to the house from the underground, added a ground block and grounded to my breaker box, replaced the splitter with a new 1-4 splitter and gave me 100' of RG6 to run to the modem. The TVs are all still running RG59.

Here are my reasonable, but not great, numbers for today at the modem:
Tx Power: 48.2 dBmV
Rx Power: -7.0 dBmV
Downstream SNR: 35.0 dB
Downstream MER: 33.2 dB
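For reference, numbers like those can be sanity-checked against the commonly cited rules of thumb for DOCSIS-era modems (downstream receive roughly -15 to +15dBmV, upstream transmit roughly +35 to +52dBmV, downstream SNR of 30dB or better). The thresholds in this little Python sketch are just those rules of thumb, not an official spec, so adjust them for your own modem:

# Ballpark sanity check of cable-modem levels. The ranges below are the
# commonly cited rules of thumb, not an official spec.

GOOD_RANGES = {
    "rx_dbmv": (-15.0, 15.0),   # downstream receive level
    "tx_dbmv": (35.0, 52.0),    # upstream transmit level
    "snr_db":  (30.0, None),    # downstream SNR; higher is better
}

def check(name, value):
    lo, hi = GOOD_RANGES[name]
    ok = (lo is None or value >= lo) and (hi is None or value <= hi)
    return "OK" if ok else "worth a call to the cable company"

readings = {"rx_dbmv": -7.0, "tx_dbmv": 48.2, "snr_db": 35.0}   # $Bill's numbers
for name, value in readings.items():
    print(f"{name:8s} {value:+6.1f}  {check(name, value)}")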

Reply to
$Bill

Cable systems are designed for a signal level of 15-20dBmV at the tap, and every tap, splitter, directional coupler, etc. is well documented. They know just what is where. Changing out a tap in order to get more signal at a specific location may very well affect the signal levels downstream as well as upstream, as different-value taps have different insertion losses, not to mention that the system would go down while that tap is being replaced. Back to one of the reasons why so many people hated cable TV years ago -- frequent outages.

Systems that have been upgraded to provide advanced services typically run at 20dBmV at the tap, which, after an 85' drop, is 15dBmV at the groundblock, which is fine for 4 outlets. Install a DC-9 prior to the 4-way splitter for the cable modem and everything is happy.

If there is 15dBmV at the tap, the same 85' drop turns that into 10 at the groundblock. A 4-way splitter makes that 3dBmV leaving the splitter to feed the outlets. A 50' run to the outlet then drops that down to +0.1dBmV at the outlet. That still meets FCC specs, and if there is a DCT or a fairly new cable-ready TV at the outlet and all of the connections are good (and tight), then all is well. However, if there is a 10-year-old cable-ready TV set with less than optimal shielding integrity at the tuner, ingress takes control and the locals really suck, as do some of the other channels. If the homeowner or tenant rearranges the room and needs a longer cable to go from the outlet to the TV set, they run down to Wally-World and buy one. It doesn't have the shielding integrity needed and the pictures are a mess. Ingress is the reason I prefer to have no less than +5dBmV at the outlet, even though the FCC says 0dBmV.
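Ed's 15dBmV example works out like this if you assume RG-6 losing roughly 5.85dB per 100' near the top of the band; that attenuation figure is an assumption chosen so the arithmetic reproduces his round numbers, and your cable and channel lineup will differ:

# Tap-to-outlet budget for the 15 dBmV example above. The cable attenuation
# (~5.85 dB per 100 ft near the top of the band) is an assumed figure chosen
# to match the losses described; your plant will differ.

CABLE_LOSS_PER_100FT = 5.85   # dB per 100 ft (assumed)
FOUR_WAY_SPLIT_LOSS = 7.0     # dB

def after_run(level_dbmv, feet):
    """Level remaining after a run of coax of the given length."""
    return level_dbmv - CABLE_LOSS_PER_100FT * feet / 100.0

tap = 15.0
groundblock = after_run(tap, 85)                    # ~10 dBmV at the groundblock
splitter_out = groundblock - FOUR_WAY_SPLIT_LOSS    # ~3 dBmV leaving the 4-way
outlet = after_run(splitter_out, 50)                # ~+0.1 dBmV at the outlet

print(f"groundblock {groundblock:+.1f}, splitter out {splitter_out:+.1f}, outlet {outlet:+.1f}")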

CIAO!

Ed N.


Reply to
Ed Nielsen

Cut to the chase, Ed - what would be your advice to the OP?

*TimDaniels*

"Ed Nielsen" wrote:

Reply to
Timothy Daniels

Basically the same as yours -- a 2-way splitter (or directional coupler) with 1 leg going to the cable modem and the other leg going to the TV distribution system. If that TV distribution system is merely a 4-way splitter, great. If some of the channels are on the snowy or fuzzy side I would install a drop amp prior to the 4-way splitter.

CIAO!

Ed N.


Reply to
Ed Nielsen

Sounds good to me! :-)

*TimDaniels*

"Ed Nielsen" wrote:

Reply to
Timothy Daniels

-2.5 isn't "real world" bad, actually, regardless of FCC requirements... you could gain that back on an extra chilly day, lol. I think the FCC would forgive 2.5dB either way, and I would rather err on the negative side than the positive. IMO, drop amps are (mostly) a bad idea. Eight times out of ten you are amplifying garbage anyway; you typically lose 3dB of SNR through them, and that, as I am sure you would agree, is a more dangerous loss than the dB of level. As for 10-15 at the groundblock: on channel 4, maybe; real world, rarely, unless you are in a lab. Now before I am beaten and flogged with multiple posts, let me add that there are plenty of times when an amp is quite necessary. HOWEVER, this sounds like a situation where I don't agree: that 10 to 15dB is NOT going to be uniform across the spectrum, and if it is, well then, you've got a pretty clean system and you got lucky. As for +5 OR MORE at the outlet? Most DTV boxes (and modems) are rated for -11 to +5dB; above that, the first thing that usually happens is that the audio gets goofy, or you get serious blocking issues, mostly from signals being too hot -- something that gets overlooked way too much. HOWEVER, at 10dB I would personally put a 6 or 9dB directional coupler at the GB and send the tap leg to the modem, which puts you at a nice +3 or +1 at the modem and somewhere around zero at the sets.
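For what it's worth, TC's coupler-at-the-groundblock idea pencils out roughly like this if you assume real-world tap losses of about 6.5dB and 9dB for "6dB" and "9dB" couplers, a through loss of about 1.7dB, and a short outlet run. All of those loss figures are assumptions, not specs for any particular part:

# TC's directional-coupler layout, with assumed losses. Tap losses of
# ~6.5/9.0 dB and a ~1.7 dB through loss are typical catalog-style figures,
# not measurements of any particular coupler.

GROUNDBLOCK = 10.0            # dBmV at the groundblock
THROUGH_LOSS = 1.7            # dB through the coupler's output port
FOUR_WAY_SPLIT_LOSS = 7.0     # dB
OUTLET_RUN_LOSS = 1.5         # dB of coax from the splitter to a set

# Through port feeds the 4-way splitter and then the TV outlets.
sets = GROUNDBLOCK - THROUGH_LOSS - FOUR_WAY_SPLIT_LOSS - OUTLET_RUN_LOSS

for rating, tap_loss in (("DC-6", 6.5), ("DC-9", 9.0)):
    modem = GROUNDBLOCK - tap_loss   # tap port feeds the modem
    print(f"{rating}: modem {modem:+.1f} dBmV, TV outlets about {sets:+.1f} dBmV")
    # DC-6 puts the modem near +3.5, DC-9 near +1, and the sets end up around zero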

Reply to
TC

That is with the assumption that every outlet in the house has a DCT and that ALL of the cabling (including jumpers) is high quality and ALL of the fittings are high quality, made up properly, and tightened appropriately. Unfortunately, that is seldom the case. The vast majority of cable outlets in the United States are analog only, with most of those just being a jumper from the outlet to a cable-ready TV or VCR. In a house with 5 outlets, 3 (maybe 2) of those are likely to be cable-ready TVs. John Q. Public decided to rearrange the room where he watches a cable-ready TV. The jumper isn't long enough to reach where he moved his TV set to, so he goes down to WallyWorld and buys one that has less than adequate shielding. Source of ingress. Rather than have an ugly cable wrapped halfway around his room, he decides to relocate the outlet, so he goes to the nearest home improvement store and buys some cable and fittings, but he doesn't want to spend too much, so he buys screw-on connectors and a 99-cent splitter, cuts the cable that goes to the existing outlet in the room, and installs his new splitter there. Four sources of ingress on that one.

With a DCT, no matter what channels are watched at that outlet, there probably wouldn't be many picture problems with a signal level of -2.5dBmV. With a cable-ready TV set, though, picture problems would abound. VHF locals would have multiple images (ghosting) and local UHF channels would cause interference in cable channels from the mid-60s on up. Channels in the upper teens through 21 or 22 would exhibit herringbone and other such lines. Channels 95-97 would have several local FM radio stations making pictures less than watchable. Ingress problems are not limited to poor cable and/or connectors and/or splitters, either. There are a lot of cable-ready TVs out there that have less than adequate shielding in their tuners (unfortunately, I have 2 of them). The distribution system itself in the house may be the tightest on the planet, but with a poorly shielded tuner ingress can still wreak havoc.

Most drop amps have about a 2.4-3dB noise figure. That is not Signal to Noise Ratio (SNR), that is the amount of noise the device itself generates. When you add 3dB of noise but increase the signal level by 15dB, you may actually improve the SNR. Simple math: say you have a signal level of +10dBmV and a noise figure of 2dB. Your SNR is 10:2, or 5:1. Insert a 15dB-gain drop amp that has a noise figure of 3dB. Your numbers are 25:5, or the same 5:1 that you had in the first place. The amp had no effect on the SNR. As long as the input level is above the noise figure of the amplifier, you won't experience any noise problems. Gotta stay below the maximum input level specified for the amp, though.

Cable systems are designed to run out at 15-20dBmV at the tap at their system's highest frequency. Not a lab thing, real world.

Cable modems operate with an input signal level of -15 to +15dBmV. That is not just a "happen to" thing -- that is a specification. That also is the level of the QAM carrier, which is either 6 or 10dB below the adjacent analog carrier (depending on whether it is a 64 or 256QAM system).

Yes, a DC-9 or 6 would be preferred.

CIAO!

Ed N.

Reply to
Ed Nielsen

This implies that "Signal to Noise Ratio" is the ratio of the dB levels, i.e. the ratio of the logarithms, not the ratio of the absolute amplitudes. Is that true?

For example, suppose signal A has a signal strength of 0dBmV and signal B has twice the signal strength; that would be a difference of about 3dB (since the log of 2.0 = 0.30, which is 0.30 bels, or 3.0 decibels). The ratio of the absolute amplitudes would be 2.0, but would your SNR be infinite (i.e. 3dB divided by 0)? As you can see, it would change with the value of the reference absolute amplitude (the "0dB" level). Is that what engineers want? If I were an engineer, I *think* I'd be interested in the ratio of absolute signal levels, and when a signal is amplified (i.e. multiplied) I'd just add the noise levels, expressed in dB, of the signal and the amplifier, just as one adds the exponents when multiplying powers of ten. This would imply that the noise in dB is expressed as the dB DIFFERENCE between the signal and the noise. So a signal with 2dB of noise has an SNR of 2dB, and amplifying that signal with an amp having a 3dB noise figure gives the resulting signal an SNR of (2 + 3)dB. IOW, the noise level expressed in dB is additive, which I suspect makes engineering sense.

*TimDaniels*
Reply to
Timothy Daniels

Logarithmic

In your example, Tim, you interchange signal level and signal power. They are 2 completely different things -- signal power is an absolute value while signal level is a logarithmic value. Hence, a doubling of power from 1W to 2W (or 1mW to 2mW, or whatever) equates to a signal level increase of 3.01dB (usually rounded to 3dB).

I've just added some rather interesting links that go into power, noise, etc. to my website. Take a look at them.

CIAO!

Ed N.


Reply to
Ed Nielsen

This is interesting; I run into about one non-cable-ready TV every three weeks, if that.

John Q. Public decided to rearrange his room

Granted, the ingress will be a problem, but when that customer has gone too far and created a disaster that he cannot repair and, God willing, gets the proper cable guy in his house, he will HOPEFULLY (hah) learn his lesson. ;)

You're right, dead on, but you really can't _technically_ improve a signal-to-noise ratio anywhere but at the source, can you? You'd need a lot more than a drop amp to do that, even within amp specs... I mean, when you amp a signal, you are amping the noise floor too.

Please correct me if I am wrong...

I have to admit, I really try to shy away from drop amps every chance I get. There are too many situations where I have seen less-than-creative techs use them to boost one lousy outlet of an in-house system rather than do the job properly, so I just use them as a last resort. In most cases, you're right, the tap is hot enough.

Spoken in haste, true, true... after I wrote that I instantly regretted it. _Usually_ there is more than enough to play with at the GB.

+17 on 256QAM, +15 on 64QAM for most, but this is one place I have to say that, as I see almost every day, this is not usually the case. Mostly, go above +5 on a DHT or cable modem (the DHT is usually a LOT more forgiving) and you're gonna have some problems... I see it almost every day. :(
Reply to
TC

"Signal-to-noise ratio is an engineering term for the power ratio between a signal (meaningful information) and the background noise: SNR = Psignal/Pnoise = (Asignal/Anoise)**2

where P is average Power and A is RMS Amplitude. Both signal and noise power are measured within the system bandwidth.

Because many signals have a very wide dynamic range, SNRs are usually expressed in terms of the logarithmic decibel scale. In decibels, the SNR is 20 times the base-10 logarithm of the amplitude ratio, or 10 times the logarithm of the power ratio:"

Thus the power ratio and the signal amplitude ratio are related in their logarithms merely by a factor of 2 (20 in the case of dBs). This may or may not conflict with your view, but to say that SNR is the ratio of logarithms seems a bit too complex and seems to have no direct engineering meaning that I can see.

Here's another webpage:

formatting link
It says: "...signal-to-noise ratio, often written S/N or SNR, is a measure of signal strength relative to background noise. The ratio is usually measured in decibels (dB).

"If the incoming signal strength in microvolts is Vs, and the noise level, also in microvolts, is Vn, then the signal-to-noise ratio, S/N, in decibels is given by the formula S/N = 20 log10(Vs/Vn) .

"...As an example, suppose that Vs = 10.0 microvolts and Vn = 1.00 microvolt. Then S/N = 20 log10(10.0) = 20.0 dB ."

Notice that the SNR is calculated from microvolts (absolute signal amplitude), not as dBs above a reference level.
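That formula in runnable form, with Vs and Vn in the same units (microvolts in the quoted example):

import math

def snr_db(v_signal, v_noise):
    """SNR in dB from RMS signal and noise amplitudes in the same units."""
    return 20.0 * math.log10(v_signal / v_noise)

print(snr_db(10.0, 1.0))   # 20.0 dB, the example from the quoted page
print(snr_db(2.0, 1.0))    # ~6.02 dB for a doubling of amplitude
# Doubling the *power* instead gives 10*log10(2), about 3.01 dB.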

BTW, your website seems to be down, now, so I can't check out your links.

*TimDaniels*

"Ed Nielsen" wrote:

Reply to
Timothy Daniels

An engineer I am not. I stand corrected. Thanks for the heads-up on the website.

CIAO!

Ed N.


Reply to
Ed Nielsen

Sometimes we forget about small systems, such as rural or small municipally owned systems, that are not digital and perhaps never will be.

A lot of bedroom outlets are analog only, as are kitchen, kids' playroom, shop or garage outlets, etc.

How many of them actually do? ;)

CIAO!

Ed N.

Reply to
Ed Nielsen
