The ON/OFF power debate and "energy independence"

Here are my reasons for leaving my ADSL2+ modem OFF when the computer is off. My modem runs hot and consumes 5-7 watts, which is typical for external AC-adapter-powered units.

If I power the modem on/off once a day, that's only 365 cycles per year. How many thousands of cycles are these modems rated for? I've been turning my entire computer on/off daily (more than once) for 6 years with no known ill effects. I usually use Hibernate mode to speed startup times. All my old computers were running fine at the time I upgraded.

Modems are solid state, but even hard drives are often rated at 50,000 power cycles, which is about 137 years of daily cycling! Are people still insisting that they'll fail before the computer is obsolete? When drives do crash, how many times in recent years has it been proved to be a power-cycle issue?
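
A quick check of that figure (a back-of-envelope sketch in Python; the 50,000-cycle rating is simply the number quoted above, not a verified spec):

    rated_cycles = 50000      # rating quoted above
    cycles_per_year = 365     # one on/off cycle per day
    print(rated_cycles / cycles_per_year)   # ~137 years of daily cycling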

Those old "cyclic shock" tales seem to come from the days of discrete components and hard drive heads that had to be parked. Hand-me-down stories tend to linger through hearsay. It doesn't seem logical that electronics will last longer when everything's constantly cooking. If something is rated for N power cycles, why stick with the old stories? It always seems to be some Right- wing ruse about conservation being unnecessary.

I'm not running a server so nobody else cares if I take the modem offline. For general privacy I also don't want the same (dynamic) IP address for long periods of time. It makes more sense to be totally offline when the computer is off. It also feels safer in terms of power/line surges. The only thing I leave on all the time is my TiVo, because it forces me to.

This aspect has me curious: My ISP's tech support suggests leaving the modem on but they admit it will grab another IP address automatically. The modem synchronizes in 15-20 seconds when I power up, so why would they care? Is there a reason they ask you to waste energy with a modem you paid for? Can it impact their network when someone grabs a new IP address once or twice a day?

I just don't agree that it's better to leave something on and waste energy for the sake of minor conveniences. The excuses usually come from people who think consumption equals prosperity. I LIKE the idea of saving energy, no matter how "insignificant" it may seem. Americans have among the world's highest per capita energy consumption and keep making excuses for it. In truth, every "small" reduction will help if millions of people are doing it.

N.C.

Reply to
Anonymous via the Cypherpunks

No right wing conspiracy. Or a left wing one where they want to monitor us 24/7. Thermal shock is real but rare. But real. You'd likely consider me right wing but I have dimmers all over and fluorescent lights in many places. :)

Basically Tivo is a server. :)

If you have more than one computer in the house or a computer used by folks who don't understand networking, modems, etc... then leaving it on can make domestic life much more bearable.

When they reconfigure their network or want to re-flash the modem's firmware or settings, if it's on at 4 am, they are less likely to cause complaints if they do it then. And if 100 typical users turn the modem off when they aren't using the computer, 10 will forget to turn it on at first and one of those will call tech support. Been there. Got the hat.

I go for the big ones. Attic ventilation to reduce AC costs. Spending extra on the water heater, ceiling fans instead of AC when possible. Etc... If I keep the computer and Tivo running so the "others" don't have to fuss with them, they listen better on the big-ticket items. :)

Although it would be nice if more "wall warts" would notice when power is not being drawn and shut down most of their draw. How many cell phone chargers, microwave clocks, TVs that turn on by remote, cordless phones, etc... do you have that have their "wall warts" plugged in all the time?

Now back to the power of the modem. 7 watts. That's 1/14th the draw of a 100 watt bulb. So turning off a 100 watt bulb 2 hours a day will cover the cost of a day of modem use.
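
To spell that out (a quick sketch using the 7 W and 100 W figures from this thread):

    modem_wh_per_day = 7 * 24    # 7 W modem left on all day = 168 Wh
    bulb_wh_saved = 100 * 2      # 100 W bulb switched off 2 hours = 200 Wh
    print(modem_wh_per_day, bulb_wh_saved)   # the bulb savings more than cover the modem
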
Reply to
DLR

Reply to
kingpen

"kingpen" wrote

It's all about good stewardship of natural resources and the environment for future generations. Are you not aware that the large majority of the power generated now comes from burning oil, coal or natural gas to run the generators? None of those things are in infinite supply and burning them creates air pollution and global warming.

There are about 220 million people in the U.S. now. If only half of them left one unnecessary 5 watt device running all day, it would waste over 13 million kilowatt-hours each day... and a correspondingly huge amount of oil/coal/gas. Alas, the average American probably wastes a LOT more than 120 watt-hours each day.
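
Working that through with the figures given (half of roughly 220 million people, one 5 watt device, 24 hours a day), as a quick sketch:

    people = 220e6 / 2                    # half of roughly 220 million
    device_watts = 5
    hours_on = 24
    wh_per_person = device_watts * hours_on              # 120 Wh per person per day
    total_kwh_per_day = people * wh_per_person / 1000    # ~13.2 million kWh per day
    print(wh_per_person, total_kwh_per_day)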

The ultimate fate of mankind is that we all will freeze to death, in the dark. That fate is hastened by morons who think "there is an abundance of electrical power".

Reply to
Ken Abrams

On Sat, 26 Aug 2006 22:58:47 GMT, "Ken Abrams" wrote, re: The ON/OFF power debate and "energy independence":

Just think of how much *you* wasted by having your computer turned on while you typed that s__. Lead by example. Turn off your computer now.

Reply to
Caesar Romano

Why bunt when we can go for homers? I've put dimmers on most of my lights or converted them to fluorescent. Lights in larger rooms are on multiple switches. Plus I split my outdoor floods so only half of each unit is on a motion sensor and the other half is on a hard switch for when it's needed. On my power bills I've knocked a kilowatt or two off my daily usage. I'm NOT going to create a hassle getting on and off the Internet to reduce that by another 1%. :)
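
For scale (a rough sketch; the 7 W modem figure is from earlier in the thread, and the ~30 kWh/day household total is an assumed typical value, not a number from this thread):

    modem_kwh_per_day = 7 * 24 / 1000    # ~0.17 kWh/day for the 7 W modem
    household_kwh_per_day = 30           # assumed typical daily household usage (not from the thread)
    print(100 * modem_kwh_per_day / household_kwh_per_day)   # roughly half a percent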

Reply to
DLR

Caesar Romano wrote in news: snipped-for-privacy@4ax.com:

Who told you that conservation means extreme deprivation? It's no big trauma to turn off a switch or two at night. The problem is that people don't think about where energy comes from, or will come from.

Jim

Reply to
Jim

"Ken Abrams" wrote in news:Hm4Ig.896$ snipped-for-privacy@newssvr25.news.prodigy.net:

It's actually coming up on 300,000,000 this year. Population growth is defeating most efforts to conserve.

Jim

Reply to
Jim

I suspect that's orders of magnitude less than the amount of energy that could be saved by folks switching from SUVs to small, fuel-efficient vehicles.

Yes. The US workplace often encourages this as well -- I've worked places where, for instance, the copy machine's heater was left on overnight because the lady who used it didn't like the ~5 minute warm-up in the morning if it was set to automatically power down after some hours of non-use. I've worked places where all the PCs were left on overnight due to automated backup processes that were going to be run -- a good idea, but one that could use far less energy if the "remote wakeup" (Wake-on-LAN) capability that network cards have had for years were used.
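
For what it's worth, that remote wakeup works by broadcasting a Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by the target machine's MAC address repeated 16 times. A minimal sketch in Python, with a placeholder MAC address rather than any real machine:

    import socket

    def send_magic_packet(mac, broadcast_ip="255.255.255.255", port=9):
        # Build the magic packet: 6 bytes of 0xFF plus the MAC repeated 16 times.
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        if len(mac_bytes) != 6:
            raise ValueError("expected a 6-byte MAC address")
        packet = b"\xff" * 6 + mac_bytes * 16
        # Send it as a UDP broadcast; port 9 (discard) is the usual convention.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast_ip, port))
        sock.close()

    # send_magic_packet("00:11:22:33:44:55")   # placeholder MAC, not a real machine

The backup server would send something like this just before the nightly job, so the PCs can spend the rest of the night powered down or asleep.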

The universe is a big place... I wouldn't count on us all freezing to death any time soon.

I do think, however, that over time -- especially in the U.S. -- we'll see the return of nuclear power. I honestly think the fact that the U.S. can *choose* not to use any significant nuclear power is a *luxury* that we enjoy, largely at the expense of other countries.

But in any case, regardless of how much damage we may or may not do to the environment and how many resources we deplete, I'm confident that humans are smart enough that running out of energy is one of the less likely causes of our species going extinct.

Sure, but when we're talking, say, 10,000 years vs. 20,000 years, it's difficult to really worry that much about it.

If you personally chose not to have any children, it'd let a couple of other people use up another lot of excess energy in their SUVs, air conditioners, etc... ;-)

---Joel

Reply to
Joel Kolstad

At least there's some solace in that the more developed a country is, the slower its population growth rate.

After getting rid of tyrannical governments and political corruption, technology is one of the strongest indicators of development...

Reply to
Joel Kolstad

"Joel Kolstad" wrote

This actually is a paraphrased quote from some respected scientist quite a few years ago. The context was really more about a "nuclear" winter, brought on by war, volcanos or an asteroid. Actually running out of energy, either because of supply exhaustion or from delivery problems brought on by large non-nuclear wars is, I think, also a real possibility.

Looks like we agree on a lot of points here but I think your time horizon is WAY too long. Barring a major breakthrough (like cold fusion), I think the problem starts to get critical in hundreds of years, not thousands and certainly not tens of thousands.

Reply to
Ken Abrams

While I don't expect us to invent some "magic pixie dust" to fix all the problems of our planet any time soon, look back 50 or 100 years and you'll see we have a habit of improving the technology we use at a fairly rapid rate. If we were still using 1920s technology today, no one in the US could still breathe. Steam engines, coal for most non-auto energy, very low mileage per power output for autos, etc...

Reply to
DLR

If a hard disk is rated for 50,000 power cycles, and you have a town of 1,000,000 people using this kind of hard disk in their computers, then you end up with 20 unlucky fellows every day. ;-)
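
The arithmetic behind that, assuming one power cycle per disk per day and failures spread evenly across the rated life:

    users = 1000000          # a town of a million disks
    rated_cycles = 50000     # rated power cycles per disk
    print(users / rated_cycles)   # 20 failures per day, on average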

I agree with you, we should not waste energy. ICs with power save mode and automatic power-off option only cost cents. Unfortunately most ISPs give their customers only dumb modems with dumb wall plug power supplies from the past century.

If a possibly unmonitored hot modem goes up in flames every 50,000 days (about 137 years) on average, then again you may find 20 unlucky fellows per day per 1,000,000 customers.

Reply to
biggerdigger

I'm with you, NC, on conserving power. I've been known to use my 30 watt old laptop instead of firing up the much faster desktop just to use less energy. And I just dumped my desktop (2 days ago) in favor of a new laptop. Although it uses far more than 30 watts (actually 60 watts, I believe), it sure beats a 250-350 watt desktop with a monitor (100-250 watts). My flat panel monitor uses 100 watts, and they are supposed to be energy efficient (they are, but only about half the power of a CRT). And my new laptop is faster than my old desktop anyway.

Now as far as power cycles go, the MTBF (mean time between failures) spec is pretty much worthless when talking about an individual unit anyway, as the spec is only an average over a given lot. Some last longer and some shorter.

And yes, it is true the company tweaks and does firmware upgrades in the middle of the night, which requires your DSL modem to be on. Although the most important time is the first 10 days of service or so; after that, there are fewer and fewer tweaks and firmware upgrades.

Having said all of this, I would turn mine off if it was connected to one computer. But it is networked with half a dozen computers between two floors. And it would be a pain in the ass to run upstairs where the DSL modem is just to use a computer downstairs. I could live with it, but others would bitch about it.

Reply to
BillW50

On Tue, 29 Aug 2006 17:17:32 -0500, "BillW50" put finger to keyboard and composed:

Record how long your laptop runs on a fully charged battery. Then divide the battery's amp-hour rating by the number of hours of run-time. This should approximate the laptop's average current draw. Multiply this by the battery voltage and you have a ballpark figure for power consumption. Add 25% for losses, or whatever figure is appropriate.
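
As a sketch of that method (the 25% loss allowance is the rough figure suggested above; the 4.8 Ah / 10.8 V / roughly 2.5 hour numbers are the ones that come up later in this thread):

    def estimate_watts(amp_hours, run_hours, battery_volts, loss_factor=1.25):
        # Average current (Ah / h) times battery voltage, plus ~25% for adapter and charging losses.
        return (amp_hours / run_hours) * battery_volts * loss_factor

    print(estimate_watts(4.8, 2.5, 10.8))   # roughly 26 W at the wall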

The stickered ratings do not represent actual power consumption.

For example, my Athlon XP 2500+ desktop system consumes about 150W - 175W when it is working hard. Otherwise it idles at around 75W.

formatting link
My 15" LCD monitor consumes only 17W.

I'm using a wattmeter for my measurements.

- Franc Zabkar

Reply to
Franc Zabkar

4800mAh battery ran down to about 48%, let's say 50%. It ran for about 70 minutes. So that is about 2160mAh per hour. It's a 10.8V battery. So that makes it about 23.3 watts. Now add what? 25% for what?

Let me do this again for what the battery is supposed to do and I believe is correct. 4800mAh and it is supposed to run for about 2 1/2 hours. That's 1920mAh times 10.8v equals 20.7 watts.

Naw... I ain't buying this! I don't use it on a battery, but on AC and the output of the power supply is 65 watts. The input is worse (like usual) 100v-240v @ 1.7 amps. Which is 170 watts. I need to throw my amp probe on this thing to know how much it is really using.

formatting link

What size power supply do you have there, Franc? And I have a 19 inch LCD monitor and I can feel the heat of a 25 watt bulb coming out of the back vent slots. Damn, its power supply is basically the same as my new laptop's. 19VDC @ 3.68 amps (LCD) vs. 19VDC @ 3.42 amps (laptop). I now know where I can get a spare power supply for my laptop. :D

Okay Franc, granted that I need to amp-probe this stuff. But I probed my old HP PIII 533MHZ w/256kb of RAM machine and it was more than I thought. I don't recall exactly what it was, but way more than 100 watts. I'm thinking it was like 150 watts. The power supply was in the 220-250 watt range. And standby was quite high as I recall. Something like 90 watts or something. And I believe 90 watts doing nothing is a lot. And I recall my CRT monitor was really bad. At least something like 170 watts. That's just crazy!

The reason why I say my old Toshiba laptops only use 30 watts is because that is what the supplies are rated for. And while I have a universal 30 watt supply which works those laptops just fine from a cigarette lighter socket, there is no way that same 30 watt supply can run this new Celeron M 1.7GHZ! Even though the figures show it is only using less than 25 watts of actual battery power. That's not the truth from the DC input connection.

As for your 15 inch LCD monitor (that's small, Franc; my laptop is a 15.4 inch) only using 17 watts. Well, I don't know... that sounds too low to me. Have you adjusted the brightness, and does it change the power consumption a lot? It is supposed to make a difference in laptops when you darken them running on batteries. But I don't know if it makes that much difference. Like if I can run 2 hours vs. 2 1/2 hours on a charge, I don't think I really care. Just my 2 cents.

Reply to
BillW50

Loss in the power adapter, loss in charging the batteries. Notice that the power adapter gets warm, the batteries get warm while charging. That heat represents the power loss.

The power adapter has to be able to both charge the battery and run the laptop at the same time, worst case. Plus again the loss in the adapter, losses in the laptop power converters, and losses charging the battery.

Jerry

Reply to
Jerry Peters

On Thu, 31 Aug 2006 06:10:42 -0500, "BillW50" put finger to keyboard and composed:

You are confusing current with charge. A battery's amp-hour rating reflects the amount of charge that it can store. A rating of 5Ah, for example, means that a battery can supply 1A for 5 hours, or 5A for 1 hour, or 2A for 2.5 hours.
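
In other words (a small sketch of the same relationship):

    capacity_ah = 5.0              # the 5 Ah example above
    for amps in (1, 2, 5):
        print(amps, "A for", capacity_ah / amps, "hours")
    # 1 A for 5.0 hours, 2 A for 2.5 hours, 5 A for 1.0 hours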

The higher rating is to supply peak demand. Although the average consumption is only about 20W, it appears that the laptop's power usage can peak at twice or three times that figure during CPU intensive periods. The AC supply also has to be able to charge the battery *and* power the laptop.

For accurate results you really should use a wattmeter rather than an ammeter.

IIRC, someone in another newsgroup recommended one of these gadgets (Kill a Watt):

formatting link

formatting link

It's an Antec 350W, but that's largely irrelevant. The actual power consumption of the PSU depends only on the current draw of the load, plus a little extra to allow for inefficiencies. Typical efficiency figures are 75%-80%.
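
So, as a rough sketch, the wall draw is the DC load divided by the supply's efficiency (the load figure here is hypothetical; the 78% is just a point inside the 75%-80% range mentioned above):

    dc_load_watts = 100      # hypothetical DC load inside the PC, not a measured figure
    efficiency = 0.78        # assumed, within the 75%-80% range above
    print(dc_load_watts / efficiency)   # ~128 W drawn from the mains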

My 15" monitor has an internal PSU, so I can't compare numbers with you. However, the AC adapter for my 17" LCD monitor is rated for 12V @

4.16A, ie 50W.

FWIW, the rating sticker for my 15" LCD monitor reads "100-240V, 50-60Hz, 1A".

Just watch the polarity.

My 32" TV consumes 84W when muted with a black screen, while a max bright screen at normal volume levels consumes 140W. A small 34cm TV consumes about 48W when muted and with no signal input (snowy screen). My 17" CRT monitor needs about 95W when displaying a white screen in MS Word.

My other PC has a 17" LCD monitor. I'm typing this on my old socket 7 Internet machine.

At max brightness the power consumption is 22W, at 50% it is 17W, and at min it is 13W. The corresponding results for my 17" LCD monitor are 37W, 32W, and 29W.

I'm using this device for my measurements:

formatting link
I've calibrated it against known resistive loads.

Other posts at alt.comp.hardware.homebuilt tend to confirm my results for desktop machines.

- Franc Zabkar

Reply to
Franc Zabkar

Do what makes you happy. Be happy.

-Frank

Reply to
Frankster
