
Barn Door

Wasting energy with excess computer code


Hi, does anyone think that writing bloated software has a significant impact on the environment? More code means more energy usage and thus more depletion of natural resources? Is this a valid concern? I.

I'm giving you an award for the perfect title: I understood your exact worry just from reading it. Considering the number of computers out there, it seems this has to be a significant problem. However, consider the amount of energy spent either programming more efficient software or doing the work manually; humans require quite a bit of energy themselves. I think that this view should reveal that code bloat actually saves more energy than it wastes, but I could be wrong. To actually determine this would require a lot of energy in itself.

[edited by - Thunder_Hawk on May 2, 2003 8:06:45 PM]

Guest Anonymous Poster
If you have WinXP, or another OS that keeps track of the actual CPU time used by processes, check how that time is used. You will probably find that if you aren't playing games or running Seti@Home, then almost all of your time is spent idle.

Image name             CPU time (h:mm:ss)
System Idle Process    6:14:50
explorer.exe           0:03:48
mozilla.exe            0:03:02
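
For anyone who wants to reproduce that check without staring at Task Manager, here is a rough sketch in Python. It assumes the third-party psutil package is installed; the original check was just the WinXP Task Manager, so this is only an equivalent way to pull the same per-process CPU-time numbers.

```python
# Sketch: list per-process CPU time, like the Task Manager column above.
# Assumes psutil is installed (pip install psutil).
import psutil

for proc in psutil.process_iter():
    try:
        name = proc.name()
        times = proc.cpu_times()       # user + kernel time, in seconds
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue                       # skip processes we can't inspect
    total = times.user + times.system
    print(f"{name:<30} {total:>10.1f} s of CPU time")
```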

Not only that, but... the CPU draws almost as much power while running NOPs as when running real code. I'd be more worried about the people who leave their computers on all night running a 3D screen saver, as that wastes a LOT more energy than a bit of extra code.

I've never been able to get a straight answer on this, but just how much does it cost to run your computer all the time? My parents are really anal about leaving it on when no one is using it, because of its power usage, but just about everyone I know leaves their computers on 24/7 and they don't complain about high power bills.

Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?

quote:
Original post by Zipster
Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?


It should be printed somewhere on the power supply of your PC.
It's probably 200-300 W.
I think you'll find power-consumption information on nearly all electrical devices.

quote:
Original post by Zipster
I've never been able to get a straight answer on this, but just how much does it cost to run your computer all the time? My parents are really anal about leaving it on when no one is using it, because of its power usage, but just about everyone I know leaves their computers on 24/7 and they don't complain about high power bills.

Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?


It's a fairly significant amount. Your power supply rating will give you a good estimate of the upper bound on how many watts your system is using.

But if you use your computer quite a bit, then turning it off and on frequently may consume MORE power: not only the time wasted during boot, but also the spike of power drawn when you first flip the power on. It's the same reason lights burn out faster if you flip them on and off rapidly.

I too would like some real numbers on the subject. Of course, my CPU usage is always near 100%, so I'm not really "wasting" power. Distributed computing is great, whether you are folding proteins, looking for aliens, or finding new primes.

I wish I could get a count of how many NOPs my PC executes per day.
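
Just as a very rough back-of-the-envelope sketch (every number below is an assumption, and an idle CPU is really halted rather than literally executing NOPs):

```python
# Hypothetical figures: a 2 GHz CPU that sits 90% idle all day.
clock_hz = 2_000_000_000          # assumed clock speed
idle_fraction = 0.90              # assumed share of time spent idle
seconds_per_day = 24 * 60 * 60

idle_cycles_per_day = clock_hz * idle_fraction * seconds_per_day
print(f"~{idle_cycles_per_day:.2e} idle cycles per day")
# ~1.56e+14 -- on the order of a hundred trillion cycles spent doing nothing
```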

--TheMuuj

quote:
Original post by ga
It should be printed somewhere on the power supply of your PC.
It's probably 200-300 W.
I think you'll find power-consumption information on nearly all electrical devices.



Just because the power supply is rated for that many watts doesn't mean it constantly draws that much. While your CD-ROM isn't spinning, your hard drive is inactive, the CPU is executing NOPs, and the video card isn't using its 3D section, the system won't draw even half of that. The CPU is rated at somewhere around 60 watts on average (it can be less or more, but that's an average), so if that's the only component doing anything, you're not using the full 300-350 W listed on your power supply.
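
To put some purely illustrative numbers on that, here is a quick sketch; every per-component wattage below is an assumption rather than a measurement.

```python
# Hypothetical per-component draw in watts; real figures vary a lot by hardware.
components = {
    "CPU under light load": 60,
    "motherboard and RAM": 30,
    "hard drive, idle": 8,
    "video card, 2D only": 20,
    "fans, optical drive, misc": 15,
}
typical_draw = sum(components.values())
print(f"Estimated typical draw: ~{typical_draw} W of a 300-350 W supply")
# ~133 W -- well under half the power supply's rating, as suggested above
```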


--- Edit ---
Stupid double quoting doesn't work properly.

[edited by - Ready4Dis on May 2, 2003 10:35:07 PM]

quote:
Original post by Thunder_Hawk
... code bloat actually saves more energy than it wastes, but I could be wrong. To actually determine this would require a lot of energy in itself.



Energy is important. Writing bloated code saves the programmer's energy, but not the PC's.

I'm glad I'm not the only one who runs his PC 24/7, especially since I have a 24/7 internet connection...

The processor can consume quite a bit of energy, but so do many other devices. I believe it uses more than a TV, but both run at the same time anyway.

But it shouldn't be a terribly large amount.

Please note that 300-350 watts is the MAXIMUM amount. After some time (20 minutes) of inactivity the monitor turns off, so it consumes nearly zero energy.

It's quite a strange effect when it sometimes turns back on for no obvious reason. Is it tremors, or disturbed gravity fields?

quote:
Original post by Barn Door
Hi,

Does anyone think that writing bloated software has a significant impact on the environment?

More code means more energy usage and thus more depletion of natural resources?

Is this a valid concern?

I.




No, I don't think so. Sitting at the computer all day is good:

You never go visit your friends, so no cars.
You never eat, so no driving to the supermarket.
You never shower, so no energy spent on heating water.
Nothing else in your apartment is on: lights, the dishwasher...

Well, this isn't true for everybody.

I'm thinking more along the lines of...

You add a variable to the query string. Let's say it's called 'productID'. You could have called it 'prodID'. That would have saved 3 characters.

Thus, every time someone requests that web page, that's 3 extra characters being encoded with electric charge, and three extra copy instructions required to copy the string to the network. Those characters could add up, taking into consideration all the different websites.

What is a single character worth in trees?
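
For a sense of scale, here is a rough sketch; only the 3-character difference comes from the example above, and the traffic figure is an assumption.

```python
# Hypothetical traffic: one million requests a day to the page in question.
extra_bytes_per_request = len("productID") - len("prodID")   # 3 bytes
requests_per_day = 1_000_000                                  # assumed
extra_bytes_per_year = extra_bytes_per_request * requests_per_day * 365
print(f"{extra_bytes_per_year / 1e9:.1f} GB of extra traffic per year")
# ~1.1 GB/year -- less than two CD-ROMs' worth of data, spread over a year
```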

BD.

quote:
Original post by Barn Door
What is a single character worth in trees?



I really don't think it's worth worrying about. There are many bigger energy problems in the world.

I wonder how much energy is being wasted on a silly thread like this one?

Conserving power is one thing, but what you are proposing is downright ludicrous. Think of how many man-hours, i.e. extra time and effort, would be wasted trying to trim variable names; writing efficient code is important, but wasting time on things like "efficient variable names" is silly.

Also, those of you running your computers 24/7 are not saving any energy whatsoever, regardless of whether your CPU is running at 100% utilization. You are still wasting power by having your system on all day running at maximum utilization.

I think it is much more important to focus on getting people to use technology properly. How many people do you know who print out e-mails to read them and then toss them? This was quite common in the early days of e-mail and the Internet. E-mail, used properly, should have an immense impact on the amount of paper waste and the energy spent recycling paper. Now that is important.

Agreed! Computers have the potential to drastically reduce our dependence on "hard" media, but, in practice, the proliferation of computers has been accompanied by an increase in paper consumption. After all, if you only need to hit CTRL+P on your computer instead of retyping the whole page on a typewriter, you're not going to give printing an extra copy as much thought, will you?

quote:
Original post by Barn Door
Hi,

Does anyone think that writing bloated software has a significant impact on the environment?

More code means more energy usage and thus more depletion of natural resources?

Is this a valid concern?

I.


Err, no. Just don't fart too much at your desk and contribute to the greenhouse gases.



The usual stuff goes here for a signature.

quote:
Original post by Zipster
I've never been able to get a straight answer on this, but just how much does it cost to run your computer all the time? My parents are really anal about leaving it on when no one is using it, because of its power usage, but just about everyone I know leaves their computers on 24/7 and they don't complain about high power bills.

Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?

I heard it costs you around $450 to run your computer 24/7 for a year. This was several years ago though, and I have no idea how reliable it is. I just remember hearing it somewhere along the way.
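
As a sanity check on that figure, here is a rough sketch; both the average draw and the electricity price are assumptions, not measurements.

```python
# Hypothetical figures: average draw for PC plus monitor, and a price per kWh.
average_draw_watts = 150          # assumed average draw
price_per_kwh = 0.10              # assumed electricity price in dollars
hours_per_year = 24 * 365

kwh_per_year = average_draw_watts * hours_per_year / 1000
print(f"{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.0f} per year")
# ~1314 kWh and ~$131 per year; hitting $450 would take ~500 W around the clock
```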

quote:
Original post by Digitalfiend
Also, those of you running your computers 24/7 are not saving any energy whatsoever, regardless of whether your CPU is running at 100% utilization. You are still wasting power by having your system on all day running at maximum utilization.


Explain yourself. How is this wasteful? If your CPU is running at 100% utilization, at least it is performing something useful.

Certainly running a program such as Folding@Home is more useful than your nifty matrix-code screen saver, or worse yet, letting your processor execute billions of NOPs.

And seeing as I use my computer quite a bit, I don't think I'm wasting excess energy. Like I said earlier, if I had to turn on my computer every time I intended to use it, I would surely be wasting a lot more energy. And when my computer isn't busy doing what I explicitly tell it to do, it is busy doing background tasks (such as acting as a server or performing some very complex calculation). When you get down to it, I probably waste more electricity sitting at my computer than when I'm away from it, because I am likely to be using my drives more--or worse, my 3-D accelerator--when playing a game.

Of course, there is ALWAYS energy wasted...even the human body "wastes" energy. It's called entropy. In the end everything turns into heat. While computers may speed up this conversion, I certainly don't see it as a "waste."

As for computers reducing paper use---it isn't happening, but it should. Too many companies insist on printing hard copies of everything, rather than, say, burning the data onto a CD. Paper shouldn't be thought of as the most secure medium--besides, a fire in the office can destroy all the precious hard copies. I myself rarely print anything out; I have adjusted to reading off a screen.

Now I want to know whether a PDA running an eBook reader is less wasteful than printing a 500-page book. If so, then I see no reason why we shouldn't do away with paper-based products completely---except for toilet paper---until we figure out a good, cheap replacement.

I'm all for the paperless society.

What I meant was that if your computer is OFF, it is not consuming energy; i.e. a person who runs his computer 7 hours a day at 100% utilization uses less energy than someone running one 24 hours a day at the same utilization. That's what I was getting at.

Turning the machine on and off wears out parts.
Parts need replacing.
Parts need manufacturing.
Manufacturing uses energy.

I stayed in a hotel that had a sign asking for the TV to be switched off rather than left in standby mode. It said that as much energy would be used overnight in standby as whilst watching it for a few hours in the evening.

If this extends to monitors, you're stuck between a rock and a hard place... leaving it in standby mode may still use a lot of energy, and turning it off and on will wear it out.

Well, within reason. If you turned your machine on 10 times a day I'd agree, but turning it on once for 7 hours a day is not going to put any more wear and tear on the components than leaving it running all the time (i.e. hard disks spinning 24/7).
