Wasting energy with excess computer code

36 comments, last by Barn Door 20 years, 11 months ago
Hi. Does anyone think that writing bloated software has a significant impact on the environment? More code means more energy usage, and thus more depletion of natural resources. Is this a valid concern? I.
I'm giving you an award for the perfect title. I was able to get your exact worry just from reading it. Considering the number of computers out there, it seems this has to be a significant problem. However, consider the amount of energy spent writing more efficient programs, or doing the work by hand instead. Humans require quite a bit of energy themselves. I think this view should reveal that code bloat actually saves more energy than it wastes, but I could be wrong. To actually determine this would require a lot of energy in itself.

[edited by - Thunder_Hawk on May 2, 2003 8:06:45 PM]
______________________________________________________________________________________
The Phoenix shall arise from the ashes... ThunderHawk -- ¦þ
"So. Any n00bs need some pointers? I have a std::vector<n00b*> right here..." - Zahlman
MySite | Forum FAQ | File Formats
______________________________________________________________________________________
If you have WinXP, or another OS that keeps track of the actual CPU time used by processes, check how that time is used. You will probably find that if you aren't playing games or running Seti@Home, then almost all of your time is spent idle.

Image name            CPU time (h:mm:ss)
System Idle Process   6:14:50
explorer.exe          0:03:48
mozilla.exe           0:03:02
Not only that, but the CPU draws almost as much power while executing NOPs as it does running real code. I'd be more worried about people who leave their computers on all night running a 3D screen saver, as that wastes a LOT more energy than a bit of extra code.
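
For what it's worth, here's a minimal sketch of how you could measure the idle fraction yourself, assuming the Win32 GetSystemTimes call (added in WinXP SP1, so older systems won't have it). Treat it as an illustration, not a polished tool:

// Sample how much of the CPU's time is spent idle over five seconds.
#define _WIN32_WINNT 0x0501   // so the SDK headers declare GetSystemTimes
#include <windows.h>
#include <stdio.h>

// Turn a FILETIME into a 64-bit count of 100-nanosecond ticks.
static unsigned __int64 ToTicks(const FILETIME &ft)
{
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;
}

int main()
{
    FILETIME idleA, kernA, userA, idleB, kernB, userB;
    GetSystemTimes(&idleA, &kernA, &userA);
    Sleep(5000);                              // sample window: five seconds
    GetSystemTimes(&idleB, &kernB, &userB);

    // Note: on Windows, kernel time already includes idle time.
    unsigned __int64 idle  = ToTicks(idleB) - ToTicks(idleA);
    unsigned __int64 total = (ToTicks(kernB) - ToTicks(kernA))
                           + (ToTicks(userB) - ToTicks(userA));

    printf("CPU was idle %.1f%% of the sample\n",
           100.0 * (double)idle / (double)total);
    return 0;
}

On a machine like the one above you'd expect the printed figure to be in the high nineties unless something is actively crunching.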
I've never been able to get a straight answer on this, but just how much does it cost to run your computer all the time? My parents are really anal about leaving it on when no one is using it because of its power usage, but just about everyone I know leaves their computers on 24/7 and they don't complain about high power bills.

Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?
I think the monitor uses a lot more power than the 'box', but that might not be true anymore (this was on a 33 MHz computer).
quote:Original post by Zipster
Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?


It should be printed somewhere on the power supply of your PC; it's probably 200-300 W. I think you'll find power consumption information on nearly all electrical devices.
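
To turn that into a rough cost (illustrative figures only: assume the box really drew a constant 250 W, and power costs $0.10 per kWh):

250 W × 24 h = 6 kWh per day
6 kWh × $0.10/kWh = $0.60 per day, or roughly $18 a month

In practice the machine draws well under its supply rating most of the time, so the real bill would be lower.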
Visit our homepage: www.rarebyte.de.st
GA
quote:Original post by Zipster
I've never been able to get a straight answer on this, but just how much does it cost to run your computer all the time? My parents are really anal about leaving it on when no one is using it because of its power usage, but just about everyone I know leaves their computers on 24/7 and they don't complain about high power bills.

Just how much power is consumed compared to something like a television, in relative terms or raw kilowatts?


It's a fairly significant amount. Your power supply's rating gives you a good estimate of the upper bound on how many watts your system is using.

But if you use your computer quite a bit, then turning it off and on frequently may consume MORE power: not only the time wasted during boot, but the spike of power drawn when you first flip the switch. It's the same reason light bulbs burn out faster if you flip them on and off quickly.

I too would like some real numbers on the subject. Of course, my CPU usage is always near 100%, so I'm not really "wasting" power. Distributed computing is great, whether you're folding proteins, looking for aliens, or finding new primes.

I wish I could get a count of how many NOPs my PC executes per day.
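
As a back-of-envelope guess (assuming a 2 GHz CPU that sits idle about 90% of the day, per the task list above): 2×10^9 cycles/s × 86,400 s/day × 0.9 ≈ 1.6×10^14 idle cycles per day. That's an upper bound on the NOP count, since the Windows idle loop actually spends much of its time halted rather than executing NOPs.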

--TheMuuj
quote:Original post by ga
It should be printed somewhere on the power supply of your PC; it's probably 200-300 W. I think you'll find power consumption information on nearly all electrical devices.


Just because the power supply is rated for that many watts doesn't mean it constantly draws them. While your CD-ROM isn't spinning, the hard drive is inactive, the CPU is executing NOPs, the video card isn't using its 3D section, etc., it won't even use half of that. The CPU is rated at somewhere around 60 watts on average (can be less, can be more), so if that's the only component doing anything, you're not using the full 300-350 W listed on your power supply.
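
For a rough illustrative sum (ballpark figures for typical parts of this era, not measurements): CPU ~60 W + hard drive ~10 W + motherboard and RAM ~25 W + video card idling in 2D ~20 W ≈ 115 W, comfortably under half of a 300-350 W supply rating.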


--- Edit ---
Stupid double quoting doesn't work properly.

[edited by - Ready4Dis on May 2, 2003 10:35:07 PM]
quote:Original post by Thunder_Hawk
... code bloat actually saves more energy than it wastes, but I could be wrong. To actually determine this would require a lot of energy in itself.


Energy is important. Writing bloated code saves the programmer's energy, but not the PC's.

I'm glad I'm not the only one who uses his PC 24/7, especially since I have a 24/7 internet connection...

The processor can consume quite a bit of energy, but so do many other devices. I believe it uses more than a TV, but both run at the same time anyway.

But it shouldn't be a horribly large amount.
Please AGREE or I will LIBERATE you

This topic is closed to new replies.
