Wasting energy with excess computer code

eh, this is just the breeder solar system anyway ... just like a baby's crib. When (if) we develop into a space-faring race, we'll just chew up the asteroids and resources of this solar system in a grand expansion project anyway. So why worry about it? Either we get off this rock and die off some day many thousands of years in the future, or we stay on this rock and die a little sooner.

In the meantime, we'll create and play a million "games" simulating exactly these situations.

More computers == more power == faster burn rate == greater need to expand.

More computing power + greater need to expand --> greater chance of space-faring developments.
"If you completely over use the c preprocessor to transfer C into your own custom language this can help cut the power consumption of your PC by almost 40% when running your programs."

You might be right about GOTO, but I have to disagree with you here.
I'm pretty sure all the preprocessor can do is convert "your custom language" (i.e., your macros) into regular code, which really gives you nothing for a performance _gain_. But it can sure make code look cleaner.
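
For example (just a sketch; the REPEAT and PRINT macros below are names I made up for illustration), a macro "language" like this expands to ordinary C before the compiler ever sees it:

    #include <stdio.h>

    /* A toy "custom language" built from macros.  The preprocessor
       expands these into plain C, so the compiled code is the same
       as if you had written the loop and the printf by hand. */
    #define REPEAT(n)  for (int i_ = 0; i_ < (n); ++i_)
    #define PRINT(s)   printf("%s\n", (s))

    int main(void)
    {
        REPEAT(3)
            PRINT("same code after macro expansion");
        return 0;
    }

Run it through gcc -E and you get back the plain C you could have written yourself, so there are no cycles (and no watts) for the preprocessor to save.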
this place is lame
quote:Original post by Thunder_Hawk
I'm giving you an award for the perfect title. I could tell your exact worry just from reading it. Considering the number of computers out there, it seems this has to be a significant problem. However, consider the amount of energy wasted either programming more efficient programs or doing the work by hand. Humans require quite a bit of energy themselves. I think this view reveals that code bloat actually saves more energy than it wastes, but I could be wrong. Actually determining this would require a lot of energy in itself.

[edited by - Thunder_Hawk on May 2, 2003 8:06:45 PM]



I concur.


-~-The Cow of Darkness-~-
quote:
Bloated software is the number one environmental problem of this century.


Hear, hear.

[edited by - barn door on May 5, 2003 3:11:14 PM]
"Bloated software is the number one environmental problem of this century."

WTF IS THAT?????

Who cares if it's bloated or not? Bloated software is the future of software anyway ;p

Please AGREE or I will LIBERATE you
BTW, Windows executes HLTs, turning off the CPU when it's not doing anything, not NOPs.
quote:Original post by The Heretic
BTW, Windows executes HLTs, turning off the CPU when it's not doing anything, not NOPs.


Windows NT uses HLTs. Or at least everyone says it does... I have seen no proof. I don't know how to find out for sure, other than running NT on some sort of emulator. For all I know it's executing XOR AX, 0.

Windows 95 uses NOPs, which is why there were programs written (like RAIN or CPU-IDLE) that would execute HLTs. I've heard that Windows 98 can use either, but I'm pretty sure NOP was the default.
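
To make the difference concrete, here is a rough sketch of the two idle strategies (not Windows' actual idle code; it assumes GCC-style inline assembly on x86, and HLT is a privileged instruction, so only ring-0/kernel code can execute it):

    /* Two ways for an idle loop to "do nothing". */

    static void idle_nop(void)
    {
        for (;;)
            __asm__ volatile ("nop");   /* spins; the CPU keeps executing at full speed */
    }

    static void idle_hlt(void)
    {
        for (;;)
            __asm__ volatile ("hlt");   /* stops the CPU until the next interrupt arrives */
    }

The NOP loop keeps the CPU fully busy doing nothing, while HLT suspends execution until an interrupt, which is why those HLT utilities were reported to lower CPU temperatures, at least on chips whose HLT really powers down parts of the core.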

While the HLT instruction may turn off parts of the CPU, this is not a guarantee. I've seen reports that some desktop processors implement the HLT in a manner similar to NOP. Once again, I have no real evidence, and welcome anyone who does to step forward.

Seeing as I notice almost no temperature difference on my PC between running idle and running at 100%, I remain skeptical. I do notice a difference when I fire up a 3D game, but the video card is so close to the processor that I am not surprised.

--TheMuuj
actually, when you run a good 3D game, both the CPU and the video card use more power ... the video cards are made to leave unneeded texel engines and such in standby while not in use ... in the same way, the MMX, SSE, SSE2, and MAYBE even the x87 sections of a CPU are not running at full power when not in use (although I bet x87 is always active on current CPUs).
