cores and energy

Started by
17 comments, last by alh420 10 years, 7 months ago

As far as I know, 8-core AMD processors are available on the market these days (not sure if I am mistaken, I have a weak grasp of hardware). I would like to buy one rather than a 4-core, why not, but the thing that worries me is energy. Doesn't an 8-core CPU use nearly 8 times as much energy as a single core? Will this keep multiplying as core counts increase? When I read opinions that in a couple of years we will have 60-core processors, or some years later 1000-core systems, I doubt it because of this energy question - who would want to run a 10-kilowatt machine at home :/

(not me ;/ I would like at most a 200W system, better yet 100W)


If your only goal is to get a processor with low power usage, there is such a thing for that.

Intel's latest processor family, Haswell, includes low-watt processors designed for tablets and other fanless portable computers. There are 28W and 15W varieties, and even a 4.5W dual-core chip with limited processing power.

If you are going for the high-end 4770K chip, the max power is 84W. If you think about the number-crunching abilities it has when running at full power and compare it to the history of computing, that power usage is pretty low.

On most game-related computers it is the video card that sucks down power. The GTX690 draws up to 300W, the GTX590 used up to 365W.


Are multicore processors many times more energy-consuming, or can this be magically reduced, and how? This is an important question to know.

Are you trying to choose a CPU for a laptop? Because otherwise a 50W difference between AMD and Intel CPUs is just irrelevant.

Are multicore processors many times more energy-consuming, or can this be magically reduced, and how? This is an important question to know.


They use dynamic frequency scaling and other techniques so that they don't run at full speed all the time, which saves energy and generates less heat. They only draw full power when you give them a lot of work to do.

Are multicore processors many times more energy-consuming, or can this be magically reduced, and how? This is an important question to know.


Power consumption does not necessarily scale linearly with the number of cores. For instance, many parts are shared between cores. Also, unused cores consume much less power, just as a single-core CPU consumes less power when idle.

Your best bet is to look at the specs of the specific CPUs you have in mind. They should list peak, minimum and average power consumption.
The fabrication process affects the size of the components (transistors), which affects the power requirements.

If you simply copied an old core design, fabricated in the same way but with 8 of them next to each other, then yes, power usage might go up by ~8x (disregarding variable frequencies, shared components, etc).
But if you redesign the chip with components that are 1/8th the size that they used to be, then power usage might even go down ;)
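To put rough numbers on that argument, here is a minimal sketch using the dynamic power formula discussed later in this thread (P = 0.5 · C · V² · f). All the capacitance, voltage, and frequency values are made up for illustration; real figures vary by chip and process:

```python
def dynamic_power(cores, capacitance, voltage, frequency):
    """Dynamic power estimate: P = 0.5 * C * V^2 * f, summed over active cores."""
    return cores * 0.5 * capacitance * voltage ** 2 * frequency

# One hypothetical core: 1 nF switched capacitance, 1.4 V, 2.4 GHz.
single = dynamic_power(1, 1e-9, 1.4, 2.4e9)

# Eight copies of the same core on the same process: exactly 8x the power.
eight_same = dynamic_power(8, 1e-9, 1.4, 2.4e9)

# Eight cores on a shrunk process: lower capacitance and voltage per core,
# so the total comes out far below 8x the original single-core figure.
eight_shrunk = dynamic_power(8, 0.4e-9, 1.0, 2.4e9)

print(single, eight_same, eight_shrunk)
```

The point of the sketch is the shape of the formula, not the absolute numbers: because voltage enters squared, a process shrink that lowers both capacitance and voltage claws back most of the 8x multiplier.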

Circuits on VLSI chips have two main sources of power consumption: dynamic and static power dissipation. Dynamic power has already largely been described in this topic. It is the result of the circuit toggling in response to demand from the "code" you're running. The formula that governs this power consumption is: P = 0.5 · C · V² · f

C = total switched capacitance

V = voltage

f = frequency

As you can see, power consumption is most affected by voltage, since it increases with the square of the voltage. The single most effective way to control dynamic power is to reduce voltage. But when you reduce voltage, the circuit will no longer be able to operate at its maximum frequency, so frequency must be reduced as well. Someone already mentioned it earlier, but this is how dynamic frequency scaling works. A table is created mapping each frequency to the minimum voltage that supports it. The OS requests a frequency and/or voltage and the hardware adjusts automatically.
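A toy version of such a frequency-to-voltage table makes the payoff concrete. The operating points below are invented for illustration (real tables are chip-specific), but they show why halving the clock more than halves dynamic power: the lower frequency also permits a lower voltage, and voltage counts twice.

```python
# Hypothetical DVFS operating points: frequency (MHz) -> minimum stable voltage (V).
# These numbers are made up for illustration; real tables are chip-specific.
DVFS_TABLE = {
    800:  0.9,
    1600: 1.0,
    2400: 1.1,
    3200: 1.25,
}

def power_at(freq_mhz, capacitance_nf=1.0):
    """Dynamic power P = 0.5 * C * V^2 * f at a given operating point."""
    v = DVFS_TABLE[freq_mhz]
    return 0.5 * capacitance_nf * 1e-9 * v ** 2 * freq_mhz * 1e6

# Dropping from 3200 MHz to 1600 MHz cuts dynamic power by more than 2x,
# because the voltage drops from 1.25 V to 1.0 V at the same time.
print(power_at(3200) / power_at(1600))
```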

The other component of power consumption is static power dissipation, otherwise known as leakage current. Leakage current is undesirable current that performs no useful function (typically, although research is being done into computing with leakage power at ultra-low frequencies). Even when an individual transistor is turned off, it still conducts some current. In older technologies (> 65nm feature size) this power consumption was negligible. In modern devices, leakage can approach 30-50% of the total power used!

Leakage power arises from fundamental quantum theory. Electrons effectively teleport across an insulating barrier in an effect known as tunneling. The smaller devices become, the more tunneling occurs.

There's more to learn if you're interested. For "plug-into-the-wall" PCs and servers, power consumption is less of a concern, although end users (server farms, etc.) are looking to reduce their power footprint because it is becoming a major operational cost. Handhelds use intelligent circuit design to control voltage and frequency: dynamic frequency scaling, automatic voltage scaling, temperature-based clock throttling (so your IC doesn't melt), instruction-based clock throttling (high-power assembly code triggers power dampening), and so on.
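Temperature-based clock throttling boils down to a simple control loop: step the clock down when the die gets too hot, step it back up when there is headroom. A minimal sketch, with all frequencies and temperature thresholds invented for illustration:

```python
# Hypothetical operating points and thermal limits; real values are chip-specific.
FREQ_STEPS_MHZ = [3200, 2400, 1600, 800]  # fastest first
TEMP_LIMIT_C = 90.0

def pick_frequency(current_mhz, temp_c):
    """Step down one operating point when over the limit, step up when cool."""
    i = FREQ_STEPS_MHZ.index(current_mhz)
    if temp_c > TEMP_LIMIT_C and i < len(FREQ_STEPS_MHZ) - 1:
        return FREQ_STEPS_MHZ[i + 1]   # throttle down to shed heat
    if temp_c < TEMP_LIMIT_C - 10 and i > 0:
        return FREQ_STEPS_MHZ[i - 1]   # headroom available, clock back up
    return current_mhz                 # hold steady in the hysteresis band

print(pick_frequency(3200, 95.0))  # 2400: over the limit, throttle down
print(pick_frequency(1600, 70.0))  # 2400: running cool, ramp back up
```

The 10-degree gap between the throttle-down and ramp-up thresholds is deliberate hysteresis, so the clock doesn't oscillate between two steps every control tick.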

I keep getting the feeling that aside from standard benchmarks, which show frequency and voltage, there should also be requirements for temperature.


If you simply copied an old core design, fabricated in the same way but with 8 of them next to each other, then yes, power usage might go up by ~8x (disregarding variable frequencies, shared components, etc).
But if you redesign the chip with components that are 1/8th the size that they used to be, then power usage might even go down ;)
Let's put it in real-world data. It's easy to do that with AMD chips since their architecture basically stayed the same for years!
  • AMD Athlon 3000+: clocked at 2400MHz (Venice core), 90nm. Launched April 2005. 89W
  • AMD Athlon II 450: 3 cores at 3200MHz (roughly equivalent to a 4GHz Venice), 45nm SOI. Launched October 2009. 95W

Previously "Krohm"

This topic is closed to new replies.
