VLSI circuits have two main sources of power consumption: dynamic and static dissipation. Dynamic power has already largely been described in this topic: it is the result of the circuit toggling in response to demand from the code you're running. The formula that governs this power consumption is: P = 0.5 · C · V² · f
C = total switched capacitance
V = supply voltage
f = switching frequency
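Plugging in some numbers makes the quadratic voltage dependence concrete. The capacitance, voltage, and frequency below are made up for illustration, not taken from any real chip:

```python
# Dynamic power: P = 0.5 * C * V^2 * f
# Illustrative numbers only -- not from a real datasheet.

def dynamic_power(c_farads, v_volts, f_hertz):
    """Switching power of a CMOS circuit in watts."""
    return 0.5 * c_farads * v_volts**2 * f_hertz

# 1 nF of switched capacitance at 1.2 V and 2 GHz:
print(dynamic_power(1e-9, 1.2, 2e9))   # ~1.44 W

# Dropping the voltage 25% (to 0.9 V) cuts power ~44%,
# even before the frequency is reduced:
print(dynamic_power(1e-9, 0.9, 2e9))   # ~0.81 W
```

Notice that a 25% voltage cut nearly halves the power, which is exactly why voltage is the first knob to turn.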
As you can see, power consumption is most affected by voltage, since power grows with its square. The single most effective way to control dynamic power is to reduce voltage. But at a lower voltage the circuit can no longer run at its maximum frequency, so frequency must be reduced as well. Someone already mentioned it earlier, but this is how dynamic voltage and frequency scaling works: a table maps each frequency to the minimum voltage that sustains it. The underlying OS requests a frequency and/or voltage, and the hardware adjusts automatically.
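That frequency/voltage table can be sketched as a simple lookup. The operating points below are invented for illustration (real tables live in firmware or ACPI-style descriptors):

```python
# Hypothetical DVFS operating-point table: each entry pairs a frequency
# ceiling (MHz) with the minimum voltage (V) that sustains it.
# All values invented for illustration.
OPERATING_POINTS = [  # (max_freq_mhz, voltage_v), sorted ascending
    (400, 0.80),
    (800, 0.90),
    (1200, 1.00),
    (1600, 1.10),
]

def select_operating_point(requested_mhz):
    """Return the lowest (freq, voltage) pair that meets the request."""
    for freq, volts in OPERATING_POINTS:
        if freq >= requested_mhz:
            return freq, volts
    return OPERATING_POINTS[-1]  # clamp to the highest point

print(select_operating_point(700))   # (800, 0.9)
```

The point of sorting ascending and taking the first match is that the OS always lands on the lowest voltage that still meets its performance request.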
The other component of power consumption is static dissipation, otherwise known as leakage current. Leakage is undesirable current that performs no useful function (typically, anyway; there is research into computing at ultra-low frequencies using leakage itself). Even a transistor that is switched off still conducts a small current. In older process technologies (feature sizes above 65 nm) this power was negligible. In modern devices, leakage can account for 30-50% of the total power used!
Leakage power comes from fundamental quantum mechanics: electrons pass straight through an insulating barrier in an effect known as quantum tunneling. The smaller devices become, the thinner their barriers and the more tunneling occurs.
There's more to learn if you're interested. For "plug-into-the-wall" PCs and servers, power consumption is less of a concern, although end-users (server farms, etc.) are looking to reduce their power footprint because it is becoming a major operational cost. For handhelds, intelligent circuit design is used to control voltage and frequency: dynamic frequency scaling, automatic voltage scaling, temperature-based clock throttling (so your IC doesn't melt), instruction-based clock throttling (high-power assembly sequences trigger power dampening), and so on.
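Temperature-based clock throttling is conceptually just a feedback loop that steps down the frequency ladder when a die-temperature threshold is crossed. Everything below (thresholds, frequency steps) is hypothetical; real hardware does this in firmware:

```python
# Sketch of temperature-based clock throttling with hysteresis.
# Thresholds and frequency steps are hypothetical examples.
THROTTLE_TEMP_C = 95.0   # step down above this
RESUME_TEMP_C = 85.0     # step back up below this (hysteresis)
FREQ_STEPS_MHZ = [400, 800, 1200, 1600]

def next_freq_index(current_index, die_temp_c):
    """Move one step down the frequency ladder when hot, up when cool."""
    if die_temp_c > THROTTLE_TEMP_C and current_index > 0:
        return current_index - 1
    if die_temp_c < RESUME_TEMP_C and current_index < len(FREQ_STEPS_MHZ) - 1:
        return current_index + 1
    return current_index

# At 1600 MHz and a 98 C die temperature, drop one step:
print(FREQ_STEPS_MHZ[next_freq_index(3, 98.0)])  # 1200
```

The gap between the two thresholds is deliberate: without hysteresis the clock would oscillate between steps as the die heats and cools around a single trip point.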
I keep getting the feeling that, aside from standard benchmarks showing frequency and voltage, there should also be requirements for temperature.