
Why do graphics-intensive games use a lot of energy?


#1 warnexus   Prime Members   -  Reputation: 1381


Posted 31 May 2014 - 09:00 AM

I know that the GPU and CPU both play a role. Is it necessarily because the CPU needs to execute a lot of cycles per instruction (CPI)?

Does pipelining the instructions consume less battery, then?

I'm not sure if I'm making any sense here, but I thought I'd at least come up with an educated guess just to start a discussion.


#2 Juliean   GDNet+   -  Reputation: 2229


Posted 31 May 2014 - 10:06 AM

The more you calculate, the more energy you use. If you run some ultra-low-end 2D game without capping the framerate, letting it render at, say, 5000 FPS, it will use just as much energy. The only real difference between a graphically intensive game and a simpler one shows up when you e.g. enable VSYNC: the game that could run at 5000 FPS can let the CPU/GPU idle most of the time at 60 FPS, while the high-end game that barely reaches 60 FPS even without VSYNC keeps both working full time.
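To illustrate, a minimal frame cap might look something like this (just a sketch; update() and render() are placeholders for whatever the game actually does each frame):

```cpp
#include <chrono>
#include <thread>

// Minimal sketch of a 60 FPS frame cap. update() and render() stand in
// for the real per-frame work.
void run_capped_loop()
{
    using clock = std::chrono::steady_clock;
    const auto frame_time = std::chrono::microseconds(1'000'000 / 60); // ~16.7 ms

    auto next_frame = clock::now();
    for (;;)
    {
        // update();
        // render();

        // Sleep until the next frame is due. While the thread sleeps,
        // the CPU (and the GPU, once its queue drains) can drop into a
        // low-power state instead of spinning at 5000 FPS.
        next_frame += frame_time;
        std::this_thread::sleep_until(next_frame);
    }
}
```

With the cap removed, the same loop simply burns every cycle it can get, which is where the extra energy goes.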



#3 Shane C   Crossbones+   -  Reputation: 1103


Posted 31 May 2014 - 01:12 PM

My laptop's GPU drains the battery significantly faster than its CPU does. So whenever a game starts up, I expect battery life to plummet.

 

Think of the GPU as a processor that draws more power than the CPU: it stays powered down when no game or 3D eye candy is being displayed, and powers up when there is.



#4 Andy Gainey   Members   -  Reputation: 1806


Posted 31 May 2014 - 01:26 PM

Another contributing factor is that modern CPUs are typically data-starved, because the speed of memory hasn't kept up with the speed of the processor over the years. So every time there is a cache miss, a large fraction of the transistors in your CPU sit idle while the data is fetched from main memory (or even just from a level 2 or 3 cache). I don't know about other OSes, but Windows still reports this as 100% CPU utilization, even though it is very far from 100% transistor utilization.
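You can see the effect with a toy comparison: sum the same array once sequentially and once with a page-sized stride (a rough sketch, not a proper benchmark; the numbers will vary a lot by hardware):

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

// Toy illustration of a data-starved core: both loops add up the same
// ~16M ints, but the strided loop jumps a whole page between accesses,
// so it misses the cache constantly and spends most of its time waiting
// on memory -- while still showing up as "100% CPU utilization".
int main()
{
    const std::size_t n = std::size_t(1) << 24;      // ~16M ints (64 MB)
    const std::size_t stride = 4096 / sizeof(int);   // one 4 KB page per step
    std::vector<int> data(n, 1);
    long long sum = 0;

    auto t0 = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < n; ++i)              // sequential: cache/prefetcher friendly
        sum += data[i];
    auto t1 = std::chrono::steady_clock::now();

    for (std::size_t off = 0; off < stride; ++off)   // strided: same work, far more misses
        for (std::size_t i = off; i < n; i += stride)
            sum += data[i];
    auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::milliseconds;
    std::printf("sum=%lld  sequential=%lld ms  strided=%lld ms\n", sum,
                (long long)std::chrono::duration_cast<ms>(t1 - t0).count(),
                (long long)std::chrono::duration_cast<ms>(t2 - t1).count());
}
```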

 

GPUs, on the other hand, organize data and operations much more predictably and efficiently, and can thus keep a far larger portion of their transistors operating consistently.



"We should have a great fewer disputes in the world if words were taken for what they are, the signs of our ideas only, and not for things themselves." - John Locke

#5 GuardianX   Crossbones+   -  Reputation: 1484


Posted 31 May 2014 - 01:29 PM

Pipelining the instructions increases the speed of your application, which ends up consuming less battery than an application that takes longer to execute. Maybe I didn't get your question right =)



#6 Promit   Moderators   -  Reputation: 6096


Posted 31 May 2014 - 02:17 PM

Allow me to add slightly more technical detail here. Each transistor on a processor consumes power while it's active -- a certain amount while open, less while closed. There's also energy consumed in switching, but even when not in use they are draining some power. Most chips, whether CPUs or GPUs, are divided into large groups of transistors or higher-level units. These units can be powered off independently, allowing the chip to drastically cut back on its power use when mostly idle. So when you're doing light computing tasks, both the CPU and GPU shut off most of the chip and run in minimal modes. Historically, GPUs also integrated a very low-power 2D mode that was used for desktop interface stuff; since user interfaces are now mostly 3D rendered, that particular mode is no longer present.

Pretty much all modern CPU and GPU chips run at flexible clock speeds and voltages as well. When the system reduces clock speeds, it's able to reduce supply voltages too, which cuts back on internal current leakage and overall power consumption quite a bit.
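The usual first-order approximation for this (textbook CMOS, not anything specific to one chip) is that dynamic power scales with switching activity, switched capacitance, the square of the supply voltage, and the clock frequency:

$$P_{\text{dynamic}} \approx \alpha \, C \, V^{2} \, f$$

where $\alpha$ is the fraction of the chip switching each cycle, $C$ the switched capacitance, $V$ the supply voltage, and $f$ the clock. The $V^{2}$ term is why dropping frequency and voltage together saves so much more power than dropping frequency alone.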

 

When you fire up a graphics-intensive game, it forces the clocks to maximum and all internal processing units to fully active. The memory chips begin switching at high speed, which pulls more power. Heat increases, which will also spin up at least one internal fan. Depending on the game, the hard drive or optical drive may be brought up to speed too, which will also increase power usage. Even bringing the network interfaces to full speed adds to the draw.
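If you want to watch the clocks ramp up yourself, on Linux you can poll the cpufreq interface while a game is starting (a rough sketch; the sysfs path below is the common cpufreq location, but it depends on the kernel and driver and may not exist on every machine):

```cpp
#include <chrono>
#include <fstream>
#include <iostream>
#include <string>
#include <thread>

// Rough sketch: print core 0's current clock once a second.
// The path is the usual Linux cpufreq location; it may be absent or
// differ depending on the cpufreq driver in use.
int main()
{
    const std::string path =
        "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";

    for (;;)
    {
        std::ifstream f(path);
        long khz = 0;
        if (f >> khz)
            std::cout << khz / 1000 << " MHz\n";   // sysfs reports kHz
        else
            std::cout << "cpufreq not available at " << path << "\n";
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}
```

Launch a game in another window and you should see the reported frequency jump to the chip's maximum, with the fans following shortly after.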


Edited by Promit, 31 May 2014 - 02:18 PM.


#7 Chris_F   Members   -  Reputation: 1938


Posted 31 May 2014 - 02:59 PM


Each transistor on a processor consumes power while it's active -- a certain amount while open, less while closed. There's also energy consumed in switching, but even when not in use they are draining some power.

 

Are you sure about that? I thought transistors in a CMOS configuration use negligible energy when in a stable state.


Edited by Chris_F, 31 May 2014 - 03:00 PM.


#8 Shane C   Crossbones+   -  Reputation: 1103


Posted 31 May 2014 - 03:38 PM

Basically, tablets/laptops/etc. are built to have fantastic battery life for things like web browsing, but not for things like games...

 

One example of this is that I believe the Tegra 4 has an extra processor it uses when idle, separate from the main quad-core, which it then basically shuts off. That could probably almost never happen during a game.

 

Let's say a tablet has a 10 hour battery life during web-browsing and idle use...

 

It might be 7.5 hours on a CPU intensive program.

 

It might be 4.5 hours in a CPU/GPU intensive program like a game.

 

The reality is, it's hard to engineer games toward good battery life, so they get the short end of the stick.


Edited by Shane C, 31 May 2014 - 03:38 PM.


#9 Madhed   Crossbones+   -  Reputation: 2492


Posted 31 May 2014 - 05:24 PM

 


Each transistor on a processor consumes power while it's active -- a certain amount while open, less while closed. There's also energy consumed in switching, but even when not in use they are draining some power.

 

Are you sure about that? I thought transistors in a CMOS configuration use negligible energy when in a stable state.

 

 

Correct, most power is dissipated during switching. The negligible current while in a stable state is actually the reason CMOS is used.



#10 Promit   Moderators   -  Reputation: 6096


Posted 31 May 2014 - 05:45 PM

 

 


Each transistor on a processor consumes power while it's active -- a certain amount while open, less while closed. There's also energy consumed in switching, but even when not in use they are draining some power.

 

Are you sure about that? I thought transistors in a CMOS configuration use negligible energy when in a stable state.

 

 

Correct, most power is dissipated during switching. The negligible current while in a stable state is actually the reason CMOS is used.

 

Negligible times a billion turns out to be not all that negligible. I'm not sure how much power is lost to internal current leakage, though.






