
Game message loop - using 50% of CPU when window is minimized or inactive


Hey,

This is what my game loop looks like:

MSG msg;
ZeroMemory(&msg, sizeof(msg));

while(msg.message != WM_QUIT)
{
    if(PeekMessage(&msg, NULL, 0U, 0U, PM_REMOVE))
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    else
    {
        if( !Render() ) // Render() does all the game's drawing and update calculations
        {
            MessageBox(NULL, "Failed to render", "Error", MB_OK | MB_ICONEXCLAMATION);
            PostQuitMessage(0);
        }

    }
}

I'm not actually sure what happens in the PeekMessage(...) call. When the game window is active and has focus, everything seems to work fine: PeekMessage seems to give my game CPU time according to its needs, more when there's collision handling going on, less when nothing happens at all. However, I see a huge jump in CPU usage when I minimize the window. I probably have a dual-core processor, and that's why it's taking 50% of the CPU then.
When I remove this block of code from my game loop:

if(PeekMessage(&msg, NULL, 0U, 0U, PM_REMOVE))
{
    TranslateMessage(&msg);
    DispatchMessage(&msg);
}

it takes 50% of the CPU all the time, which is why I assumed that PeekMessage somehow doles out CPU time. A simple infinite loop like this:

while(1)
{}

will also take 50% of the CPU.

 

Why is that happening, and how can I prevent this CPU usage without resorting to Sleep()? I would be very grateful for any help!


PeekMessage just checks if there's a Windows message your app needs to process. It doesn't pause the app or anything. That's what Peek implies. It says "Just check and return true if there's a message waiting and false if there isn't, but be sure to return immediately; don't wait for a message to come if there isn't one."

 

Programs run as fast as the CPU lets them. If your program is single-threaded and you have a dual-core computer, then yes, it will only show 50% overall CPU usage when running at full speed (100% of one core).

 

Hodgman lists some good options here. Another option is to enter a different game loop when your game is minimized that uses WaitMessage() (and then switch back to using PeekMessage when your game is no longer minimized).
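For illustration, here is a rough sketch of that idea. It assumes a g_Minimized flag that the window procedure sets (an approach shown further down in this thread); none of this is from the original post:

MSG msg;
ZeroMemory(&msg, sizeof(msg));

while(msg.message != WM_QUIT)
{
    if(PeekMessage(&msg, NULL, 0U, 0U, PM_REMOVE))
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    else if(g_Minimized)
    {
        // Queue is empty and the window is minimized: block until a new
        // message arrives instead of spinning. CPU usage drops to near 0%.
        WaitMessage();
    }
    else
    {
        Render(); // visible and idle: run a normal game frame
    }
}

If the multiplayer networking runs on this same thread, blocking indefinitely may not be acceptable; MsgWaitForMultipleObjects with a timeout is the usual way to wait on the message queue for at most a fixed interval.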


Thanks for the answer. The problem is also that my game has a multiplayer mode, so I can't stall the application, because the client will time out and disconnect from the server. Even if programs run as fast as they can, I still can't understand why my game doesn't use 50% of the CPU all the time. PeekMessage must be doing something that prevents it from using the maximum CPU, I guess.


Maybe your graphics card isn't able to accept more commands and your CPU is waiting for the GPU to finish?

 

Cheers!


Since the rendering is probably skipping a lot of its actual work when the window is minimized, you're ending up with a very tight loop that's essentially calling PeekMessage() endlessly. You may want to place a short loop in there somewhere that calls SwitchToThread() until a certain amount of time has elapsed before processing the next frame.
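A minimal sketch of that suggestion, assuming a GetTickCount64-based timer and a ~16 ms frame budget (both are illustrative choices, not from the post):

const ULONGLONG kFrameIntervalMs = 16;              // assumed target: roughly 60 fps
static ULONGLONG s_lastFrameTime = GetTickCount64();

// Yield the rest of our time slice until a frame's worth of time has passed.
// Note: SwitchToThread() returns immediately if no other thread is ready to
// run, so this still spins lightly rather than sleeping outright.
while(GetTickCount64() - s_lastFrameTime < kFrameIntervalMs)
{
    SwitchToThread();
}
s_lastFrameTime = GetTickCount64();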


It's possible you have vsync enabled and your program is capped at rendering at ~60 fps. Since your update rate is directly tied to your render rate (judging by your loop), you would see lower CPU usage because the program stalls, waiting for the video card to allow it to draw due to vsync.

 

When you minimize, it's possible the program no longer actually renders, so vsync no longer limits it to 60 fps, and the loop maxes out the CPU.

 

Just a possibility.

 

PeekMessage() doesn't save any CPU, though. That's for sure.
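The thread never says which graphics API is in use. Purely as an illustration, if it happened to be Direct3D 9, vsync is chosen through the presentation interval when the device is created (fragment only, other members omitted):

// Illustrative only (assuming Direct3D 9): the presentation interval decides
// whether Present() waits for the vertical blank.
D3DPRESENT_PARAMETERS d3dpp = {};
d3dpp.Windowed             = TRUE;
d3dpp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;         // vsync on: Present() blocks until vblank
// d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE; // vsync off: Present() returns as fast as possible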


It's because the program is single-threaded and maxes out one core. If you have a dual-core CPU, or a single core with Hyper-Threading, that shows up as 50% total usage. That's the usual reason.


Thanks for the answers! I think it's true that it was due to the lack of a constant framerate. After adding a framerate cap, the program no longer consumes the maximum of one core.


You should empty the message queue every frame, not one message per frame.

MSG msg;

while(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
{
    if(msg.message == WM_QUIT)
        return false;

    TranslateMessage(&msg);
    DispatchMessage(&msg);
}

if(g_Minimized)
{
    Sleep(1);      // window is minimized: give the CPU back to the OS
}
else
{
    RenderFrame();
}

You can tell when the window is minimized in your window procedure:

switch(message)
{
case WM_SYSCOMMAND:
    // In WM_SYSCOMMAND the low four bits of wParam are used internally by
    // the system, so mask them off before comparing.
    switch (wParam & 0xFFF0)
    {
        case SC_MINIMIZE:
            g_Minimized = true;
            break;
    }
    // Let DefWindowProc perform the actual minimize.
    return DefWindowProc(hWnd, message, wParam, lParam);
}


You'll want to watch for it to be restored as well when g_Minimized is true. (SC_MAXIMIZE or SC_RESTORE)
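Following that suggestion, the earlier window-procedure snippet could be extended roughly like this (still assuming the g_Minimized flag):

case WM_SYSCOMMAND:
    switch (wParam & 0xFFF0)
    {
        case SC_MINIMIZE:
            g_Minimized = true;
            break;
        case SC_RESTORE:
        case SC_MAXIMIZE:
            g_Minimized = false;   // window is visible again: resume rendering
            break;
    }
    return DefWindowProc(hWnd, message, wParam, lParam);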



You should empty the message queue every frame, not one message per frame.

 

The original code is actually fine. If there is a message, no frame is rendered. Rendering only occurs when all messages have been processed.

 Yes, I noticed that now. However, I think that code looks tangled.


Thanks for the answer. The problem is also that my game has a multiplayer mode, so I can't stall the application, because the client will time out and disconnect from the server. Even if programs run as fast as they can, I still can't understand why my game doesn't use 50% of the CPU all the time. PeekMessage must be doing something that prevents it from using the maximum CPU, I guess.

 

CPU utilization only drops when something you call blocks. PeekMessage is a non-blocking call; something else is causing the drop.

When an OS call blocks, the OS task-switches to other programs while it waits for your blocking call to finish. If nothing else needs to run, the CPU goes idle and your utilization drops.

 

Since your CPU utilization goes *up* when you minimize, this strongly suggests to me that you are making a blocking call in your graphics code.

Once you minimize, the rendering context or device is no longer in a presentable state, so the graphics routines return an error (instead of blocking) and your game loop runs flat out.
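As a hedged illustration of how that failure mode might be handled, assuming Direct3D 9 and a device pointer named g_pDevice (neither is confirmed by the thread):

// If Present() stops blocking and starts failing while minimized (e.g. a lost
// device), throttle the loop instead of letting it spin flat out.
HRESULT hr = g_pDevice->Present(NULL, NULL, NULL, NULL);
if(FAILED(hr))
{
    Sleep(10);   // can't draw right now, so don't burn a core
    // ...recover/reset the device once the window is restored...
}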

I'm guessing it's a vsynced buffer swap that blocks when the window is not minimized.


To check directly whether your window is minimized, you can also use the Win32 API IsIconic. The name stems from the days when minimized windows were shown as icons :)
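A minimal sketch of that check dropped into the else branch of the original loop (hwnd is assumed to be the game window's handle):

else
{
    if(IsIconic(hwnd))      // nonzero while the window is minimized
    {
        Sleep(1);           // minimized: yield instead of rendering
    }
    else if(!Render())
    {
        MessageBox(NULL, "Failed to render", "Error", MB_OK | MB_ICONEXCLAMATION);
        PostQuitMessage(0);
    }
}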
