You should empty the message queue every frame, not one message per frame.
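A minimal sketch of that pattern, assuming a standard Win32 message pump (the per-frame game work is left as a placeholder):

```cpp
#include <windows.h>

// Drain the entire message queue each frame, then do one frame of work.
// PM_REMOVE pops each message; the loop exits only once the queue is empty.
void PumpMessagesAndRunFrame(bool &running)
{
    MSG msg;
    while (PeekMessage(&msg, nullptr, 0, 0, PM_REMOVE))
    {
        if (msg.message == WM_QUIT)
        {
            running = false;
            return;
        }
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    // Queue is empty here: do exactly one update + render, e.g.
    // UpdateGame(); RenderFrame();   // placeholders for your own code
}
```

Because PeekMessage is non-blocking, this drains everything queued since the last frame and then renders, instead of handling one message per rendered frame.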
The original code is actually fine. If there is a message, no frame is rendered. Rendering only occurs when all messages have been processed.
Yes, I noticed that now. However, I think that code looks tangled.
Thanks for the answer. The problem is also that my game has a multiplayer mode, so I can't stall my application, because the client will time out and disconnect from the server. Even if you say that programs run as fast as they can, I still can't understand why my game doesn't use 50% of the CPU all the time. PeekMessage must be doing something that prevents it from using the maximum CPU, I guess.
The utilization will drop when something you call "blocks". PeekMessage is a non-blocking call. Something else is causing the slow down.
When an OS call blocks, it will task swap to other programs while it waits for your blocking call to finish. If nothing needs to run it goes to idle and your utilization will drop.
Since your CPU utilization goes *up* when you minimize, this strongly suggests to me that you are making a blocking call in your graphics code.

Once you minimize, the context or rendering device stops working normally, so the graphics routines return an error (instead of blocking) and your game loop runs flat out.
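One way to avoid spinning flat out in that situation is to yield the timeslice whenever the draw call fails or the window is minimized. A rough sketch, where `RenderFrame` and `g_hWnd` are hypothetical stand-ins for your own draw call and window handle:

```cpp
#include <windows.h>

extern HWND g_hWnd;        // assumed: your main window handle
extern bool RenderFrame(); // assumed: returns false if the device errors out

// One iteration of the game loop: if rendering fails (e.g. while
// minimized the device returns errors instead of blocking on v-sync),
// hand the CPU back to the OS instead of spinning at 100% of a core.
void GameLoopIteration()
{
    if (IsIconic(g_hWnd) || !RenderFrame())
    {
        Sleep(1); // yield until the window/device is usable again
    }
}
```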
I'm guessing it's a v-synced buffer swap that blocks when the window is not minimized.
To check directly whether your window is minimized, you can also use the Win32 API function IsIconic. The name dates back to the time when minimized windows were shown as icons :)
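A quick sketch of using it in the frame loop (`g_hWnd` stands in for your application's window handle):

```cpp
#include <windows.h>

extern HWND g_hWnd; // assumed: your main window handle

// Skip rendering entirely while the window is minimized,
// idling briefly so the loop doesn't burn CPU doing nothing.
void FrameTick()
{
    if (IsIconic(g_hWnd))
    {
        Sleep(10); // minimized: idle instead of rendering
        return;
    }
    // ... update and render as usual ...
}
```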