How to prevent game from taking 99% CPU?

14 comments, last by joanusdmentia 17 years, 8 months ago
Hello! Just a basic question: how do I prevent my game from taking 99% CPU while running the game's WHILE loop? I tried SDL_Delay() with anywhere from 1 to 5 milliseconds; it works, but then I don't get that smooth feeling (less FPS!) at all :( Please help :)
Windows - Dev-Cpp 4.9.9 - SDL - SDL_Net
Sure your game uses 100% CPU, because you ask it to generate as many frames per second as possible. That is no problem, since the game will probably be the only thing the user runs anyway, and he wants it to run as smoothly as possible.
As DaBono pointed out, you can't generate as many frames as possible and at the same time stop the application from using as much CPU as possible.
When you Sleep, or Delay as SDL calls it, you usually have a target update rate you wish to limit your application to. At the end of each frame you check how long the frame took; if it took less time than your target, you sleep for the remaining time, as in the sketch below.
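A minimal sketch of that idea using SDL's timer functions (the 60 FPS target and the runLoop wrapper are illustrative assumptions, not a definitive implementation):

```cpp
#include <SDL.h>   // SDL_GetTicks, SDL_Delay

// Cap the main loop at a target rate so the leftover time each frame
// is handed back to the OS instead of busy-spinning at 99% CPU.
const Uint32 TARGET_MS = 1000 / 60;   // assumed 60 FPS target (~16 ms/frame)

void runLoop(bool& running)
{
    while (running)
    {
        Uint32 frameStart = SDL_GetTicks();

        // ... poll events, update game state, render ...

        Uint32 elapsed = SDL_GetTicks() - frameStart;
        if (elapsed < TARGET_MS)
            SDL_Delay(TARGET_MS - elapsed);  // sleep off the remainder
    }
}
```

Keep in mind that SDL_Delay only guarantees a wait of at least the given time; the actual granularity depends on the OS scheduler, so the cap won't be exact.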
Best regards, Omid
Most monitors use a 60 Hz refresh rate (in Windows you can see it under Control Panel -> Display -> Settings -> Advanced). So it doesn't matter if your application can render more than 60 frames per second; your screen is still only updated at 60 Hz. So, if you want to cap your application at 60 Hz, you measure the time spent in the main loop, and if it is less than 1/60th of a second you sleep for the rest.
Perhaps I am getting confused here, but surely if I wrote two programs that both ran infinite while loops without sleeping, and then ran them both at the same time, wouldn't Windows decide how to portion out CPU time to each for me?

I thought Sleep() was just a way to offer CPU time to Windows. It'll take it anyway if it wants it, surely?
Quote:Original post by EasilyConfused
Perhaps I am getting confused here, but surely if I wrote two programs that both ran infinite while loops without sleeping, and then ran them both at the same time, wouldn't Windows decide how to portion out CPU time to each for me?

I thought Sleep() was just a way to offer CPU time to Windows. It'll take it anyway if it wants it, surely?


That's correct. Sleep() just lets the OS know the app doesn't need the CPU time right now, but not sleeping doesn't mean the OS won't take it anyway.
"Voilà! In view, a humble vaudevillian veteran, cast vicariously as both victim and villain by the vicissitudes of Fate. This visage, no mere veneer of vanity, is a vestige of the vox populi, now vacant, vanished. However, this valorous visitation of a bygone vexation stands vivified, and has vowed to vanquish these venal and virulent vermin vanguarding vice and vouchsafing the violently vicious and voracious violation of volition. The only verdict is vengeance; a vendetta held as a votive, not in vain, for the value and veracity of such shall one day vindicate the vigilant and the virtuous. Verily, this vichyssoise of verbiage veers most verbose, so let me simply add that it's my very good honor to meet you and you may call me V.".....V
You shouldn't say that most monitors use a 60 Hz vertical refresh rate. Many use 60, 70, 72, 75, 85, etc. You would need to query the OS for the refresh rate being used first.

"I can't believe I'm defending logic to a turing machine." - Kent Woolworth [Other Space]

The refresh rate needed to prevent visible flicker increases with screen size. It was originally deemed to be 60 Hz at 14"; at 19" it is above 65 Hz.

This is compounded by the fact that AC outlets and fluorescent lights operate at 60 Hz (in the US), so any monitor running at 60 Hz is going to have flicker problems detectable by a certain percentage of the populace. The recommended refresh rates are approximately 75 Hz below 21" and 85 Hz above 21".

This is distinct from the idea of the human eye creating the illusion of "continuous motion" at a given framerate. Film makers originally thought 18 frames per second was enough, but a large percentage of viewers do not fuse 18 frames into continuous motion, so they changed to 24. Video people often use 30, for conversion reasons, but no study has ever shown that 24-frame film is not close enough to ideal to be acceptable as perfect for 99% of the audience.

"Twitch" games, on the other hand, want higher frame rates not to preserve the illusion of continuity, but to deliver information to the user with the lowest possible latency. Latency is additive: the latency of your network, the game engine frame, the rendering, and the wait for retrace all add up. So if you want shooting to be accurate to a certain degree (say, 1/20th of a second) you need total latency below that, and as a general rule of thumb roughly twice as many frames per second, so that the desired delay is a maximum rather than an average.

Personally I feel that a single-player game maintaining 24-30 FPS is near perfect (if implemented right; some games have an extra frame of input latency, which feels weird). For multiplayer games I notice my accuracy of control and response improves up to about 45 FPS when running my monitor at 85 Hz, and all the way up to 60 FPS when at 60 Hz (because missing a frame at 60 Hz effectively drops you to 30 FPS for that frame).
Quote:Original post by Rattrap
You would need to query the OS for the refresh rate being used first.


How do you do that under Windows?
Quote:Original post by Lukewarm
Quote:Original post by Rattrap
You would need to query the OS for the refresh rate being used first.


How do you do that under Windows?

Using GetDeviceCaps(hdc, VREFRESH) or EnumDisplaySettings; I'm not sure which.
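A minimal sketch trying both (both are standard Win32 calls; note that GetDeviceCaps may return 0 or 1 to mean "hardware default"):

```cpp
#include <windows.h>
#include <cstdio>

// Query the current vertical refresh rate on Windows.
// Link against user32 and gdi32.
int main()
{
    // Option 1: GetDeviceCaps on the screen's device context.
    HDC screen = GetDC(NULL);
    int hz = GetDeviceCaps(screen, VREFRESH);  // 0 or 1 means "hardware default"
    ReleaseDC(NULL, screen);

    // Option 2: EnumDisplaySettings with ENUM_CURRENT_SETTINGS.
    DEVMODE mode;
    ZeroMemory(&mode, sizeof(mode));
    mode.dmSize = sizeof(mode);
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &mode))
        printf("GetDeviceCaps: %d Hz, EnumDisplaySettings: %lu Hz\n",
               hz, mode.dmDisplayFrequency);

    return 0;
}
```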

This topic is closed to new replies.
