How to limit FPS (without vsync)?

Started by
13 comments, last by Adehm 9 years, 1 month ago

Yo, I'm having a bit of trouble figuring out how to limit the FPS in my game engine. The engine runs at about 1000 fps, and I can already hear my graphics card squealing. This obviously isn't good, and it won't be good for anyone else playing a game made with my engine.

I want to know how to cap the framerate of the rendering part of the engine, so that the game renders far less often.

I am using a variable time step like so:


void GameSystem::GameLoop()
{
	previousTime = glfwGetTime();

	while(!glfwWindowShouldClose(glfwWindow))
	{
		// Variable time step: measure how long the previous frame took.
		currentTime = glfwGetTime();
		deltaTime = (currentTime - previousTime);
		previousTime = currentTime;

		profiler->TimeStampStartAccumulate(PROFILER_TIMESTAMP_FPS_UNLIMITED);

		InputMouseMovementProcessing();
		UpdateGameEntities();

		renderingEngine->StartRender();

		glfwPollEvents();

		profiler->TimeStampEndAccumulate(PROFILER_TIMESTAMP_FPS_UNLIMITED);
		profiler->EndOfFrame();
	}
	delete profiler;
	//CleanUpGameSystem();

	glfwTerminate();
}

So it's a pretty basic gameloop.

Things I've tried that never worked:

- Sleep(1). Unfortunately, this can take anywhere from about 1 ms to well over 16.666 ms (a whole 60 fps frame). Definitely NOT acceptable. Sleeping isn't an option.

- A spin loop. Looping in a while loop, accumulating elapsed time until 16.666 ms is reached, then rendering. This sucks because it isn't even accurate, since the operations that accumulate the elapsed time pollute the measurement itself.
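For what it's worth, the usual compromise between those two failed approaches is a hybrid: sleep away most of the frame budget in coarse slices, then spin only the last millisecond or two. A minimal sketch using std::chrono (LimitFrame and the 2 ms margin are illustrative choices, not from the engine above):

```cpp
#include <chrono>
#include <thread>

// Hybrid frame limiter: sleep while we are comfortably ahead of the
// deadline, then busy-wait the small remainder for accuracy.
void LimitFrame(std::chrono::steady_clock::time_point frameStart,
                std::chrono::duration<double> target)
{
    using namespace std::chrono;
    const auto margin = milliseconds(2); // budget reserved for the spin phase
    const auto deadline =
        frameStart + duration_cast<steady_clock::duration>(target);

    // Coarse phase: 1 ms sleeps absorb most of the wait cheaply.
    while (deadline - steady_clock::now() > margin)
        std::this_thread::sleep_for(milliseconds(1));

    // Fine phase: spin the last ~2 ms for sub-millisecond precision.
    while (steady_clock::now() < deadline)
        ; // spin
}
```

The spin phase keeps the timing tight without the measurement pollution described above, because only the final sliver of the frame is busy-waited.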

- Fixing a timestep for rendering:


while(deltaTimeAccumulated > 16.66666)
{
    if(deltaTimeAccumulated > 200.0)
    {
        //Break if entering spiral of death
        deltaTimeAccumulated = 0;
        break;
    }
    renderingEngine->StartRender();
    deltaTimeAccumulated -= 16.66666;
}

This does work to some extent, except that I encounter stuttering every half second or so. It's caused by frame skipping whenever the number of updates per render exceeds the average number of updates per render (imagine looping 5 times per render, 300 times in a row, then updating 4 times for just one render: you will notice a stutter). I can't find any way to fix this, because there is no guarantee that the number of updates per render stays constant, since the elapsed time always varies :/.
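As an aside, the standard cure for exactly this stutter is to flip the approach: fix the *simulation* step instead, render once per loop iteration, and use the leftover accumulator as an interpolation factor between the two most recent simulation states. A hedged sketch, where FixedStepLoop, Advance, and the 0.25 s clamp are invented for illustration:

```cpp
// Fixed simulation step with render interpolation: the simulation advances
// in exact dt increments; rendering blends states by the returned alpha.
struct FixedStepLoop {
    double accumulator = 0.0;
    const double dt = 1.0 / 60.0; // fixed simulation step in seconds
    int updates = 0;

    // Feed in the measured frame time; returns alpha in [0,1) for
    // interpolating between the previous and current simulation states.
    double Advance(double frameTime) {
        if (frameTime > 0.25)
            frameTime = 0.25; // clamp to avoid the spiral of death
        accumulator += frameTime;
        while (accumulator >= dt) {
            ++updates;          // Update(dt) would run here
            accumulator -= dt;
        }
        return accumulator / dt; // leftover fraction of a step
    }
};
```

Because rendering interpolates rather than repeats whole steps, an occasional 4-update frame among 5-update frames no longer produces a visible jump.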

I am using glfwGetTime(), which returns a high-precision double, so timer resolution isn't the problem.

~~~

The only solution I could come up with (a rather shoddy one) is to cheat: check if the framerate is dangerously high (say 500 or so), and run some expensive function just to shift more time onto the CPU instead of the GPU.

I'm sure I'm not the only one who has encountered this issue. Does anyone have any ideas?

View my game dev blog here!


What's the obsession so many people have with not wanting vsync? It's automagically doing what you want: preventing the GPU from rendering useless frames you only see a fraction of, without any tearing. I would just use that glfwSwapInterval call.

If you actually turned it on and it doesn't work, check your driver settings to make sure vsync is allowed.

If you measure that it's not working when your game starts, then you might warn about it, and you're stuck with sleep. But don't just always call Sleep(1): measure the time, check whether a sleep is necessary, and sleep only for the required duration, after calling timeBeginPeriod (which is usually not a recommended call, since it's a global setting that wastes energy; it's only justified when you really need these short sleeps, which still results in an overall saving). Don't forget timeEndPeriod when your game is paused.

https://randomascii.wordpress.com/2013/07/08/windows-timer-resolution-megawatts-wasted/
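The pattern being described might look like the sketch below. The function names are mine, not from any engine; the Windows-specific calls compile to no-ops elsewhere, and the 2 ms margin is an assumed safety buffer against Sleep overshooting.

```cpp
#include <chrono>
#include <thread>
#ifdef _WIN32
  #include <windows.h>   // timeBeginPeriod / timeEndPeriod, link winmm.lib
#endif

// Raise the OS timer resolution only while the game actually needs short
// sleeps; it is a global setting, so release it when paused or minimized.
void EnterActivePlay() {
#ifdef _WIN32
    timeBeginPeriod(1);  // request ~1 ms scheduler granularity
#endif
}

void LeaveActivePlay() {
#ifdef _WIN32
    timeEndPeriod(1);    // always pair with timeBeginPeriod
#endif
}

// "Measured" sleep: check how much frame budget remains and give up only
// the portion we can safely spare, rather than a blind Sleep(1).
void MeasuredSleep(std::chrono::steady_clock::time_point deadline) {
    using namespace std::chrono;
    const auto remaining = deadline - steady_clock::now();
    if (remaining > milliseconds(2))
        std::this_thread::sleep_for(remaining - milliseconds(2));
}
```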

What's the obsession so many people have with not wanting vsync?

I fear the input lag, especially for the kind of game I am planning on making, which might end up being a bullet hell.

But don't just always call Sleep(1): measure the time, check whether a sleep is necessary, and sleep only for the required duration, after calling timeBeginPeriod (which is usually not a recommended call, since it's a global setting that wastes energy; it's only justified when you really need these short sleeps, which still results in an overall saving). Don't forget timeEndPeriod when your game is paused.

I never knew you could adjust the precision of sleep. That changes everything :o. I guess sleep isn't entirely worthless for games then, though it does seem to have a few nasty drawbacks. For laptops, I guess vsync is a must if there's no other power-saving option.


Just some remarks:
- Input lag is manageable; why else would AAA games use vsync and still be playable?

Before the "how" question, I would make sure of the "why". Your GPU/CPU will manage fine with 1000 fps, and as development of your game progresses, you'll eat into those 1000 fps anyway :)

Crealysm game & engine development: http://www.crealysm.com

Looking for a passionate, disciplined and structured producer? PM me

Just some remarks:
- Input lag is manageable; why else would AAA games use vsync and still be playable?

Before the "how" question, I would make sure of the "why". Your GPU/CPU will manage fine with 1000 fps, and as development of your game progresses, you'll eat into those 1000 fps anyway :)

Only really terrible AAA games don't even give you the option to disable vsync. An AAA game is also much less likely to run at something crazy like 1000 fps, due to the amount of drawing and CPU time involved.

Vsync always has the input-lag issue; the only thing you really gain from it is the removal of tearing, and in some games tearing is far more obvious than in others. For instance, if anyone has tried Legend of Grimrock 2, the tearing becomes incredibly bad in a game like that because of the on-rails movement speed.

That said, unfortunately there aren't any really reliable ways to cap FPS besides sleeping or something silly like busy-waiting; you're kind of at the mercy of the OS scheduler.

You could minimize it by setting an 80/20 goal for:

- average system specs, CPU/GPU
- target draw time per frame, then "use it up" with features


I fear the input lag, especially for the kind of game I am planning on making, which might end up being a bullet hell.


Decouple your rendering thread(s) from your input and simulation updates. Bam, no lag. You can run your input thread at 100,000 Hz if you want (which you don't, because that'd be silly), your simulation can run at 240 Hz if you want (which you might, maybe, not that you'll be writing any interesting game that can actually run that fast), and you can still keep rendering with vsync enabled at 60 Hz.
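A minimal sketch of that decoupling, using standard threads (DecoupledSim and its members are invented for illustration): the simulation ticks on its own thread at its own rate and publishes through an atomic, while the render loop simply reads the latest published state whenever it draws.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Simulation runs on a worker thread at a fixed cadence; the renderer
// never blocks on it, it just reads the most recent published tick.
struct DecoupledSim {
    std::atomic<long> tick{0};
    std::atomic<bool> running{true};
    std::thread worker;

    void Start(std::chrono::milliseconds step) {
        worker = std::thread([this, step] {
            while (running.load()) {
                ++tick;                          // UpdateSimulation(step) here
                std::this_thread::sleep_for(step);
            }
        });
    }
    void Stop() {
        running = false;
        worker.join();
    }
    long LatestTick() const { return tick.load(); } // what the renderer reads
};
```

A real engine would publish a full state snapshot (for example via double buffering) rather than a single counter, but the threading shape is the same.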

Manually forcing the framerate to anything will obviously have _the exact same effect on input_ unless you move input off into its own thread anyway.

Sean Middleditch – Game Systems Engineer – Join my team!

This previous thread on the subject may be useful: http://www.gamedev.net/topic/665939-how-to-limit-your-fps/

SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.


The engine runs at about 1000 fps, and I can already hear my graphics card squealing

That's just coil whine, and it's a sign of either low-quality components in your GPU or an unstable power supply that's about to fail and take your system with it.

Either way, it has very little to do with the fact that you are rendering at 1,000 fps. It's just that at those framerates you are spinning up the GPU without enough work to heat it up and spin up the fans (which generally drown out the sound of coil whine).

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Thank you all for your help. I currently have it working without using vsync, using this:


// Keep sleeping in 1 ms slices until ~3 ms have been burned this frame.
while(profiler->fpsUnlimited * 1000.0 < 3.0)
{
    profiler->TimeStampStartAccumulate(PROFILER_TIMESTAMP_FPS_UNLIMITED);
    timeBeginPeriod(1); // raise timer resolution so Sleep(1) is close to 1 ms
    Sleep(1);
    timeEndPeriod(1);
    profiler->TimeStampEndAccumulate(PROFILER_TIMESTAMP_FPS_UNLIMITED);

    // Accumulate how long we actually slept (in seconds).
    profiler->fpsUnlimited += profiler->timeStamps[PROFILER_TIMESTAMP_FPS_UNLIMITED];
}

However, I will also attempt to get this to work with vsync and try to fix the input lag issue.


