
#Actual Bacterius

Posted 12 November 2013 - 04:34 AM

EDIT: some users have noted that the approach I suggested previously is not optimal. On further reflection I agree and have retracted it (I have left the explanation of why the OP's code is not behaving as expected).

 

---

 

Your problem is that Sleep(1) call. It defeats the whole point of using an accurate timer: Sleep() is far less accurate than QPC, so by using it you have lowered the accuracy of your game loop to the accuracy of Sleep(), which is equivalent to not having QPC at all.
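
To see just how coarse Sleep() is, here is a minimal sketch (an illustration, assuming a plain Win32 build, not code from the thread): it times a single Sleep(1) call with QueryPerformanceCounter and prints how long the call actually blocked, which at the default timer resolution is often many times longer than the 1 ms you asked for.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);   // counter ticks per second

    QueryPerformanceCounter(&start);
    Sleep(1);                           // ask for a 1 ms sleep
    QueryPerformanceCounter(&end);

    double ms = 1000.0 * (double)(end.QuadPart - start.QuadPart) / (double)freq.QuadPart;
    printf("Sleep(1) actually blocked for %.3f ms\n", ms);   // often well above 1 ms
    return 0;
}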


#5 Bacterius

Posted 12 November 2013 - 04:32 AM

EDIT: some users have noted that the approach I suggested previously is not optimal. On further reflection I agree and have retracted it (I have left the explanation of why the OP's code is not behaving as expected).

 

---

 

Your problem is that Sleep(1) call. It defeats the whole point of using an accurate timer: Sleep() is far less accurate than QPC, so by using it you have lowered the accuracy of your game loop to the accuracy of Sleep(), which is equivalent to not having QPC at all.


#4 Bacterius

Posted 12 November 2013 - 03:40 AM

Your problem is that Sleep(1) call. It defeats the whole point of using an accurate timer, because Sleep() is NOT accurate. What you should do is write a tight loop (without any sleeps) in which you use QPC to check the time elapsed since you last drew a frame; if that elapsed time is greater than 1/60 of a second, THEN draw a new frame and update your "last time" variable. Something like this (adapted to your needs, obviously):

#include <windows.h>

LARGE_INTEGER freq, last_time;
QueryPerformanceFrequency(&freq);    // counter ticks per second
QueryPerformanceCounter(&last_time);

while (running)
{
    LARGE_INTEGER current_time;
    QueryPerformanceCounter(&current_time);

    // convert the tick delta to seconds before comparing against the 1/60 s frame period
    double elapsed = (double)(current_time.QuadPart - last_time.QuadPart) / (double)freq.QuadPart;

    if (elapsed > 1.0 / 60.0)
    {
        // your code goes here
        last_time = current_time;
    }
}

And for people complaining that "it takes all the CPU", relax. First, the CPU is there to be used, and second, the CPU is not doing much work at all while it just waits for the delta time to elapse before drawing the next frame (so it won't overheat or anything; it's not as if it were running Linpack in the meantime). Context switches are expensive, so let the operating system handle those details for you and write your code without worrying about them; it knows what it's doing better than you do. If you really cannot deal with this, you can always try to change the timer resolution with timeBeginPeriod(), to, for instance, 1 ms, and then use Sleep(), which will give somewhat better results, but be warned that this may still cause your framerate to jitter by up to ±1 ms if your thread happens to be sleeping as you cross the 16.67 ms threshold.
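
If you do go the timeBeginPeriod() route, a minimal sketch of that variant might look like the following (assuming Windows and linking against winmm.lib; frame_due() and render_frame() are hypothetical stand-ins for your own elapsed-time check and drawing code):

#include <windows.h>
#pragma comment(lib, "winmm.lib")   // timeBeginPeriod / timeEndPeriod live in winmm

timeBeginPeriod(1);                 // request 1 ms scheduler resolution (affects the whole system)
while (running)
{
    if (frame_due())                // hypothetical: the QPC elapsed-time check shown above
        render_frame();             // hypothetical: your drawing/update code
    else
        Sleep(1);                   // now wakes roughly every millisecond instead of ~15 ms
}
timeEndPeriod(1);                   // undo the resolution request when you are done

Remember that every timeBeginPeriod() call should be matched by a timeEndPeriod() with the same value once you no longer need the finer resolution.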


#3 Bacterius

Posted 12 November 2013 - 03:40 AM

Your problem is that Sleep(1) call. It defeats the whole point of using an accurate timer, because Sleep() is NOT accurate. What you should do is write a tight loop (without any sleeps) in which you use QPC to check the time elapsed since you last drew a frame; if that elapsed time is greater than 1/60 of a second, THEN draw a new frame and update your "last time" variable. Something like this (adapted to your needs, obviously):

#include <windows.h>

LARGE_INTEGER freq, last_time;
QueryPerformanceFrequency(&freq);    // counter ticks per second
QueryPerformanceCounter(&last_time);

while (running)
{
    LARGE_INTEGER current_time;
    QueryPerformanceCounter(&current_time);

    // convert the tick delta to seconds before comparing against the 1/60 s frame period
    double elapsed = (double)(current_time.QuadPart - last_time.QuadPart) / (double)freq.QuadPart;

    if (elapsed > 1.0 / 60.0)
    {
        // your code goes here
        last_time = current_time;
    }
}

And for people complaining that "it takes all the CPU", relax. First, the CPU is there to be used, and second, the CPU is not doing much work at all while it just waits for the delta time to elapse before drawing the next frame (so it won't overheat or anything; it's not as if it were running Linpack in the meantime). Context switches are expensive, so let the operating system handle those details for you and write your code without worrying about them; it knows what it's doing better than you do. If you really cannot deal with this, you can always try to change the timer resolution with timeBeginPeriod(), to, for instance, 1 ms, and then use Sleep(), which will give somewhat better results, but be warned that this may still cause your framerate to jitter by up to ±1 ms if your thread happens to be sleeping as you cross the 16.67 ms threshold.


#2 Bacterius

Posted 12 November 2013 - 03:39 AM

Your problem is that Sleep(1) call. It defeats the whole point of using an accurate timer, because Sleep() is NOT accurate. What you should do is write a tight loop (without any sleeps) in which you use QPC to check the time elapsed since you last drew a frame; if that elapsed time is greater than 1/60 of a second, THEN draw a new frame and update your "last time" variable. Something like this (adapted to your needs, obviously):

#include <windows.h>

LARGE_INTEGER freq, last_time;
QueryPerformanceFrequency(&freq);    // counter ticks per second
QueryPerformanceCounter(&last_time);

while (true)
{
    LARGE_INTEGER current_time;
    QueryPerformanceCounter(&current_time);

    // convert the tick delta to seconds before comparing against the 1/60 s frame period
    double elapsed = (double)(current_time.QuadPart - last_time.QuadPart) / (double)freq.QuadPart;

    if (elapsed > 1.0 / 60.0)
    {
        // your code goes here
        last_time = current_time;
    }
}

And for people complaining that "it takes all the CPU", relax. First, the CPU is there to be used, and second, the CPU is not doing much work at all while it just waits for the delta time to elapse before drawing the next frame (so it won't overheat or anything; it's not as if it were running Linpack in the meantime). Context switches are expensive, so let the operating system handle those details for you and write your code without worrying about them; it knows what it's doing better than you do. If you really cannot deal with this, you can always try to change the timer resolution with timeBeginPeriod(), to, for instance, 1 ms, and then use Sleep(), which will give somewhat better results, but be warned that this may still cause your framerate to jitter by up to ±1 ms if your thread happens to be sleeping as you cross the 16.67 ms threshold.


#1 Bacterius

Posted 12 November 2013 - 03:37 AM

Your problem is that Sleep(1) call. It defeats the whole point of using an accurate timer, because Sleep() is NOT accurate. What you should do is write a tight loop (without any sleeps) in which you use QPC to check the time elapsed since you last drew a frame; if that elapsed time is greater than 1/60 of a second, THEN draw a new frame and update your "last time" variable. Something like this (adapted to your needs, obviously):

#include <windows.h>

LARGE_INTEGER freq, last_time;
QueryPerformanceFrequency(&freq);    // counter ticks per second
QueryPerformanceCounter(&last_time);

while (true)
{
    LARGE_INTEGER current_time;
    QueryPerformanceCounter(&current_time);

    // convert the tick delta to seconds before comparing against the 1/60 s frame period
    double elapsed = (double)(current_time.QuadPart - last_time.QuadPart) / (double)freq.QuadPart;

    if (elapsed > 1.0 / 60.0)
    {
        // your code goes here
        last_time = current_time;
    }
}

And for people complaining that "it takes all the CPU", relax. First, the CPU is there to be used, and second, the CPU is not doing much work at all while it just waits for the delta time to elapse before drawing the next frame (so it won't overheat or anything; it's not as if it were running Linpack in the meantime). Context switches are expensive, so let the operating system handle those details for you and write your code without worrying about them.

