
Jumpy application



Hello, I have a big problem with my application. I have written a little Direct3D framework to encapsulate common tasks like the render loop and so on (basically it's like the DirectX common framework). I've also implemented a camera class to move around, but now I've noticed that the movement is really jumpy. Even without any camera movement, with just a single spinning triangle, the application looks "jumpy": the triangle doesn't rotate smoothly; every few seconds the rotation goes a little bit faster and then returns to normal. I have tried to figure out the problem for days, but I just don't know why it moves so laggy. It can't be the framerate, since my app is running at 300-1000 fps. The most important parts of my code which (I guess) could relate to the problem are below.

The render loop:
while(true) {
   if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE)) {
      if(msg.message == WM_QUIT) 
         break;
      TranslateMessage(&msg);
      DispatchMessage(&msg);
   }
   onMove();
   onRender(fpsCounter);

   passedTime = timer.GetElapsedTime();
   passedTimeTotal = timer.GetTime();

   passedTimeWithinFrame += passedTime;
   if(passedTimeWithinFrame >= 1.0f) {
      passedTimeWithinFrame = 0;
      fps = fpsCounter;
      fpsCounter = 0;
   }
}

Really nothing special. onMove() calls Move() and onRender() calls the Render() method. passedTime is the time between two frames, passedTimeTotal is the total time the app has been running. The rest is just the calculation of the framerate.

The render method:
virtual void render() {
   if(FAILED(d3d9Device->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_XRGB(100, 100, 200), 1.0f, 0)))
      MessageBox(NULL, "FehlerA", "Fehler", MB_OK);

   d3d9Device->BeginScene();

   D3DXMATRIX world;
   D3DXMatrixRotationY(&world, 3.14f * 0.25f * passedTimeTotal);
   d3d9Device->SetTransform(D3DTS_WORLD, &world);

   d3d9Device->DrawPrimitiveUP(D3DPT_TRIANGLELIST, 1, data, sizeof(Vertex));

   d3d9Device->EndScene();
   d3d9Device->Present(NULL, NULL, NULL, NULL);
}

Again, nothing special. I'm just drawing one rotating triangle. Since the movement of the triangle depends on passedTimeTotal, I checked the values of passedTime (the time between two frames; passedTimeTotal is just the sum of all passedTime values) and got these results:
0.000276851
0.000193041
0.000183543
0.000190248
0.000267632
0.00379741
0.000319873
0.000199467
0.000185219
0.000190527
0.000390832
0.0013552
0.00018913
0.000358146
0.000302832
0.000185219
0.000192483
0.000298921
0.000438044
0.000410667
0.000423238
0.000419606
0.000430781
0.000421283
0.000436648
0.000526045
0.000340825
0.000400889
0.000432178
0.000419886
0.000424635
0.0028526
Most of the time passedTime has values like 0.000X, but sometimes the value gets bigger (for example 0.0028), which means more time passed between two frames. Could these "jumpy" times between two frames be the problem? But even if so: what can I do against it? I hope someone has faced the same problem or has an idea! This drives me crazy :( Thanks!

PS: I have uploaded the VS project files and the exe file: http://uploaded.to/?id=cn684d (I compiled with the August 07 SDK)

[Edited by - schupf on September 15, 2007 11:19:07 AM]

I would look at your timer class. Is it homemade, or are you using another framework? I worked with VB for a long time, and the VB Timer object was notoriously inaccurate and should never be used for small increments. This one may be similar. There are other functions, like QueryPerformanceCounter, for high-precision timing. (That one is Win32-specific, but since you're using DirectX . . .)

I copied the Timer Class from the DX Common Framework and made some little unimportant changes. This (and thus my) class internally uses QueryPerformanceCounter.

I have had a lot of problems with timing too, using several timing methods, and I still get jumps. I read somewhere that a good way is to use 3 different timing methods and average between them, but I'm sure there must be a better way than that. Are you using a multi-core or multi-processor computer? Because I believe that can cause inaccuracies in QueryPerformanceCounter.
Wish I could be more help.

You could try averaging your N previous time deltas; that will filter out any bumps.

All you need to do is use a circular array of size N, each frame adding the newest delta and replacing the oldest. Then you can simply sum all the elements in the array and divide by N.

Thanks for all your help! I tried the recommendation from MJP and I think it is smoother now (but still not perfectly smooth). I wanted to ask if this method (averaging over many time values to dampen the outliers) is standard in graphics applications. Because I am still not sure if my big time value jumps are OK (why are there such big jumps? I mean, it's just a while loop and the methods should take approximately the same time each iteration).

Thanks

Quote:
Original post by schupf
Thanks for all your help! I tried the recommendation from MJP and I think it is smoother now (but still not perfectly smooth). I wanted to ask if this method (averaging over many time values to dampen the outliers) is standard in graphics applications. Because I am still not sure if my big time value jumps are OK (why are there such big jumps? I mean, it's just a while loop and the methods should take approximately the same time each iteration).

Thanks


I'm not sure if it's a standard technique or not; I'm just an amateur myself. I'd always used it for calculating my FPS. If there are better techniques for this issue, I'd like to hear them too!

However, I can tell you that it is a very common signal processing technique, since it's just a low-pass filter. It's analogous to sampling a few adjacent pixels in an image and averaging them together, which is done for blurring, anisotropic filtering, and anti-aliasing.

I've had this problem when using the performance counter before, and I don't think I figured out why it made things jumpy. I switched over to timeGetTime() and made sure to call timeBeginPeriod(1) and timeEndPeriod(1) around the frame rendering loop. I figured that the accuracy of the performance counter was a bit of overkill, and now the application doesn't look jumpy.

I'd also like to mention that if you do use this method, don't convert the values from timeGetTime() into a float right away; convert only the elapsed frame time value. If the timeGetTime() values get large and you take the difference between two floats, you may have already lost precision, and the framerate will get jumpy.

Thanks MJP, I tried that too and it does work much better.

schupf, are you also clamping between your minimum and maximum expected frame rate? Or you could just average over more frames.

I used timeGetTime too in another application and it was fine, but it wasn't in another, so I think playing with as many timer methods as you can get your hands on (and have time for) and seeing what works best is the best thing to do.

No, I am not clamping. To be honest, I also don't know how I should do that. How should I find appropriate boundaries?
Could anyone of you (who thinks they have a smooth timing method) post the relevant timing code (I guess the render loop)?

I don't have the code available at the moment, but I will try to extract some and post again. To be honest, I am just using QueryPerformanceCounter with clamping and the averaging method MJP mentioned. I am sure other people have much better timing methods. Using an average between QueryPerformanceCounter and timeGetTime could also help.

To clamp, you just decide your minimum acceptable frame rate (there isn't really a need for a maximum); if that were 10 FPS, the maximum time delta would be 0.1 seconds.


if (timeDelta > 0.1f) timeDelta = 0.1f; // enforce minimum frame rate (10 FPS)
if (timeDelta < 0.0f) timeDelta = 0.0f; // ensure no negative values are returned (it can happen)



