BradDaBug

Unity Thread timing

In my game I've got two threads, a server thread and a client thread. The server thread does all the logic and collision stuff, the client mostly renders. The trouble I'm having is getting the two threads to share the CPU. The client runs in the main thread and the server runs in a thread I create. I'm using SDL_Delay() in the server thread to make it give up the CPU sort of like this:
// calculate how long to sleep so that this frame, plus the sleep,
// takes about 10 ms (i.e. the thread runs at roughly 100 Hz)
Sint32 delay = 10 - (Sint32)(SDL_GetTicks() - timeAtBeginningOfFrame);
if (delay > 0) SDL_Delay(delay);
Later in that thread I'm using fixed timesteps in ODE, like this:
void Step()
{
     remainder += SecondsPerFrame();

     // consume the accumulated time in fixed 10 ms slices
     while (remainder >= 0.01) // 100 Hz
     {
          dWorldQuickStep(world, 0.01);
          remainder -= 0.01;
     }
}
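The accumulator logic above can be sketched (and sanity-checked) in isolation, without SDL or ODE. Here `fixed_steps` is a hypothetical helper name, not part of either library; it returns how many fixed steps a frame of a given length should produce:

```c
// Consume `frame_seconds` of wall-clock time from the accumulator in
// fixed slices of `dt`, returning how many physics steps to run.
// Any fraction smaller than dt stays in *remainder for the next frame.
static int fixed_steps(double *remainder, double frame_seconds, double dt)
{
    int steps = 0;
    *remainder += frame_seconds;
    while (*remainder >= dt)
    {
        *remainder -= dt;
        ++steps; // one dWorldQuickStep(world, dt) per iteration
    }
    return steps;
}
```

A 35 ms frame at dt = 0.01 yields three steps with roughly 5 ms carried over; that carry is what keeps the simulation rate steady even when frame times vary.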
The problem is that when the program first starts, it runs at 100% CPU usage and it's slow: the client thread runs at about 50 FPS and the server at about 40 FPS. But the weird thing is that if I click and drag the game's title bar, pausing the client thread, and then release it, the client runs at around 100 FPS and the server at about 90 FPS. What is going on?

No one?

Let me present the problem like this. There's the server thread and the client thread. I don't really care how fast the client thread runs (other than as fast as possible), but the server, for best results, should run somewhere between 70 and 100 FPS. Right now each thread's speed seems to depend on how fast the other one is running. When the client thread is busy drawing lots of stuff on screen and its framerate drops, the framerate of the server thread also drops. When the client isn't drawing much at all, the server thread's framerate goes up. What I want is for the server thread's framerate to be more or less constant.

How do I do it?

You could lock them together and have the server dispatch the client thread. The server keeps track of its frame rate and decides when to allow the client to execute. You can place multiple points in the client where it waits on a shared object, so the server can let it run for less than a full frame. The server should also track the client's framerate so you don't starve the client of CPU.
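As a sketch of that dispatch idea: the example below uses POSIX threads rather than SDL's thread API (the pattern is the same with SDL_CreateSemaphore/SDL_SemWait/SDL_SemPost), and the `Gate` type, `gate_post`, `gate_wait`, and `run_demo` are illustrative names, not from any library. The server hands out "run tokens"; the client blocks until granted one:

```c
#include <pthread.h>

// A counting "gate": the server posts tokens, the client consumes them.
typedef struct {
    pthread_mutex_t m;
    pthread_cond_t  c;
    int tokens;   // how many client slices the server has granted
    int done;     // server sets this when shutting down
    int frames;   // written only by the client thread
} Gate;

static void gate_post(Gate *g)        // server side: let the client run once
{
    pthread_mutex_lock(&g->m);
    g->tokens++;
    pthread_cond_signal(&g->c);
    pthread_mutex_unlock(&g->m);
}

static int gate_wait(Gate *g)         // client side: block until granted a slice
{
    pthread_mutex_lock(&g->m);
    while (g->tokens == 0 && !g->done)
        pthread_cond_wait(&g->c, &g->m);
    int granted = g->tokens > 0;
    if (granted) g->tokens--;
    pthread_mutex_unlock(&g->m);
    return granted;                   // 0 means "shutting down, exit"
}

static void *client_main(void *arg)
{
    Gate *g = arg;
    while (gate_wait(g))
        g->frames++;                  // stand-in for "render one frame"
    return NULL;
}

// Run a server that grants the client exactly n frames, then shuts down.
static int run_demo(int n)
{
    Gate g = { PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER, 0, 0, 0 };
    pthread_t client;
    pthread_create(&client, NULL, client_main, &g);
    for (int i = 0; i < n; i++)
        gate_post(&g);                // real loop: step physics, then post
    pthread_mutex_lock(&g.m);
    g.done = 1;                       // tell the client to finish up
    pthread_cond_broadcast(&g.c);
    pthread_mutex_unlock(&g.m);
    pthread_join(client, NULL);
    return g.frames;
}
```

Because the server controls how often it posts, the client can never run more slices than the server grants, which is exactly the throttling effect described above.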

You need to give the server thread a higher priority. That way it always runs when it is ready, and the client thread only runs when the server thread is waiting -- or something like that. Careful... setting thread priorities can lead to problems. For example, if the server thread is always busy, then the client thread may never run and thus never receive windows messages -- or something like that.
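For what it's worth, SDL 1.2 has no portable thread-priority call, so this has to go through the OS API. On POSIX systems one way is pthread_setschedparam; the helper name below is made up for illustration, and note that the real-time policies (SCHED_FIFO/SCHED_RR) usually require elevated privileges, so failure should be treated as non-fatal:

```c
#include <pthread.h>
#include <sched.h>

// Try to raise a thread's priority within its current scheduling policy.
// Under the default SCHED_OTHER policy many systems only allow a single
// priority value, and switching to a real-time policy normally needs
// elevated privileges, so a nonzero return here is not a fatal error.
static int try_raise_priority(pthread_t thread)
{
    struct sched_param sp;
    int policy;
    if (pthread_getschedparam(thread, &policy, &sp) != 0)
        return -1;
    sp.sched_priority = sched_get_priority_max(policy);
    return pthread_setschedparam(thread, policy, &sp); // 0 on success
}
```

On Windows the equivalent would be SetThreadPriority on the server thread's handle; either way the starvation caveat above still applies.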

Trying to fix race conditions by changing thread priority is a dangerous strategy.

The best way to solve such problems is to implement thread synchronization: signal the client to go into a wait state (so it doesn't consume (too much) CPU time) and make it wait for the server to respond with another signal, or for a time-out.

cheers

So basically the server controls when the client is able to run? Wouldn't you get the same result from the server and client running in the same thread, and the server calling client.Run() occasionally?

Guest Anonymous Poster
Quote:
Original post by BradDaBug
So basically the server controls when the client is able to run? Wouldn't you get the same result from the server and client running in the same thread, and the server calling client.Run() occasionally?


The short answer: yes.

Not exactly. The code that passes control back to the server can be anywhere in the client thread and the client can resume exactly where it left off when the server passes control back. That's a little hard to manage in a single thread.

Plus: by using a signal that isn't tied to one particular server/client pair, you can signal pretty much any client from any server. In your solution the server has to know (too much) about the client, whereas using a signal gives you a more loosely coupled system.

Cheers
