
Why is using this specific time method so reliable in Java game programming?


So I have been building this game off existing code that calculates the change in time for me. What I do not get is why the change in time is obtained with this specific approach. The code is as follows:

private long lastTick;

public Game() {
    // load game objects here

    // record the moment in time at which the game objects are created
    lastTick = System.currentTimeMillis();
}

public void run() {
    // repeat this process indefinitely
    while (isRunning) {
        // read the current time on this pass through the loop and subtract
        // the previously recorded time; the difference is how many
        // milliseconds have passed since the last pass (or since the game
        // objects were created, on the first pass)
        long milliseconds = System.currentTimeMillis() - lastTick;

        // remember the current time so the next pass measures from here
        lastTick = System.currentTimeMillis();
    }
}

Every Java game I have seen so far uses code like the above. What I do not get is why the time returned by the currentTimeMillis method is so reliable. Given the common approach, I assume it is the most reliable, but why? I did research on the method, and from the way the API words it, it would seem the time is coming from the operating system. So is it safe to say there is a clock built into the computer? If true, then why is the time from the operating system so reliable for game programming?

 

I pass the milliseconds variable to the update method of each game object that needs to move on-screen. The movement runs so smoothly, but I am not sure why. It is definitely a funky concept.
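
For what it's worth, here is a minimal sketch (my own, not from the original code) of how that milliseconds value typically drives movement; the Sprite class, its speedX field, and the update signature are hypothetical:

public class Sprite {
    private double x;             // position in pixels
    private double speedX = 120;  // speed in pixels per second

    // deltaMillis is the "milliseconds" value computed in the game loop
    public void update(long deltaMillis) {
        double deltaSeconds = deltaMillis / 1000.0;
        // scale the movement by the elapsed time so the sprite covers the same
        // on-screen distance per second regardless of the frame rate
        x += speedX * deltaSeconds;
    }
}

Because each update is scaled by the elapsed time, a slow frame simply moves the object farther in one step, which is a large part of why the motion looks smooth even when frame times vary.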

 

Source: public static long currentTimeMillis()

Returns the current time in milliseconds. Note that while the unit of time of the return value is a millisecond, the granularity of the value depends on the underlying operating system and may be larger. For example, many operating systems measure time in units of tens of milliseconds.

See the description of the class Date for a discussion of slight discrepancies that may arise between "computer time" and coordinated universal time (UTC).

Returns: the difference, measured in milliseconds, between the current time and midnight, January 1, 1970 UTC.




there is a clock built into the computer?

 

yes

 


why is the time from the operating system so reliable for game programming?

 

It's not:

 


Note that while the unit of time of the return value is a millisecond, the granularity of the value depends on the underlying operating system and may be larger.

 

For Windows, the granularity is roughly 16 ms, so it only returns the time to the closest 16 ms interval.
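
If you want to see that granularity on your own machine, a throwaway test (not from this thread) is to poll the clock and print how big each observed jump is:

// Busy-poll System.currentTimeMillis() and report the size of each jump.
// On many Windows machines the steps come out around 15-16 ms.
public class MillisGranularity {
    public static void main(String[] args) {
        long last = System.currentTimeMillis();
        int observed = 0;
        while (observed < 10) {
            long now = System.currentTimeMillis();
            if (now != last) {
                System.out.println("clock advanced by " + (now - last) + " ms");
                last = now;
                observed++;
            }
        }
    }
}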

 

This is why a timer based on QueryPerformanceCounter is preferred for high-resolution timing.

 

In your case, it sounds like it's accurate enough for your needs.

 

I personally only need QueryPerformanceCounter for profiling timers that are accurate to the ms or the tick count. Other than that, system time is sufficient for most things. But since I needed it for profiling, I made my timer system use it; now everything is accurate to the ms or the tick count.
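
In Java, the usual stand-in for QueryPerformanceCounter is System.nanoTime(); a bare-bones profiling helper along those lines (my own sketch, not code from this thread) could look like this:

// Simple profiling stopwatch based on System.nanoTime(), the JVM's
// high-resolution timer.
public class ProfileTimer {
    private long start;

    public void begin() {
        start = System.nanoTime();
    }

    // elapsed time since begin(), in milliseconds with sub-millisecond detail
    public double elapsedMillis() {
        return (System.nanoTime() - start) / 1000000.0;
    }
}

Typical use is to call begin() right before the code being measured and print elapsedMillis() afterwards.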


Given the common approach, I assume it is the most reliable, but why?

System.nanoTime() is more reliable than currentTimeMillis(). The former is specified to use the most precise available system timer, whilst the latter is not.

I did research on the method, and from the way the API words it, it would seem the time is coming from the operating system. So is it safe to say there is a clock built into the computer? If true, then why is the time from the operating system so reliable for game programming?

Yes, there is a clock built in (have you not noticed that your computer can tell you the date and time?). Its resolution is certainly sufficient for games (I think nanoTime() is accurate to around ~1 µs).
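
As a concrete comparison, the loop from the first post could be written against System.nanoTime() instead. This is only a sketch of the substitution; isRunning and the surrounding class are assumed to exist as in the original code:

private long lastTick = System.nanoTime();

public void run() {
    while (isRunning) {
        long now = System.nanoTime();
        // nanoTime() returns nanoseconds, so divide to get milliseconds;
        // keeping a double preserves the sub-millisecond precision
        double milliseconds = (now - lastTick) / 1000000.0;
        lastTick = now;
        // ... pass milliseconds to the update methods as before ...
    }
}

A side benefit is that nanoTime() is not tied to the wall clock, so it does not jump when the system time is adjusted while the game is running.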

Share this post


Link to post
Share on other sites

I pass the milliseconds variable to the update method of each game object that needs to move on-screen. The movement runs so smoothly, but I am not sure why. It is definitely a funky concept.


I recommend you read about different kinds of game loops and fixed timesteps. That may help clarify things for you.

Also, you need to understand the problem with System.currentTimeMillis. On Windows, it has 16ms granularity as Norman Burrows mentioned above. For situations where you need the more accurate timing that the Windows API function QueryPerformanceCounter gives you, you might want to look into System.nanoTime.
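
To give the fixed-timestep idea some shape, here is a bare-bones sketch of the accumulator pattern those articles usually describe, using System.nanoTime(); isRunning, updateWorld, and render are placeholders, not anything from this thread:

// Fixed-timestep loop: game logic always advances in constant steps,
// while rendering happens once per pass regardless of how many steps ran.
final long STEP_NANOS = 16666667L;  // roughly 1/60th of a second
long previous = System.nanoTime();
long accumulator = 0;

while (isRunning) {
    long now = System.nanoTime();
    accumulator += now - previous;
    previous = now;

    // run as many fixed updates as the elapsed time allows
    while (accumulator >= STEP_NANOS) {
        updateWorld(STEP_NANOS / 1000000.0);  // dt in milliseconds
        accumulator -= STEP_NANOS;
    }

    render();
}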
