Jerky movement using Java2D


public void run() {
    long desiredFPS = 500;
    long desiredDeltaLoop = (1000 * 1000 * 1000) / desiredFPS;

    long beginLoopTime;
    long endLoopTime;
    long currentUpdateTime = System.nanoTime();
    long lastUpdateTime;
    long deltaLoop;
    int deltaMs = 0;

    while (isRunning) {
        beginLoopTime = System.nanoTime();

        window.panel.repaint();

        lastUpdateTime = currentUpdateTime;
        currentUpdateTime = System.nanoTime();

        onUpdateGame(deltaMs);

        deltaMs = (int) ((currentUpdateTime - lastUpdateTime) / (1000 * 1000));

        endLoopTime = System.nanoTime();
        deltaLoop = endLoopTime - beginLoopTime;

        if (deltaLoop > desiredDeltaLoop) {
            // Do nothing. We are already late
        } else {
            try {
                Thread.sleep((desiredDeltaLoop - deltaLoop) / (1000 * 1000));
            } catch (InterruptedException e) {
                // Do nothing
            }
        }
    }
}

Hi,

I'm trying to implement a 2D game using the Java2D API. I'm using the code posted above in the run() method of my main thread to limit the framerate to a specific maximum.

Objects are moved by a distance calculated from the elapsed time between frames. The movement of said objects, however, is noticeably jerky.

Could the problem be to do with the method used above? I've seen this type of approach used in several introductory Java game examples.

Many thanks.


deltaMs = (int) ((currentUpdateTime - lastUpdateTime)/(1000*1000));

In this line, the division is not a floating-point division because both operands are integers (long). You need to promote either operand (or both) to a floating-point type for the division to change from integer to floating-point; otherwise your integer cast will not really round anything - the result is already truncated by the integer division, which'll cause the simulation to sometimes "jump" slightly forward nondeterministically depending on the current and last update times. It'll also overflow for sufficiently large update times... but that shouldn't be a problem here.
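For example, here's a minimal sketch of the difference (the sample values are just for illustration):

long last = 0L;
long current = 33999999L; // 33.999999 ms elapsed

// Integer division: both operands are long, so the result is truncated
// to 33 before the cast - the cast itself rounds nothing.
int deltaMs = (int) ((current - last) / (1000 * 1000));

// Promote one operand to double and the fraction survives: 33.999999
double deltaMsExact = (current - last) / (1000.0 * 1000.0);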

Though your choice of deltaMs being an integer is puzzling. That automatically limits your accuracy to one millisecond, which will cause trouble whenever your frame time is not a whole number of milliseconds. For instance, 30 frames per second is not exactly 33 milliseconds - it's 33.333... That'll cause you to drift by about one millisecond every three frames. Any reason you can't use a double as the delta time for your game?

Usually, playing with nanoseconds, milliseconds and seconds at the same time is an easy way to get lost in these precision issues. I recommend just using doubles everywhere and enforcing a unit of seconds for consistency.
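For instance, a sketch of what that could look like (the names here are illustrative, not from your code):

long lastUpdateTime = System.nanoTime();

// ... inside the loop ...
long now = System.nanoTime();
double deltaSec = (now - lastUpdateTime) / 1000000000.0; // nanoseconds -> seconds
lastUpdateTime = now;

onUpdateGame(deltaSec); // every consumer works in seconds, no unit juggling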

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

Thanks for your reply. I have changed the delta argument to a double but unfortunately the jerky movement persists.

When you suggested not mixing nanoseconds/milliseconds/seconds, how would I avoid this? Presumably I need to use some sort of milli/nanosecond timer at some point and convert the returned value to seconds?

As well as the new code I've also posted an example of how an object's position is updated and then drawn using the delta argument in case there is an error in the logic on that end too.

The object's position is stored and calculated using doubles, but when it comes to drawing it I am forced to give integer x and y arguments to specify the pixel position - could this rounding lead to jerky movement?

Many thanks for your help so far.


public void run() {
    long desiredFPS = 60;
    long desiredDeltaLoop = (1000 * 1000 * 1000) / desiredFPS;

    long beginLoopTime;
    long endLoopTime;
    long currentUpdateTime = System.nanoTime();
    long lastUpdateTime;
    long deltaLoop;
    double deltaSec = 0;

    while (isRunning) {
        beginLoopTime = System.nanoTime();

        window.panel.repaint();

        lastUpdateTime = currentUpdateTime;
        currentUpdateTime = System.nanoTime();

        onUpdateGame(deltaSec);

        deltaSec = (currentUpdateTime - lastUpdateTime) / 1000000000.0;

        endLoopTime = System.nanoTime();
        deltaLoop = endLoopTime - beginLoopTime;

        if (deltaLoop > desiredDeltaLoop) {
            // Do nothing. We are already late
        } else {
            try {
                Thread.sleep((desiredDeltaLoop - deltaLoop) / 1000000);
            } catch (InterruptedException e) {
                // Do nothing
            }
        }
    }
}


// position.x and position.y are doubles and 100 is the desired pixels/s to move
position.x += (GameApp.deltaSec * 100.0);
position.y += (GameApp.deltaSec * 100.0);

g2d.translate(position.x, position.y);
g2d.setColor(Color.GREEN);

g2d.fillRect(0, 0, 100, 100);

The object's position is stored and calculated using doubles, but when it comes to drawing it I am forced to give integer x and y arguments to specify the pixel position - could this rounding lead to jerky movement?

No, I don't think so, not with the drawing code shown. It must be coming from the game loop.
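(For reference, Graphics2D.translate has a double overload, so with the drawing code above the sub-pixel position survives all the way to rasterization rather than being truncated by you beforehand:)

// position.x and position.y stay doubles right up to the draw call
g2d.translate(position.x, position.y); // translate(double, double) overload
g2d.fillRect(0, 0, 100, 100);          // drawn relative to the translated origin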

Could be the Thread.sleep() method. If you're under Windows, thread pausing facilities are far from realtime and will not usually sleep for exactly the amount you specified, but a few milliseconds more. Can you try an approach where you always loop, and just never update until the time delta exceeds the framerate delta (1 / framerate seconds), without sleeping? It will use more CPU, but if the jerkiness is gone, you'll have identified the problem and can work on solving it.
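A rough sketch of that busy-wait approach, reusing isRunning, onUpdateGame and window.panel from your code (the other names are illustrative):

long desiredFPS = 60;
long desiredDeltaNanos = (1000 * 1000 * 1000) / desiredFPS;
long lastFrameTime = System.nanoTime();

while (isRunning) {
    long now = System.nanoTime();
    long elapsed = now - lastFrameTime;

    // Only update and render once a full frame interval has elapsed;
    // otherwise just loop again. Burns CPU, but avoids sleep jitter.
    if (elapsed >= desiredDeltaNanos) {
        lastFrameTime = now;
        onUpdateGame(elapsed / 1000000000.0); // delta in seconds
        window.panel.repaint();
    }
}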

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

Could be the Thread.sleep() method. If you're under Windows, thread pausing facilities are far from realtime and will not usually sleep for exactly the amount you specified, but a few milliseconds more. Can you try an approach where you always loop, and just never update until the time delta exceeds the framerate delta (1 / framerate seconds), without sleeping? It will use more CPU, but if the jerkiness is gone, you'll have identified the problem and can work on solving it.

In Windows, if you're going to spin wait I find it helps to throw in a call to 'SwitchToThread()', which tells the OS to transfer control to the next ready thread and then switch back to the caller.
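There's no direct SwitchToThread() binding in plain Java, but Thread.yield() is the closest standard-library analogue - a sketch of dropping it into the busy-wait above (same illustrative names as before):

while (isRunning) {
    long now = System.nanoTime();
    long elapsed = now - lastFrameTime;
    if (elapsed >= desiredDeltaNanos) {
        lastFrameTime = now;
        onUpdateGame(elapsed / 1000000000.0); // delta in seconds
        window.panel.repaint();
    } else {
        // Hint to the scheduler that other threads may run instead of
        // burning the rest of the time slice spinning.
        Thread.yield();
    }
}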

void hurrrrrrrr() {__asm sub [ebp+4],5;}

There are ten kinds of people in this world: those who understand binary and those who don't.

Thanks for the replies, I think you're right regarding the capped FPS/sleeping-thread theory. Everything runs much smoother with an unlimited frame rate.

Interestingly, I've noticed a similar problem whilst using Slick2D as well. As a quick fix for now I've set the limit to 200 so at least the GPU isn't spinning up too much. The lag spikes are now much less frequent.

I'll try it on my work laptop after Christmas to see how a completely different graphics card behaves.

Edit: As another point of interest, watching a video (YouTube in chrome) whilst playing the game causes serious lag issues, whilst simply pausing the video seems to be enough to avoid this.

