At any rate, I am having some issues implementing a deltaTime variable.
I have this class:
public class Time
{
    public static long timeFrameStarted = 0; // system time at the start of the frame
    public static long timeOfStart;          // system time at the start of the scene
    public static float deltaTime = 0;       // time it took to complete the last frame
    public static long frameCount = 0;       // number of rendered frames
}
and my update function, called each frame, looks like this:
public void onDrawFrame(GL10 gl)
{
    Time.timeFrameStarted = System.currentTimeMillis();
    ++Time.frameCount;
    Log.e("IMPACT", "" + Time.deltaTime);
    // TODO Auto-generated method stub
    go.transform.position.x += 100 * Time.deltaTime;
    go.update();
    Time.deltaTime = (float)((System.currentTimeMillis() - Time.timeFrameStarted) / 1000.f);
}
go is my GameObject. Everything else should be self-explanatory.
This code does work; however, as the game object moves across the screen, it moves smoothly, then jerks forward, moves smoothly again, then jerks forward again, and so on.
I am not sure whether it has something to do with my implementation of deltaTime, or whether I simply need to optimize the OpenGL code inside my draw methods.
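For what it's worth, one difference I noticed between my code and other implementations is that I measure only the time spent *inside* onDrawFrame, not the time *between* successive frames (which would also include buffer swap and vsync waiting). A minimal sketch of that alternative (class and method names are mine, not from any library):

    public class FrameTimer
    {
        private long lastFrameTime = 0; // millisecond timestamp of the previous tick, 0 before the first frame
        public float deltaTime = 0f;    // seconds elapsed between the last two ticks

        // Call once at the start of every frame with the current time in milliseconds.
        public void tick(long nowMillis)
        {
            if (lastFrameTime != 0)
            {
                deltaTime = (nowMillis - lastFrameTime) / 1000f;
            }
            lastFrameTime = nowMillis;
        }
    }

The idea would be to call timer.tick(System.currentTimeMillis()) as the first line of onDrawFrame and then use timer.deltaTime for movement, so the measured interval spans the whole frame rather than just the body of the draw call. I haven't confirmed that this is the cause of the jerkiness, though.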
Here is a video of the issue in case it helps: