


Topics I've Started

Float Precision Issue?

17 November 2012 - 01:40 PM

Previously I posted about a rotation issue with my camera. I thought I had solved the problem by adding frame time to the camera’s movement calculation but unfortunately this only masked the real problem, which I think is a precision issue with floats. I noticed this when I rotate not only the camera but other objects as well.

I stripped my render code down to the basics in an effort to identify the issue:

[source lang="cpp"]
// Camera calculations
CameraTarget.X = CameraPosition.X + Sin(vRadians);
CameraTarget.Y = CameraPosition.Y + (Cos(vRadians) * Cos(hRadians));
CameraTarget.Z = CameraPosition.Z + (Cos(vRadians) * Sin(hRadians));
CameraUp.X = 0;
CameraUp.Y = 1;
CameraUp.Z = 0;
// Update view matrix:
D3DXMatrixLookAtLH(View_Matrix, CameraPosition, CameraTarget, CameraUp);
[/source]

To make things as simple as possible, I add 0.002 to the hRadians float each frame. This makes the camera rotate to the left, but when I slow the frame rate down I can clearly see that the camera is not turning the same amount each frame. The same thing happens when I calculate rotations for the world matrix before rendering 3D objects. At 60 FPS, with an easing calculation added to movements, it’s not very noticeable when moving forward, backward, left, or right, but when rotating it is very noticeable. Even with the easing code you can tell the rotations are accelerating and decelerating, and sometimes jerky.

I’ve tried truncating the calculations for CameraTarget and CameraPosition to regain some precision, but to no avail; the issue persists.

Is this a precision issue? And if so, how can I get more precision for rotations? If it’s not a precision problem, then what is it?

Random Jitter When Rotating Camera

30 October 2012 - 03:24 PM

I’ve got an issue where my camera jitters when I rotate left or right. It’s random, so pinpointing the problem has been difficult. Using PIX I can see a drop in frame rate when the jittering occurs. It’s not a constant jitter; it’s as if a frame or two randomly take longer to render. Right afterwards there is a jump in frame rate according to PIX, but since I lock my FPS to 60 in my render loop I don’t notice the upswing.

What is even more peculiar is that I don’t have to render anything to reproduce the issue: just the camera in the world, with no mesh, no sky, etc., and when I rotate left or right the frame rate randomly dips. Without VSync and without locking my FPS, the jittering (lag) is more noticeable. I don’t see the issue when moving forward or backwards, only when rotating.

I have a damping mechanism in place for my camera movement. I thought it might be the culprit, so I removed everything in my camera class except the rotation calculations. I’m not at my desk, so this is pseudocode:

CameraTarget.X = CameraPosition.X + (radius * Sin(vRadians))
CameraTarget.Y = CameraPosition.Y + (radius * Cos(vRadians)  * Cos(hRadians))
CameraTarget.Z = CameraPosition.Z + (radius * Cos(vRadians) * Sin(hRadians))

CameraUp.X = CameraPosition.X - CameraTarget.X
CameraUp.Y = ABS(CameraPosition.Y + (radius * Sin(vRadians + PI / 2)))
CameraUp.Z = CameraPosition.Z - CameraTarget.Z

After removing the damping code the issue remains. It’s also worth noting that PIX does not show any memory or thread changes.

Any input would be greatly appreciated.


Move Object Based on Camera's Direction

03 September 2012 - 11:18 AM

I have an object that I would usually move by adding to its position vector. If I want to move it forward, I add to the position vector’s Z value; to move it backward, I subtract from its Z value, and so on. I’m in a situation now where I need to move the object forward based on the camera’s look direction. If the camera is looking left, then I need to subtract from the object’s position vector’s X value, etc.

To calculate the position, I thought I’d simply put the object’s position into a new vector3, add to its Z value as before, transform that vector by the view matrix, and then set the object’s position to the result. However, this is not working as expected.

How can I transform my move (position) vector based on the camera’s look direction?

Trajectory / Matrices

24 July 2012 - 03:58 PM

I have a collection of bullet particles. Each frame I update their positions using a velocity vector. The bullets move from the camera (gun) outwards regardless of the movement of the camera, which is correct, but when the camera moves, the bullets in the distance move with it, which is incorrect. They should continue on their trajectories (driven by their velocities) regardless of the camera’s movement.

I think the problem is the world transform matrix: it attaches a mesh and the particle emitter to the camera, but it also causes bullets that have already been fired to move with the camera. The bullets should maintain their trajectory after they’ve left the gun.

Storing the world matrix for each particle is impractical, but it gives an idea of what I’m trying to accomplish: if I stored the current world matrix each time a particle was added to the collection, I could transform the particle’s position to maintain its trajectory. The question is: how can I get the particles to maintain their trajectory without storing the world matrix for each particle?

Performance Issue with Size

19 July 2012 - 07:56 AM

I have a repeater-type particle engine that uses quads in a dynamic vertex buffer. Each frame the positions in the buffer are updated, and the shader calculates the vertex positions from the view matrix so the quads face the camera. Inside this vertex shader the scaling is performed per vertex (scaling each particle independently).

When I increase the scaling variable significantly, the frame rate starts to drop. This is especially noticeable the closer the camera gets to the particle: the larger the scale variable, the larger the frame-rate drop.

At first I thought this might be a filtering issue, so I set all the filters in the shader to NONE. That did not solve the issue, so I thought it could be the auto-generated mipmaps and disabled those, but that did not solve it either.

I’ve noticed there is some sort of threshold for the scaling variable. For example, one particle can have a 64 x 64 px texture and another a 128 x 128 texture; the 128 particle will accept a larger scale factor before the frame rate drops.

I can put a ceiling on scaling and max size in the engine but I’d like to understand this behavior first.