
Hi! I'm trying to make a camera. Can anyone point out what I'm doing wrong? It's perhaps more math than programming.

I want to move in the direction v = [0, 0, S, 0], in the camera's local coordinates. Then I need to rotate that vector so I get it in global coordinates. The angles I want to rotate by are yaw, pitch and roll. Say my rotation matrix is M. Then I get Mv = [S·m13, S·m23, S·m33, S·m43], so I only need to care about the third column of the matrix. I should multiply the separate matrices in the order Yaw * Pitch * Roll. Then I get:

m13 = sin(yaw) * cos(roll) + cos(yaw) * sin(roll) * sin(pitch)
m23 = cos(yaw) * cos(roll) * sin(pitch) - sin(yaw) * sin(roll)
m33 = cos(yaw) * cos(pitch)
m43 = 0

My code, which tries to do this:
void movePlane(float secsPerFrame)
{
    GLdouble m13, m23, m33;
    // Third column of the combined rotation matrix (heading = yaw)
    m13 = sin(camera.heading) * cos(camera.roll) + cos(camera.heading) * sin(camera.roll) * sin(camera.pitch);
    m23 = cos(camera.heading) * cos(camera.roll) * sin(camera.pitch) - sin(camera.heading) * sin(camera.roll);
    m33 = cos(camera.heading) * cos(camera.pitch);
    // Advance the camera along the rotated forward vector
    camera.x += secsPerFrame * -speed * m13;
    camera.y += secsPerFrame * -speed * m23;
    camera.z += secsPerFrame * -speed * m33;
}
I don't get the proper result. The plane takes off when I increase the pitch, for instance. :)
