linkingabo

OpenGL Continuous rendering of animations



Hey there! Could anyone help me? This is my very first project in OpenGL. We have to make a 3D animation of a scene in which a robot mixes the letters of our name and creates some other words from it.

My problem is this: I have my animation split into a series of keyframes, and at each keyframe something changes. For example, at one keyframe the robot moves forward or picks up a letter, and at the next it places that letter somewhere else. But I don't know how to join these keyframes/steps together using some timer function so that my animation runs smoothly. Right now it just starts with the first keyframe, renders it, and then after a given time delay (with SDL_Delay) renders the next keyframe/scene/step. So there are no transitions between the scenes, even though I intended to compute the in-between motion and render it smoothly.

All I can see now is the robot standing at keyframe 1, then standing next to the table at keyframe 2, then holding the letter at keyframe 3, and so on, but there is no connection between the scenes, even though the "walking to the table" motion should happen between keyframes 1 and 2.

Please help me!

What you're looking for is some form of interpolation. In your case you can probably get away with linear interpolation. Consider: you have N points (vertices) at fixed locations in position A. You also have N points, also at fixed locations, in position B. Your aim is to move all points from position A to position B over some finite time (say, 1 second). To do this smoothly, you just match each point in position A to a corresponding point in position B and start drawing. Each time you draw your robot, you calculate how long your robot has been moving (use timeGetTime() on Windows for simple timing) and you interpolate all of the points so that they transition smoothly from A to B. Linear interpolation is done as:

x = (x1 - x0) * t + x0

Here, in your case, x is the current position of a vertex (what you need to calculate each frame), x0 is its position at A, and x1 is its position at B. t is a fractional value from 0 to 1 (a time index) that denotes how far along the current animation is.
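
For example, as a minimal sketch in C++ (the helper name lerp is my own choice, not something from this thread):

// Linear interpolation between x0 and x1.
// t is the fractional time index, running from 0 (at A) to 1 (at B).
float lerp(float x0, float x1, float t)
{
    return (x1 - x0) * t + x0;
}

You would apply this per component (x, y, z) to every vertex of the robot, every frame.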

So, if you want to transition from A to B over 1 second, then all you have to do is call timeGetTime() when the animation starts to record when your movement began, and call it again each frame to see how much time has elapsed since then. timeGetTime() gives you a time index in milliseconds since system start, so you need to calculate:

t = (t_cur - t0) / 1000.f

to find out how far along the robot's movement is in the current animation sequence. Here t is your fractional time index, t_cur is the return value of timeGetTime() this frame, and t0 is the return value of timeGetTime() when the animation began.
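
Putting the two formulas together, here is a sketch of a per-frame update. Since you're already using SDL, I've used SDL_GetTicks() (which also returns milliseconds) in place of the Windows-only timeGetTime(); the names g_animStart, kDuration, startAnimation, and updateRobot are illustrative, not anything you already have:

#include <SDL.h>

static Uint32 g_animStart;            // t0: tick count when the movement began
static const float kDuration = 1.0f;  // length of the A-to-B transition, in seconds

// Call this once, at the moment the robot starts moving.
void startAnimation()
{
    g_animStart = SDL_GetTicks();
}

// Call this every frame before drawing.
// a and b hold the n vertex coordinates at positions A and B;
// current receives the interpolated coordinates to draw this frame.
void updateRobot(const float* a, const float* b, float* current, int n)
{
    // t = (t_cur - t0) / 1000.f, scaled by the transition duration.
    float t = (SDL_GetTicks() - g_animStart) / 1000.0f / kDuration;
    if (t > 1.0f) t = 1.0f;           // clamp so the robot stops exactly at B

    for (int i = 0; i < n; ++i)
        current[i] = (b[i] - a[i]) * t + a[i];   // x = (x1 - x0) * t + x0
}

When t reaches 1, the current segment is finished: swap in the next pair of keyframe positions and call startAnimation() again, and the robot will flow from one step of your animation to the next instead of jumping.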
