if (behave == SHOOT)
{
//set direction of shot
vectorf dir = target - origin;
p->force = dir.Unit ();
//rotate shot to face target
if (dir.x > 0.0f)
{
rotate.y = atan2f (-dir.z, dir.x) * (180.0f / PI);
}
else
{
rotate.y = atan2f (-dir.z, -dir.x) * (180.0f / PI);
}
//tilt shot to face target
rotate.z = atan2f (dir.y, dir.x) * (180.0f / PI);
}
...
if (behave == SHOOT)
{
glRotatef (rotate.z, 0.0f, 0.0f, 1.0f);
glRotatef (rotate.y, 0.0f, 1.0f, 0.0f);
}
...
glBegin (GL_TRIANGLES);
//arrowhead
glColor3f (1.0f, 1.0f, 1.0f);
glVertex3f (0.7f, 0.0f, -0.1f);
glVertex3f (0.7f, 0.0f, 0.1f);
glVertex3f (1.0f, 0.0f, 0.0f);
//shaft
glColor3f (0.8f, 0.0f, 0.0f);
glVertex3f (0.7f, 0.0f, -0.05f);
glVertex3f (0.0f, 0.0f, -0.05f);
glVertex3f (0.0f, 0.0f, 0.05f);
glVertex3f (0.7f, 0.0f, -0.05f);
glVertex3f (0.0f, 0.0f, 0.05f);
glVertex3f (0.7f, 0.0f, 0.05f);
glEnd ();
[edited by - pacman on January 15, 2004 5:34:39 PM]
vector rotation problem
Hey there, I have myself a bit of a problem. I have this projectile (that looks like an arrow -> right now), and I'm trying to get the rotations down so that the arrow points at the target when it's fired. I've finally gotten it to do that...most of the time. It seems that when the shooter and the target are lined up (or close to it) on the X axis, the rotation gets funny, and instead of pointing upwards/downwards like it's supposed to, the arrow twists. I'm assuming it has to do with the Z axis becoming the "twist axis" instead of the "tilt axis" (if you catch my drift), but I can't seem to get it to work. Here's some code; any help would be greatly appreciated.
You need to use the total length of the horizontal for calculating the pitch, atan2( y, sqrt( x*x + z*z ) ). Just using atan2( y, x ) doesn't take the z value into consideration. What if z were really big and x very small?
Hmmm...thanks for the reply AP, but that didn't seem to work.
I just tried something, and it seems that if I do either rotation by itself, they both do what they're supposed to no matter where the shooter or target sit. It's when I do both rotations that the bad condition shows up.
I don't recall how the gl rendering model works, but it appears to me that you might also be rotating about the z axis (pitch) prior to rotating about the y axis (yaw), which could cause some funny results.
// this is your pitch
glRotatef (rotate.z, 0.0f, 0.0f, 1.0f);
// this is your yaw
glRotatef (rotate.y, 0.0f, 1.0f, 0.0f);
Unless gl is a push down automaton, in which case this is correct; if not, try switching these two statements, because typically you'll always want to rotate about the y axis first.
Hey Simba, thanks for your idea. Changing the order did affect the rotation, but unfortunately it just caused another weird effect. Now the arrows point straight up instead of twisting.
Maybe I'm going about this the wrong way? I'm determining the angle of rotation when the shot is created, and then before I call the display list to draw the arrow, I translate and rotate the arrow accordingly. Perhaps there is a better way to do this, due to the multiple rotations?
Hey, what do you know, I got it to work. I tried what both you guys suggested together, plus reorienting the arrow in the display list and tweaking a couple of angles, and now it works like a champ. Thank you guys for helping me finally get over this! :D
You're definitely on the right track. Although, after looking at the code I use for getting pitch and yaw from a vector, I'm not sure why you're changing the sign on the x/z values of the direction vector for getting the yaw. Here's a code snippet that might help. I've defined my own vector3 class and defined a GetYaw and GetPitch function as follows:
class vector3
{
public:
    float x;
    float y;
    float z;

    float GetYaw ( void ) const
    {
        return ( (float)atan2( x, z ) );
    }

    float GetPitch( void ) const
    {
        float L = (float)sqrt( x*x + z*z );
        return ( -(float)atan2( y, L ) );
    }

    const vector3& operator = ( const vector3& rhs )
    {
        x = rhs.x;
        y = rhs.y;
        z = rhs.z;
        return *this;
    }
};

//
// Assume these are vector3's and are set with your values
//
// Target;   // what we're pointing at
// ArrowPos; // current position

// determine the vector from ArrowPos to our Target
vector3 LookRay = Target - ArrowPos;
// Normalize
LookRay.Normalize();
// Now you can get the rotation simply by calling
float Yaw   = LookRay.GetYaw()   * 180/PI;
float Pitch = LookRay.GetPitch() * 180/PI;
Hope this is a bit more helpful.