Slope problem


Hello all you game programmers,

I've got a small problem you might be able to help me out with. I'm designing a small game, purely for learning purposes. It's a 2D game with a single static background (in the style of Picking Sticks), where you run around and shoot bullets at enemies; a small crosshair is controlled with the mouse and the character with the keyboard.

My problem is the following: when I shoot bullets, they travel faster the steeper the slope of their direction of travel gets, whether I aim far up or far down.

To express this mathematically:

Character (Xa, Ya)
Crosshair (Xb, Yb)

DirectionVertex_X = Xb-Xa
DirectionVertex_Y = Yb-Ya

DirectionVertex_Slope = DirectionVertex_Y/DirectionVertex_X

Now that we have the DirectionVertex, we can calculate each step of the bullet:

Bullet->x += 1
Bullet->y += DirectionVertex_Slope*1

Maybe you can see the problem already.

The further Y drifts away from X, the larger the steps in Y become, while the steps in X stay constant at 1.

So the speed increases the further up or down I aim, and gets smaller the closer X and Y are to each other.
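To make the effect concrete, here is a small sketch (not code from my game, just an illustration): with "x += 1; y += slope" the bullet covers sqrt(1 + slope*slope) pixels per step, so the distance grows as the aim gets steeper.

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Distance covered per step when x advances by 1 and y by slope. */
    float slopes[] = { 0.0f, 0.5f, 1.0f, 2.0f, 5.0f };
    for (int i = 0; i < 5; ++i) {
        float dist = sqrtf(1.0f + slopes[i] * slopes[i]);
        printf("slope %.1f -> %.2f pixels per step\n", slopes[i], dist);
    }
    return 0;
}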

I hope you understand my problem and can find an answer to it, maybe with a completely different approach.

Thanks in advance

Normalize your step increment vector:

#include <math.h>   /* for sqrtf */

float length, stepx, stepy;

stepx = Xb - Xa;
stepy = Yb - Ya;

/* Length of the direction vector. */
length = sqrtf(stepx * stepx + stepy * stepy);

/* Guard against division by zero when the crosshair sits exactly on the character. */
if (length > 0.0f) {
    stepx /= length;
    stepy /= length;

    /* The bullet now advances exactly one unit per update, regardless of aim direction. */
    Bullet->x += stepx;
    Bullet->y += stepy;
}
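
If you want the bullet to cover more than one pixel per update, you can scale the normalized vector by a speed value. A minimal sketch, assuming a hypothetical bulletSpeed in pixels per update:

float bulletSpeed = 4.0f; /* hypothetical speed, pixels per update */

Bullet->x += stepx * bulletSpeed;
Bullet->y += stepy * bulletSpeed;

Because (stepx, stepy) always has length 1, the bullet moves at the same speed in every direction.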

