Slope problem

KaiSkews    102
Hello all you game programmers,

I've got a small problem you may be able to help me out with. I'm designing a small game, purely for learning purposes. It's a 2D game with a single background (in the style of Picking Sticks), where you run around and shoot bullets at enemies, with a small crosshair controlled by the mouse and the character controlled by the keyboard.

My problem is the following: when I shoot bullets, they move faster whenever the slope of their travel direction goes either far up or far down.

To express this mathematically:

Character (Xa, Ya)
Crosshair (Xb, Yb)

DirectionVector_X = Xb - Xa
DirectionVector_Y = Yb - Ya

DirectionVector_Slope = DirectionVector_Y / DirectionVector_X

Now that we have the direction vector, we can calculate every step of the bullet:


[code]
Bullet->x += 1;
Bullet->y += DirectionVector_Slope * 1;
[/code]

Maybe you can see the problem already.

The steeper the slope, the larger the Y steps become while the X step stays constant at 1, so each step moves the bullet a distance of sqrt(1 + slope^2).

So the speed grows the further I aim up or down, and is lowest when I aim close to horizontal.
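
To see it in numbers, here is a minimal stand-alone C snippet (the slope values are just illustrative examples, not from my game) that prints the distance a bullet covers per step:

[code]
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* X advances by 1 per step, Y by the slope,
       so each step covers sqrt(1 + slope * slope) */
    float slopes[] = { 0.0f, 1.0f, 2.0f, 10.0f };  /* example slopes */
    for (int i = 0; i < 4; ++i) {
        float s = slopes[i];
        printf("slope %5.1f -> step length %f\n", s, sqrtf(1.0f + s * s));
    }
    return 0;
}
[/code]

At slope 0 the bullet moves 1 unit per step; at slope 10 it moves just over 10 units per step.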

I hope you understand my problem and can help me find an answer, maybe with a completely different approach.

Thanks in advance

deftware    1778
Normalize your step increment vector:

[code]
#include <math.h>

float length, stepx, stepy;

/* vector from the character to the crosshair */
stepx = Xb - Xa;
stepy = Yb - Ya;

/* its length, via Pythagoras */
length = sqrtf(stepx * stepx + stepy * stepy);

if (length > 0.0f) {     /* avoid dividing by zero when the crosshair sits on the character */
    stepx /= length;     /* (stepx, stepy) now has length 1, */
    stepy /= length;     /* so every step covers the same distance */

    Bullet->x += stepx;
    Bullet->y += stepy;
}
[/code]
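
That gives the bullet a speed of exactly one unit per frame. If you want it faster, scale the normalized vector by a speed; a rough sketch (the speed and dt values below are made-up examples, not anything from the code above):

[code]
float speed = 300.0f;        /* hypothetical bullet speed, units per second */
float dt    = 1.0f / 60.0f;  /* hypothetical frame time, seconds */

Bullet->x += stepx * speed * dt;
Bullet->y += stepy * speed * dt;
[/code]

Also worth doing: compute the normalized direction once when the bullet is fired and store it in the bullet, instead of redoing the square root every frame.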

tada!
