Slope problem

1 comment, last by KaiSkews 12 years, 9 months ago
Hello all you game programmers,

I've got a small problem you might be able to help me out with. I'm designing a small game, purely for learning purposes. It's a 2D game on a single static background (in the style of Picking Sticks), where you run around and shoot bullets at enemies; a small crosshair is controlled with the mouse and the character with the keyboard.

My problem is the following: when I shoot bullets, they travel faster the steeper the slope of their direction is, i.e. the further up or down I aim.

To express this mathematically:

Character (Xa, Ya)
Crosshair (Xb, Yb)

DirectionVertex_X = Xb-Xa
DirectionVertex_Y = Yb-Ya

DirectionVertex_Slope = DirectionVertex_Y/DirectionVertex_X
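
For example, with the character at (100, 100) and the crosshair at (104, 102), DirectionVertex_X = 4 and DirectionVertex_Y = 2, so the slope is 2/4 = 0.5: each step would move the bullet 1 unit in X and 0.5 units in Y.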

Now that we have the DirectionVertex, we can calculate each step of the bullet:


Bullet->x += 1;                          /* X always advances by exactly 1 */
Bullet->y += DirectionVertex_Slope * 1;  /* Y advances by the slope        */

Maybe you can see the problem already.

The further the Y distance drifts from the X distance, the larger the Y steps become, while the X steps stay constant at 1.

So the bullet gets faster the further I aim up or down, and slower the closer the X and Y distances are to being equal.
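
To put numbers on it (a quick arithmetic sketch, not code from the original post): with the stepping above, each update moves the bullet a distance of sqrt(1 + slope^2), so the per-step speed depends entirely on the aim angle.


/* requires <math.h> for sqrtf() */

/* Per-step travel distance under the slope scheme: step = (1, slope). */
float distance_per_step = sqrtf(1.0f + DirectionVertex_Slope * DirectionVertex_Slope);

/* slope = 0  -> 1.000 units per step (aiming straight sideways)
   slope = 1  -> 1.414 units per step (aiming diagonally)
   slope = 50 -> 50.010 units per step (aiming nearly straight up/down) */


A near-vertical shot therefore travels dozens of times faster than a horizontal one.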

I hope my problem is clear and that someone can find an answer, maybe even with a completely different approach.

Thanks in advance
Normalize your step increment vector:


/* requires <math.h> for sqrtf() */

float stepx = Xb - Xa;
float stepy = Yb - Ya;

/* Length of the direction vector (Pythagoras). */
float length = sqrtf(stepx * stepx + stepy * stepy);

/* Guard against dividing by zero when the crosshair
   sits exactly on top of the character. */
if (length > 0.0f)
{
    stepx /= length;   /* now a unit vector: length 1,  */
    stepy /= length;   /* whatever the aiming direction */

    Bullet->x += stepx;
    Bullet->y += stepy;
}


tada!
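
One more note (a sketch with a made-up speed value, not from the original post): once the vector is normalized, you can multiply it by any constant to get a faster bullet whose speed is still independent of direction.


float speed = 5.0f;           /* hypothetical: units per update      */

Bullet->x += stepx * speed;   /* same direction as before, but       */
Bullet->y += stepy * speed;   /* 5 units per step instead of 1       */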
Woohoo, thanks :D Works like a charm. Lovely. Thank you!!

