(C++) Limiting the Rate of Fire in my Asteroids Clone

6 comments, last by Side Winder 17 years, 1 month ago
How do I go about limiting the rate of fire? Right now I'm using 'break' to break out of the loop, but even with that, if the user holds down the space bar (the fire button), another bullet is fired almost immediately after, so I need some way of limiting it to ~2.0 bullets per second. Any ideas? Thanks.
Quick pseudocode:

const float time_between_bullets = 1.0f / 2.0f; // Two per second

if (current_time - last_bullet_time > time_between_bullets)
{
    fire();
    last_bullet_time = current_time;
}
else
{
    // Don't do anything, it's too soon
}
I have recently found myself writing this code or similar so many times that, coincidentally, my coding task for today is to write a general timer class so I can handle this kind of thing easily.

(Using C++) One tip, and maybe some others will disagree: when I was recently writing a D3D point particle system I came across a problem whereby the GetTickCount() calls weren't accurate enough to keep up with very high frame rates, and that caused problems in my emission and dynamics code. I found that using QueryPerformanceCounter() and QueryPerformanceFrequency() gave the kind of accuracy I needed. Over a 1/2 second period the difference will probably not be significant, but the inaccuracy of the tick count overwhelmed the dynamics when I was working at e.g. 200fps.

Any experts have any comments on this? I'd be interested to see how others implement accurate timing.

L
-

Visit http://www.mugsgames.com

Stroids, a retro style mini-game for Windows PC. http://barryskellern.itch.io/stroids

Mugs Games on Twitter: [twitter]MugsGames[/twitter] and Facebook: www.facebook.com/mugsgames

Me on Twitter [twitter]BarrySkellern[/twitter]

Thanks for the fast replies, guys. I've got it working, but the rate of fire isn't uniform... Over a period of time the average rate of fire is right, but sometimes two bullets are released almost at once, and then none for double the usual gap.
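One common fix for that burstiness (my suggestion, not from the thread): instead of resetting the timer to "now" when a shot fires, advance it by exactly one firing interval, so the allowed shot times stay evenly spaced even when the check runs at irregular frame times. A sketch, with invented names:

```cpp
// Rate limiter that keeps shots evenly spaced on average: the next
// allowed shot time advances in fixed interval steps rather than
// snapping to whatever irregular frame time the check happened on.
struct UniformGun {
    double interval = 0.5;       // seconds between shots (2 per second)
    double next_shot_time = 0.0; // absolute time the next shot is allowed

    // current_time is whatever game clock you already have, in seconds.
    bool try_fire(double current_time) {
        if (current_time >= next_shot_time) {
            if (next_shot_time > current_time - interval) {
                // We're on schedule: step forward by one interval so
                // a slightly late check doesn't shift the whole cadence.
                next_shot_time += interval;
            } else {
                // We fell far behind (key released for a while):
                // restart the cadence from now instead of catching up.
                next_shot_time = current_time + interval;
            }
            return true;
        }
        return false;
    }
};
```

The "fell far behind" branch matters: without it, releasing the key for a few seconds would bank up several allowed shots, giving exactly the double-bullet burst described above.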
Quote: Original post by PKLoki
"Any experts have any comments on this? I'd be interested to see how others implement accurate timing."
It's been a long time since I've had to handle timer inaccuracies. Both of my frameworks are designed to work correctly even with a locally inaccurate timer: even if your timer has an error of 20 milliseconds (GetTickCount is typically only accurate to about 16 milliseconds), it's the same error margin at any time. So, if your computations rely only on the time elapsed since the start of the game, you will always have at most a 20-millisecond error, which is imperceptible to the user.

My basic framework uses a fixed timestep. The invariant is, after T seconds, the system will have evaluated F*T timesteps, where F is the timestep frequency. The only consequence of imprecision in this framework is that you may be displaying the wrong timestep (within your timer's error margin), but the data itself will always be deterministically consistent. The top-level loop in this kind of framework looks like this:

const Time step_delta = Time::ms(16);
Time now = Time::get();

for (;;)
{
    while (now < Time::get())
    {
        game.update(now);
        now += step_delta;
    }
    graphics.display(game);
}


My advanced framework uses an internal scheduler which acts independently of time. The scheduler orders events by time of occurrence and executes them in that order, up to a "current time". A simplified example loop below:

for (;;)
{
    game.read_input(scheduler.now);
    while (scheduler.now < Time::get())
    {
        Event & e = scheduler.earliest();
        scheduler.now = e.time;
        e.execute();
    }
    graphics.display(game, Time::get());
}
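A minimal reconstruction of the scheduler idea (my own sketch, not the poster's actual code): pending events sit in a priority queue keyed by their time of occurrence, and `run_until` pops and executes them in time order, advancing the scheduler's clock event by event up to a "current time" limit:

```cpp
#include <functional>
#include <queue>
#include <vector>

// An event scheduled to run at an absolute time.
struct Event {
    double time;
    std::function<void()> execute;
};

// Orders the priority queue as a min-heap: earliest event on top.
struct CompareByTime {
    bool operator()(const Event& a, const Event& b) const {
        return a.time > b.time;
    }
};

struct Scheduler {
    double now = 0.0;
    std::priority_queue<Event, std::vector<Event>, CompareByTime> events;

    void schedule(double time, std::function<void()> fn) {
        events.push({time, std::move(fn)});
    }

    // Execute every pending event with time <= limit, in time order.
    void run_until(double limit) {
        while (!events.empty() && events.top().time <= limit) {
            Event e = events.top();
            events.pop();
            now = e.time;   // the scheduler's clock jumps event to event
            e.execute();
        }
        now = limit;
    }
};
```

In the loop above, `limit` would be `Time::get()`; because the scheduler's clock only moves via event timestamps, the simulation stays deterministic even if the real timer is coarse.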
So in the first example you gave, if I'm reading this correctly, the system may perform more than one game 'loop', as it were, before drawing anything to the screen? That is, the game runs its logic as many times as required, in fixed timesteps, until the simulation is no longer behind real time, and then it renders a frame. In that case, if a single update takes longer to run than the timestep, it would never render a frame, correct? Otherwise, it renders as many frames as it can, but never renders while the simulation is still behind.


That's correct. And this is probably what the OP's code is missing: a way to regulate cases where frames are rendered too often or not often enough.

Hm.. Yes. Well, maybe I'll sort it out in a few months when I have a clue what you're talking about :) As this is my first real game project I think I'll stick to what I've got for now and maybe make adjustments when I can.

This topic is closed to new replies.
