Pacman clone: Performance varies significantly from play to play...

6 comments, last by Omelas0469 15 years, 4 months ago
Hi, I'm working on a Pacman clone in C++ using GLUT for the graphics, built with VC++. After building the solution I'll run and close it several times, and each time I run it the performance (the speed) changes. In my game I have a timer that updates the characters' positions every 30 ms, and the characters move at a fixed rate of 1 unit per timer update. Why, then, do they move faster one run and slower the next?! I'm not talking about it slowing down for a couple of seconds: when it runs smoothly, it runs smoothly until I close it, and if it's choppy/slow, it stays that way until I close it. Huh!?!??!
I'm unsure of the benefits and disadvantages of having a timer run your update loop.

One thing I always recommend is to perform movement using delta time to achieve framerate independence.

"The right, man, in the wrong, place, can make all the dif-fer-rence in the world..." - GMan, Half-Life 2

A blog of my SEGA Megadrive development adventures: http://www.bigevilcorporation.co.uk

Nothing wrong with gating at 30ms. How did you implement your timer? If you used Sleep, you fail [smile]

Post the skeleton of your update loop

-me
lol I didn't use sleep

#include <stdlib.h>
#include <stdarg.h>
#include <stdio.h>
#include <math.h>
#include <time.h>  // for time(), used to seed rand()
#include <string>  // might be able to remove later - here for debugging
#include "Game.h"

using namespace std;

Game * game;

void init(void)
{
	Map::initializeMap();
	glDisable(GL_DEPTH_TEST); // 2D so don't want depth
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluPerspective(45.0, 1.0, 1.0, 400.0);
	glTranslatef(-10.0, -10.0, -30.0); // center map
	glMatrixMode(GL_MODELVIEW);
}

void displayScene()
{
	glClear(GL_COLOR_BUFFER_BIT);
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	glPushMatrix();
	Map::drawMap();
	game->drawCharacters();
	glPopMatrix();
	glFlush();
}

void Timer(int extra)
{
	game->moveCharacters();
	glutTimerFunc(30, Timer, 0); // re-arm: fire again in ~30 ms
}

void update()
{
	glutPostRedisplay();
}

void arrow_keys(int a_keys, int x, int y)
{
	game->movePacman(a_keys);
}

int main(int argc, char **argv)
{
	glutInit(&argc, argv);            // initialize GLUT
	glutInitDisplayMode(GLUT_RGBA);   // set RGBA display mode
	glutInitWindowPosition(325, 0);
	glutInitWindowSize(600, 600);
	glutCreateWindow("Pacman");
	init();
	game = new Game();
	srand((unsigned)time(0)); // for random number generation
	glutDisplayFunc(displayScene); // set the rendering function
	glutTimerFunc(0, Timer, 0);
	glutSpecialFunc(arrow_keys);
	glutIdleFunc(update);
	glutMainLoop(); // process events and wait until exit
	return 0;
}
If that GLUT timer function is a callback then there is probably no guarantee it actually gets called every 30 ms. You'd be much better off calculating the time between frames and using that delta to modify how much your ghosts move in a frame.
Alright I just did some reading on what you guys are talking about and it makes total sense.

I'm going to use the information provided at: http://hdrlab.org.nz/frame-rate-independent-animation-using-glut/assets/MiniGL-templates/GLUT-timertick-template.lha?PHPSESSID=9faf743b91d9116f37e833350981735a and I'll let you know the results.

Thanks!
Cool. And also, not that you really need to worry for this project, but GLUT is pretty outdated and most people nowadays recommend that you use something like SDL instead. Something to consider for your next project.

-me
Yeah, I actually didn't know GLUT was outdated until halfway into this project. The only graphics course offered at my university teaches OpenGL using GLUT, so it was all I knew. I'll definitely take your advice and move with the times for future projects :P

Good news!

I didn't even have to finish reading the article before I was able to solve my problem by modifying a couple of lines of code! The solution:
1. Cut glutPostRedisplay(); from the 'update' function and paste it into the 'Timer' function.
2. Delete the line glutIdleFunc(update);
3. Delete the function update.

I suppose this makes sense. Maybe I'm wrong, but I'm guessing the difference is that the display is now updated every 30 ms (constant time) according to the system clock, as opposed to an idle function whose call frequency can vary with CPU load.
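For anyone finding this thread later, the resulting callback would look roughly like this (a sketch based on the steps above, reusing the same global game pointer from the posted code; it is a fragment of that program, not a standalone file):

```cpp
// Sketch of the modified Timer callback after steps 1-3 above: the
// redisplay now runs on the same fixed 30 ms schedule as the movement
// update, instead of from an idle function.
void Timer(int extra)
{
    game->moveCharacters();
    glutPostRedisplay();         // moved here from the removed update()
    glutTimerFunc(30, Timer, 0); // re-arm for the next ~30 ms tick
}
```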

Thanks to all!

This topic is closed to new replies.
