SDL: eliminate calls to gluPerspective()



Hi all. The following code was assembled by copying and pasting bits of my game engine into a single function, to simplify my question. It works as expected: a red quadrilateral is drawn while the camera moves up and away from it, at just under 100 fps:

if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER | SDL_INIT_JOYSTICK) < 0) {
    return 0;
}
int parameters = SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_OPENGL;
if (F)
    parameters |= SDL_FULLSCREEN;

main_screen = SDL_SetVideoMode(X, Y, B, parameters);
x = X; y = Y; b = B; f = F;

glMatrixMode(GL_MODELVIEW);
gluPerspective(X, (X / Y), 1, 20);

for (float i = 0; i < 20; i += 0.1f) {

    gluPerspective(X, (X / Y), 1, 20);
    gluLookAt(0, 1 + i, 0, 0, 0, 2, 0, 1, 0);

    glColor3f(1, 0, 0);
    glClearColor(0, 0, 0, 0);
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
    glVertex3f(0.25, 0.25, 2);
    glVertex3f(0.75, 0.25, 2);
    glVertex3f(0.75, 0.75, 2);
    glVertex3f(0.25, 0.75, 2);
    glEnd();
    glFlush();
    SDL_GL_SwapBuffers();
    SDL_Delay(10);
}

SDL_Quit();

return 1;


My question is this: I'm writing my own camera class so that I can GPL the game engine some time in the future. I want to eliminate the repeated calls to gluPerspective(), so that my camera class doesn't need to know the resolution of the screen. What's the best way to do this? P.S. Please excuse the typos; I've been up for 28 hours and refuse to go to sleep until I finish this camera class.

You only need to call gluPerspective() once. Normally you make the projection matrix current when you call gluPerspective(), then switch back to the modelview matrix and don't touch the projection matrix again.
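In code, that setup might look like this (a minimal sketch; X and Y are the window dimensions from the snippet above, and the 45-degree field of view is an assumed value, not one from the original post):

```cpp
// One-time setup, after SDL_SetVideoMode():
glMatrixMode(GL_PROJECTION);   // perspective goes on the projection stack
glLoadIdentity();
gluPerspective(45.0, (double)X / (double)Y, 1.0, 20.0);

glMatrixMode(GL_MODELVIEW);    // switch back and stay here

// Per frame -- only the modelview matrix changes:
glLoadIdentity();
gluLookAt(0, 1 + i, 0,  0, 0, 2,  0, 1, 0);
```

With this split, the camera class only ever touches GL_MODELVIEW and never needs the screen resolution.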


The original problem that caused me to add gluPerspective() to the camera update methods had something to do with using the wrong matrix mode, but now it all falls into place. I've changed it and it now works.

If you happen to need gluPerspective this might be interesting.

That is interesting, but has anybody tested it to see if it's more efficient than gluPerspective? I'll read it to better understand how perspective correction works.

You should be able to Google gluPerspective and find the exact equation it evaluates behind the scenes. You can then build the perspective matrix yourself without needing GLU at all.
That's what I did for my engine.
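The matrix gluPerspective builds is documented, so reproducing it is a few lines. A minimal sketch (the function name `perspective` and the column-major `double[16]` layout are my own choices here, not something from the thread):

```cpp
#include <cmath>

// Fills m (column-major, as OpenGL expects) with the same matrix that
// gluPerspective(fovy, aspect, zNear, zFar) would multiply in.
// fovy is in degrees, matching gluPerspective.
void perspective(double fovy, double aspect, double zNear, double zFar,
                 double m[16])
{
    const double pi = 3.14159265358979323846;
    const double f  = 1.0 / std::tan(fovy * pi / 360.0); // cotangent of fovy/2

    for (int i = 0; i < 16; ++i) m[i] = 0.0;
    m[0]  = f / aspect;
    m[5]  = f;
    m[10] = (zFar + zNear) / (zNear - zFar);
    m[11] = -1.0;
    m[14] = (2.0 * zFar * zNear) / (zNear - zFar);
}
```

You could then load it with glMatrixMode(GL_PROJECTION); glLoadMatrixd(m); and drop the GLU dependency entirely.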

I'm no mathematician, so I probably won't understand the equation anyway. Quaternions are as far as my knowledge goes.

Is it worth spending some time learning this? I'm happy with how gluPerspective works; are there any new tricks that GLU can't do? My game engine is a personal project, so I'm always looking for a new learning opportunity, but I'd like to get something working to show people as soon as possible. My friends joke about what I'm doing when I spend four hours in front of my computer screen.

There's nothing groundbreaking here. gluPerspective is just a shortcut for setting up the view frustum a certain way. However, you'll see from the link posted above that the actual resolution isn't important, just the field of view in degrees and the aspect ratio. There won't be any efficiency difference between the two methods, as all you're doing is setting up a few parameters for the graphics system. If you've solved your original problem, don't worry about it.

Quote:
Original post by speciesUnknown
That is interesting, but has anybody tested it to see if it's more efficient than gluPerspective? I'll read it to better understand how perspective correction works.

Since all it does is a couple of trig calls and a handful of multiplies, it really couldn't be that inefficient. It works the same way as gluPerspective (by calling glFrustum), so no problems there.
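As a sketch of what that replacement boils down to (the helper name `frustumBounds` is mine; the linked article's code may differ), it derives the near-plane half-extents from the field of view and then hands them to glFrustum:

```cpp
#include <cmath>

// Computes the near-plane half-extents that make
// glFrustum(-halfW, halfW, -halfH, halfH, zNear, zFar)
// equivalent to gluPerspective(fovY, aspect, zNear, zFar).
void frustumBounds(double fovY, double aspect, double zNear,
                   double* halfW, double* halfH)
{
    const double pi = 3.14159265358979323846;
    *halfH = std::tan(fovY * pi / 360.0) * zNear; // half height at the near plane
    *halfW = *halfH * aspect;                     // half width
}
```

That's one tangent and a couple of multiplies per call, which is why efficiency isn't a concern here.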

Also, in your code snippet you should be placing the perspective matrix on the GL_PROJECTION stack (via glMatrixMode(GL_PROJECTION)) rather than directly in GL_MODELVIEW. AFAIK, GL makes some optimisation assumptions based on the two matrix modes, and if you mix them you may adversely affect performance.
