void Display()
{
    UseFixedFunction();
    RestoreDefaultFramebuffer();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    if (gGlow)
    {
        gFBO->Bind();               // first pass renders into the glow FBO
    }

    RenderScene();                  // first scene pass

    if (gGlow)
    {
        UseFixedFunction();
        RestoreDefaultFramebuffer();
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        Blur();                     // shader-blurs the FBO contents

        RestoreDefaultFramebuffer();
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        RenderScene();              // second scene pass, to the default framebuffer

        UseFixedFunction();
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
        glColor4f(1, 1, 1, 0.9f);
        SetupOrtho();
        if (gBigGlow) glScalef(1.05f, 1.05f, 1.0f);

        // Composite the blurred glow texture over the scene as a full-screen quad.
        glBindTexture(GL_TEXTURE_2D, gFBOTex);
        glEnable(GL_TEXTURE_2D);
        glBegin(GL_QUADS);
            glTexCoord2f(0, 0); glVertex3f(-1, -1, 0);
            glTexCoord2f(1, 0); glVertex3f( 1, -1, 0);
            glTexCoord2f(1, 1); glVertex3f( 1,  1, 0);
            glTexCoord2f(0, 1); glVertex3f(-1,  1, 0);
        glEnd();
        glDisable(GL_TEXTURE_2D);
        glDisable(GL_BLEND);
    }

    glutSwapBuffers();
}
Crazy Anomaly
Take a look at the code above.
When gGlow is false (that is, there is no glow), the scene renders at a little under 60 FPS, which already puzzles me, since the scene is really simple. When gGlow is true, the frame rate nearly doubles; it's a noticeable difference (my scene rotates visibly faster). Mind you, my Blur function uses a shader to blur a full-screen quad 30 times.
This... doesn't seem possible to me. Does anyone have any clue what's going on?
Is RenderScene() truly invariant? If it does incremental work, such as advancing the rotation angle on every call, then the animation will look twice as fast in the glow path, because RenderScene() is called twice per frame there.
The 60 FPS cap is probably because you wait on VSync during SwapBuffers.
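One way to test the VSync theory (a sketch, and an assumption on my part: it uses GLFW, whereas the original code uses GLUT, which has no portable swap-interval call): disable the swap interval and see whether the non-glow path jumps well above 60 FPS.

```cpp
// Sketch only: assumes a current GLFW OpenGL context.
// glfwSwapInterval(0) asks the driver to present without waiting for the
// vertical retrace; if the non-glow path then runs far above 60 FPS, the
// original 60 FPS figure was a VSync cap, not the cost of RenderScene().
#include <GLFW/glfw3.h>

void DisableVSync()
{
    glfwSwapInterval(0);   // 0 = present immediately, 1 = wait for VSync
}
```

On Windows the equivalent for a GLUT/WGL context is wglSwapIntervalEXT(0) from the WGL_EXT_swap_control extension, and most drivers also let you force VSync off in their control panel.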