Rendering efficient 2D sprites in OpenGL using Texture Rectangles

Published November 02, 2007 by Brandon Fleming, posted by Myopic Rhino

Ever wondered how to make a 2D game using OpenGL? Ever wanted to write an efficient, easy-to-use sprite engine in OpenGL with the full functionality of Direct3D's ID3DXSprite interface? Hopefully this article can give you a hand with your questions and endeavors. I've read and heard many requests for advice on writing 2D games with the OpenGL API, and I too hesitated to start writing 2D games with OpenGL. Then I found an implementation that is fast, efficient, and offers the same functionality Microsoft's Direct3D would give you: texture rectangles. Before I begin, I want to explain some of the benefits that come along with this technique:

  • No need to use glDrawPixels, glCopyPixels, etc. - Textured quads give better image quality than those calls.
  • Textures do not need to be square or a power of 2 - Using a non-power-of-2 texture with GL_TEXTURE_2D can hurt performance on many video cards. You could always use gluBuild2DMipmaps to work around this, but it uses up more video memory than is really needed.
  • Actual bitmap dimensions can be used - It's much easier to use the actual bitmap dimensions than to calculate normalized (s,t) coordinates from them, and it saves you time by eliminating the need for such conversions altogether.
I'm assuming that the reader has a good understanding of how OpenGL works; to keep this article as simple as possible, I'm not going to get detailed on things that don't pertain to the subject. So if you're ready, let's dive into the code :) The first thing we want to do is set up our viewport and projection for 2D rendering. The code I used is a modified version of the original written by Dwarf with Axe here on GameDev.net:

//-----------------------------------------------------------------------------
// Name: glEnable2D
// Desc: Enables 2D primitive rendering by setting up the appropriate orthographic
//               projection and matrices.
//-----------------------------------------------------------------------------
void glEnable2D( void )
{
        GLint iViewport[4];

        // Get a copy of the viewport
        glGetIntegerv( GL_VIEWPORT, iViewport );

        // Save a copy of the projection matrix so that we can restore it 
        // when it's time to do 3D rendering again.
        glMatrixMode( GL_PROJECTION );
        glPushMatrix();
        glLoadIdentity();

        // Set up the orthographic projection
        glOrtho( iViewport[0], iViewport[0]+iViewport[2],
                         iViewport[1]+iViewport[3], iViewport[1], -1, 1 );
        glMatrixMode( GL_MODELVIEW );
        glPushMatrix();
        glLoadIdentity();

        // Make sure depth testing and lighting are disabled for 2D rendering until
        // we are finished rendering in 2D
        glPushAttrib( GL_DEPTH_BUFFER_BIT | GL_LIGHTING_BIT );
        glDisable( GL_DEPTH_TEST );
        glDisable( GL_LIGHTING );
}


//-----------------------------------------------------------------------------
// Name: glDisable2D
// Desc: Disables 2D rendering and restores the matrices and render states
//               to what they were before glEnable2D() was called.
//-----------------------------------------------------------------------------
void glDisable2D( void )
{
        glPopAttrib();
        glMatrixMode( GL_PROJECTION );
        glPopMatrix();
        glMatrixMode( GL_MODELVIEW );
        glPopMatrix();
}
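Taken together, these two functions bracket all of your 2D drawing. Here is a minimal sketch of how a typical frame might mix 3D and 2D rendering; Render3DScene() and RenderSprites() are hypothetical placeholders for your own drawing code, and g_hDC stands in for whatever device context your Win32 setup uses:

//-----------------------------------------------------------------------------
// Name: RenderFrame
// Desc: Sketch of a typical frame that mixes 3D and 2D rendering
//-----------------------------------------------------------------------------
void RenderFrame( void )
{
        glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

        // Draw the 3D portion of the scene first (if any)
        Render3DScene();

        // Switch to 2D, draw the sprites, then restore the 3D states
        glEnable2D();
        RenderSprites();
        glDisable2D();

        // Present the finished frame
        SwapBuffers( g_hDC );
}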
To use these functions, simply call glEnable2D() after any 3D rendering and before you start rendering in 2D, then call glDisable2D() once your 2D rendering is done, as in the sketch above. Now that we have our 2D rendering code set up, we need to check whether texture rectangles are supported. I personally use GL_NV_texture_rectangle, but you can use GL_EXT_texture_rectangle (or GL_ARB_texture_rectangle) if your video card doesn't support the NVIDIA extension. I use the NVIDIA extension because, at the time of writing, I'm running an NVIDIA GeForce 6600 (256MB, PCI-E), but any of these should work fine. Also, make sure you have the latest version of glext.h if you need it. Code? Okay...

//-----------------------------------------------------------------------------
// Name: InitScene
// Desc: Initializes extensions, textures, render states, etc. before rendering
//-----------------------------------------------------------------------------
int InitScene( void )
{
        // Is the extension supported on this driver/card?
        if( !glh_extension_supported( "GL_NV_texture_rectangle" ) )
        {
                printf( "ERROR: Texture rectangles not supported on this video card!" );
                Sleep(2000);
                exit(-1);
        }

        // NOTE: If your card doesn't support GL_NV_texture_rectangle, you can
        // try GL_EXT_texture_rectangle instead; it should work fine.

        // Disable lighting
        glDisable( GL_LIGHTING );

        // Disable dithering
        glDisable( GL_DITHER );

        // Disable blending (for now)
        glDisable( GL_BLEND );

        // Disable depth testing
        glDisable( GL_DEPTH_TEST );

        return LoadSpriteTexture();
}
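The function glh_extension_supported() comes from the NVIDIA SDK v9.5 (see glh_extensions.h). I was too lazy to write my own, plus I'm assuming you all know how to check extensions anyway :) If you'd rather not depend on the SDK header, though, a hand-rolled check along the lines of this sketch should do; it assumes a legacy (pre-3.0) context, where glGetString( GL_EXTENSIONS ) returns all extension names in one space-separated string, and it needs <string.h>:

//-----------------------------------------------------------------------------
// Name: IsExtensionSupported
// Desc: Checks the extension string for a whole-word match of szName
//-----------------------------------------------------------------------------
int IsExtensionSupported( const char *szName )
{
        const char *szExtensions = (const char*)glGetString( GL_EXTENSIONS );
        const char *szPos = szExtensions;
        size_t uNameLen = strlen( szName );

        if( !szExtensions )
                return 0;

        while( ( szPos = strstr( szPos, szName ) ) != NULL )
        {
                // Accept the match only if it is bounded by spaces or the ends
                // of the string, so "GL_NV_texture_rectangle" can't match
                // inside some longer extension name
                if( ( szPos == szExtensions || szPos[-1] == ' ' ) &&
                    ( szPos[uNameLen] == ' ' || szPos[uNameLen] == '\0' ) )
                        return 1;
                szPos += uNameLen;
        }

        return 0;
}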
Moving on, now that we have determined that this video card and driver support hardware-accelerated texture rectangles, we can continue by loading our texture(s). Be sure to enable GL_TEXTURE_RECTANGLE_NV or GL_TEXTURE_RECTANGLE_EXT (whichever matches the extension you are using) with glEnable(), or else the texture will not show. From then on, simply replace GL_TEXTURE_2D with the token matching your extension.

// Enable the texture rectangle extension
glEnable( GL_TEXTURE_RECTANGLE_NV );

// Generate one texture ID
glGenTextures( 1, &g_uTextureID );
// Bind the texture using GL_TEXTURE_RECTANGLE_NV
glBindTexture( GL_TEXTURE_RECTANGLE_NV, g_uTextureID );
// Enable bilinear filtering on this texture
glTexParameteri( GL_TEXTURE_RECTANGLE_NV, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_RECTANGLE_NV, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
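// NOTE: Texture rectangles do not support mipmaps, so stick with
// GL_NEAREST or GL_LINEAR here; mipmapped minification filters are
// invalid for this target.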

// Write the 32-bit RGBA texture buffer to video memory
glTexImage2D( GL_TEXTURE_RECTANGLE_NV, 0, GL_RGBA, pTexture_RGB->sizeX, pTexture_RGB->sizeY,
                          0, GL_RGBA, GL_UNSIGNED_BYTE, pTexture_RGBA );

// Save a copy of the texture's dimensions for later use
g_iTextureWidth = pTexture_RGB->sizeX;
g_iTextureHeight = pTexture_RGB->sizeY;
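The code above assumes pTexture_RGBA already exists: a 32-bit copy of the loaded 24-bit image, with the alpha channel carrying the color key. The conversion itself isn't shown above, so here is one possible sketch. The key color (pure magenta, 255/0/255) is an assumption; substitute whatever key your artwork uses. It needs <stdlib.h> for malloc():

//-----------------------------------------------------------------------------
// Name: BuildRGBABuffer
// Desc: Expands a 24-bit RGB image to 32-bit RGBA, making every pixel
//       that matches the color key fully transparent
//-----------------------------------------------------------------------------
unsigned char *BuildRGBABuffer( const unsigned char *pRGB, int iWidth, int iHeight )
{
        int i, iCount = iWidth * iHeight;
        unsigned char *pRGBA = (unsigned char*)malloc( iCount * 4 );

        if( !pRGBA )
                return NULL;

        for( i = 0; i < iCount; i++ )
        {
                unsigned char r = pRGB[i*3+0];
                unsigned char g = pRGB[i*3+1];
                unsigned char b = pRGB[i*3+2];

                pRGBA[i*4+0] = r;
                pRGBA[i*4+1] = g;
                pRGBA[i*4+2] = b;

                // Transparent where the pixel matches the key, opaque elsewhere
                pRGBA[i*4+3] = ( r == 255 && g == 0 && b == 255 ) ? 0 : 255;
        }

        return pRGBA;
}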
Now that we have loaded the texture, we can render it onto a primitive such as a quad. This is the easiest part, but be careful not to blit your texture upside down: even though you're using the bitmap's pixel dimensions, the coordinates still follow the (s,t) convention, so the origin is the bottom left-hand corner.

// Enable 2D rendering
glEnable2D();
        
// Make the sprite 2 times bigger (optional)
glScalef( 2.0f, 2.0f, 1.0f );

// Blend the color key into oblivion! (optional)
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );

// Set the primitive color to white
glColor3f( 1.0f, 1.0f, 1.0f );
// Bind the texture to the polygons
glBindTexture( GL_TEXTURE_RECTANGLE_NV, g_uTextureID );

// Render a quad
// With the GL_NV_texture_rectangle extension, texture coordinates are
// given in the texture's actual pixel dimensions instead of normalized
// (s,t) values. This makes using 2D sprites for games and emulators
// much easier now that you won't have to convert :)
glBegin( GL_QUADS );
        glTexCoord2i( 0, g_iTextureHeight );
        glVertex2i( 0, 0 );
        glTexCoord2i( g_iTextureWidth, g_iTextureHeight );
        glVertex2i( g_iTextureWidth, 0 );
        glTexCoord2i( g_iTextureWidth, 0 );
        glVertex2i( g_iTextureWidth, g_iTextureHeight );
        glTexCoord2i( 0, 0 );
        glVertex2i( 0, g_iTextureHeight );
glEnd();

// Disable 2D rendering
glDisable2D();
Now give this code a spin and you'll get your sprite rendered on screen, twice its original size, with the color-keyed areas blended away.
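If you plan on drawing more than one sprite, the quad code above folds naturally into a small helper. Here is a hypothetical DrawSprite() sketch (not part of the code above) that draws a texture rectangle at its native size; it assumes glEnable2D() has already been called and that the texture was created against GL_TEXTURE_RECTANGLE_NV as shown earlier:

//-----------------------------------------------------------------------------
// Name: DrawSprite
// Desc: Draws a texture-rectangle sprite at (x, y) at its native size
//-----------------------------------------------------------------------------
void DrawSprite( GLuint uTexture, int x, int y, int iWidth, int iHeight )
{
        glBindTexture( GL_TEXTURE_RECTANGLE_NV, uTexture );

        glBegin( GL_QUADS );
                glTexCoord2i( 0, iHeight );
                glVertex2i( x, y );
                glTexCoord2i( iWidth, iHeight );
                glVertex2i( x + iWidth, y );
                glTexCoord2i( iWidth, 0 );
                glVertex2i( x + iWidth, y + iHeight );
                glTexCoord2i( 0, 0 );
                glVertex2i( x, y + iHeight );
        glEnd();
}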

And there you have it! You can now render 2D sprites as efficiently as you would in Direct3D! Sure, it's a little more code, but in the end it works! Of course, you could take this further with your own C++ wrapper classes or structures, but I wanted to keep things simple for educational purposes. I'm also sure there are many ways this code can be improved, and you can implement those improvements as you see fit in your own code. If you have any questions or suggestions, please e-mail me at blueshogun96@gmail.com. Happy coding!

References:

Last Post about 2D in OpenGL
NVIDIA SDK
