# OpenGL unexpected texture aliasing


## Recommended Posts

It took me a few tries to figure out the order in which to call the GL functions to add texturing to my program, and now I've run into a texturing roadblock. The texture I designed is 640x480, as are the OpenGL/GLUT window I'm using and the rectangle (quad) onto which the texture is being mapped. However, when the buffer is output to the screen, the texture becomes aliased such that the 1:1 pixel correspondence is no longer preserved. I've tried the various glTexParameteri options to resolve this problem, but the changes in the amount of aliasing are minimal, if any. Short of rendering the screen pixel by pixel and specifying each color manually (which I'd rather avoid if possible), how can I overcome this?

##### Share on other sites
Is the client area 640x480, or just the window? A 640x480 window includes the title bar and the borders.

##### Share on other sites
The client area of the window is 640x480 when I create it with a call to glutInitWindowSize(640,480), so it seems this function doesn't include the title bar and border in the specified size. One thing I did realize, though, is that the rectangle I'm rendering the texture onto isn't actually 640x480, which may be causing unnecessary scaling of the texture. I had been using the following code to make the rectangle the appropriate size for the screen when displayed depth units into the screen from the camera:
```cpp
double depth = 1.5;
double AR = double(width)/height;
double viewangle = camera->getAngle();
double hW = depth*tan(viewangle/2);
double hH = (depth/AR)*tan(viewangle/2);
printf( "hw: %f\thh: %f\n", hW, hH );
glColor3f(1,1,1);
tm->UseTexture(0);
glTranslated(0,0,-depth);
glBegin( GL_QUADS );
    glTexCoord2d(0,0); glVertex3d( -hW, -hH, 0 );
    glTexCoord2d(1,0); glVertex3d(  hW, -hH, 0 );
    glTexCoord2d(1,1); glVertex3d(  hW,  hH, 0 );
    glTexCoord2d(0,1); glVertex3d( -hW,  hH, 0 );
glEnd();
glTranslated(0,0,depth);
```

The above code results in a rectangle that's approximately 0.26 x 0.16 units, which would mean some pretty drastic scaling. To reduce the amount of scaling, I tried converting the code to the following:
```cpp
double depth = 1.5;
double AR = double(width)/height;
double viewangle = camera->getAngle();
double hW = depth*tan(viewangle/2);
double hH = (depth/AR)*tan(viewangle/2);
printf( "hw: %f\thh: %f\n", hW, hH );
glColor3f(1,1,1);
tm->UseTexture(0);
glTranslated(0,0,-depth);
glScalef((2*hW)/640,(2*hH)/400,1.0);
glBegin( GL_QUADS );
    glTexCoord2d(0,0); glVertex3d( -320, -200, 0 );
    glTexCoord2d(1,0); glVertex3d(  320, -200, 0 );
    glTexCoord2d(1,1); glVertex3d(  320,  200, 0 );
    glTexCoord2d(0,1); glVertex3d( -320,  200, 0 );
glEnd();
glTranslated(0,0,depth);
```

However, this change didn't resolve the problem at all; the aliasing remains. Are there any possibilities I haven't thought of?

##### Share on other sites
The size of your quad in object or world space wouldn't affect the appearance of the texture, only its size in screen space, which I think is exactly the same in your two cases (i.e. full screen). And since you're just rendering a full-screen quad, it's much easier to skip that math and use a projection matrix that lets you specify your vertices either in normalized device coordinates ([-1, 1]) or in screen coordinates ([0, screenSize]), like this:

```cpp
// To specify the quad in normalized device coordinates, just set the
// projection and modelview matrices to identity
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();

// Now draw the quad from (-1,-1) to (1,1)
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();

// Remember to pop the projection and modelview matrices again to restore
// them to what they were previously

// To specify the quad in screen coordinates, it's very similar except you
// use an orthographic projection matrix with the same dimensions as the screen
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(0.0, screenWidth, 0.0, screenHeight, -1.0, 1.0);
// The rest is the same except the quad is now drawn from (0,0) to
// (screenWidth, screenHeight)
```

As for the aliasing problem... how are you creating your texture? Are you using gluBuild2DMipmaps? That internally resizes the image data you send it to the nearest power-of-two dimensions (512x512 in this case), most likely using a very simple box filter, so the quality will certainly degrade from the original image.

##### Share on other sites
As for the rendering methods for the quad itself, the various options you mentioned don't seem to make much of a difference in fixing the problem, but thanks anyway; I now know some more convenient ways to render to the screen. I did happen to be using gluBuild2DMipmaps to create the texture, and after changing the implementation to use glTexImage2D instead, the texture renders without any of the odd aliasing artifacts. Thanks for your input.
~Sqeezy
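The swap described above, from gluBuild2DMipmaps to glTexImage2D, might look like the following sketch. The textureId and pixels names are placeholders, GL_RGB/GL_UNSIGNED_BYTE is an assumed pixel format, and a context with non-power-of-two texture support (OpenGL 2.0+ or ARB_texture_non_power_of_two) is assumed:

```cpp
// Upload the image at its native 640x480 size -- no power-of-two resampling.
glBindTexture(GL_TEXTURE_2D, textureId);
glTexImage2D(GL_TEXTURE_2D,
             0,                  // base mip level only; no mipmap chain
             GL_RGB,             // internal format (assumption)
             640, 480,           // native texture dimensions
             0,                  // no border
             GL_RGB, GL_UNSIGNED_BYTE, pixels);

// With no mipmaps uploaded, the min filter must not be a *_MIPMAP_* mode,
// or the texture is incomplete and won't sample correctly.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```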
