texture problem

Started by gbook2; 6 comments, last by gbook2 16 years, 3 months ago
I'm drawing a 2D texture and am having some difficulty getting the correct result. I'm trying to draw a checkerboard (or anything, actually) using the following code, and all I get is a mix of colors... so the code below shows up as a big gray square. If I change the values, it seems to show an average of the colors, but I don't see the individual pixels I'm expecting.

int w = canvasSize.GetWidth();
int h = canvasSize.GetHeight();
GLfloat size_ = 1.0;
GLfloat aspect = (GLfloat) h / (GLfloat) w;

/* setup viewport, etc */
glViewport(0, 0, (GLsizei) w, (GLsizei) h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-size_, size_, -size_*aspect, size_*aspect, -size_, size_);
glMatrixMode(GL_MODELVIEW);

glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glShadeModel(GL_SMOOTH);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

unsigned char buffer[16];
buffer[0] = 255;  buffer[1] = 0;   buffer[2] = 255;  buffer[3] = 0;
buffer[4] = 255;  buffer[5] = 0;   buffer[6] = 255;  buffer[7] = 0;
buffer[8] = 255;  buffer[9] = 0;   buffer[10] = 255; buffer[11] = 0;
buffer[12] = 255; buffer[13] = 0;  buffer[14] = 255; buffer[15] = 0;

glGenTextures(1, &renderTexture);
glBindTexture(GL_TEXTURE_2D, renderTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 4, 4, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);

/* bind the textures */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, renderTexture);

glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(-0.5, -0.5, -0.5);
glTexCoord2f(1.0, 0.0); glVertex3f( 0.5, -0.5, -0.5);
glTexCoord2f(1.0, 1.0); glVertex3f( 0.5,  0.5, -0.5);
glTexCoord2f(0.0, 1.0); glVertex3f(-0.5,  0.5, -0.5);
glEnd();
glDisable(GL_TEXTURE_2D);

What might be going on?

-Greg
Since your buffer is only 16 bytes long, your 2D image is actually only 2x2. So, you will want to change the image width and height from 4x4 to 2x2.

Change your luminance formats to RGBA formats. Luminance, I believe, provides a grayscale image.

Since you set the texture to repeat, you can provide values greater than 1 to get the checkerboard look you are wanting.
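For example, a quick sketch (assuming the texture from your code is bound, with both wrap modes set to GL_REPEAT) that tiles it four times in each direction:

glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, -0.5f);  /* coords > 1.0 repeat */
glTexCoord2f(4.0f, 0.0f); glVertex3f( 0.5f, -0.5f, -0.5f);
glTexCoord2f(4.0f, 4.0f); glVertex3f( 0.5f,  0.5f, -0.5f);
glTexCoord2f(0.0f, 4.0f); glVertex3f(-0.5f,  0.5f, -0.5f);  /* 4x4 tiling of the image */
glEnd();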

Hope this helps.
I do want a grayscale image, so GL_LUMINANCE is what I want. In that case, the sizes should be OK at 4x4 = 16 pixels. I've done this before in another part of the program, but I must have missed something when I copied the code; what's missing is the question...
I'm wondering what GL commands (or lack of commands) would cause the texture to be a solid color.
The only other thing I could think of is if you have lighting turned on and no source of light. The code that you have looks fine.
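If you want to rule that out quickly, a sketch (assuming a fixed-function context):

/* Lighting with no light source can wash textured geometry out to a
   flat color; disable it before drawing the 2D quad to test. */
if (glIsEnabled(GL_LIGHTING))
    glDisable(GL_LIGHTING);  /* or enable a light instead: glEnable(GL_LIGHT0); */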
As Deception666 said, GL_LUMINANCE16 specifies that your data is 16-bit grayscale. You specify your image using 16 bytes, that is, 8 16-bit words. 8 pixels does not match a 4x4 image (16 pixels). Instead, make a 2x2 image (4 pixels).

Change your data to this:
unsigned char buffer[8] = { 255, 255, 0, 0, 0, 0, 255, 255 };


This will be interpreted as 4 pixels: 0xFFFF, 0x0000, 0xFFFF, and 0x0000.

Then replace
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 4, 4, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);


with:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 2, 2, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, buffer);


I haven't tried this code, so I don't know if I missed something.
Quote: Original post by gardin
As Deception666 said, GL_LUMINANCE16 specifies that your data is a 16-bit grayscale. You specify your image using 16 bytes, that is 8 words (16-bit). 8 pixels does not match a 4*4 image (16 pixels). Instead, make a 2*2 image (4 pixels)

You're mixing up internal and external data formats. He's specifying GL_LUMINANCE16 for the INTERNAL format, which tells OpenGL to store the data internally as 16 bits per texel. The source data is described by the last three parameters. He's passing an array of luminance data with one byte per texel, so GL_LUMINANCE and GL_UNSIGNED_BYTE are the correct parameters.
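So for the 4x4, one-byte-per-texel image in the original post, the upload was already self-consistent. A sketch of the two independent halves of the call (same data as the original, just rearranged into a real checker pattern):

/* Internal format = how GL stores the texels; format/type = how the
   source array is laid out. The two are independent. */
unsigned char buffer[16] = { 255,   0, 255,   0,
                               0, 255,   0, 255,
                             255,   0, 255,   0,
                               0, 255,   0, 255 };  /* 16 one-byte texels */
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_LUMINANCE16,                  /* internal: 16-bit luminance */
             4, 4, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE,  /* external: 1 byte per texel */
             buffer);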
To disable color averaging on such small textures, use GL_NEAREST instead of GL_LINEAR for the magnification filter:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

Well, it turns out there were a couple of problems. The first was that I needed to add this line before the final texture binding:

glActiveTexture(GL_TEXTURE0);

And I needed to delete the texture at the end of the drawing:

glActiveTexture(GL_TEXTURE0);
glDeleteTextures(1,&renderTexture);
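So the draw path ends up looking roughly like this (my own summary sketch, drawing elided):

glActiveTexture(GL_TEXTURE0);        /* make sure unit 0 is the active unit */
glBindTexture(GL_TEXTURE_2D, renderTexture);
/* ... draw the textured quad ... */
glActiveTexture(GL_TEXTURE0);
glDeleteTextures(1, &renderTexture); /* release the texture when done */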

It turns out the internal texture storage format and the texture data type don't have to match. The internal format only describes how OpenGL should STORE the data in the texture, whereas the data type describes what kind of data you are putting INTO the texture.
So if I specified GL_RGB as the internal format, I could still pass GL_LUMINANCE data. In this example, if I had a 4x4 array of bytes, each one a luminance value, and an internal format of GL_RGB, then the red, green, and blue values would all be equal: it would just store the luminance value in each color channel.
Internal texture formats DO become important if you are using textures in a fragment shader, or if your internal format has less precision than your texture datatype.
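For instance (a sketch, not tested): uploading the same one-byte luminance buffer with an RGB internal format would give equal red, green, and blue per texel:

glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGB,                          /* stored internally as RGB */
             4, 4, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE,  /* source: 1-byte luminance */
             buffer);                         /* R = G = B = source value */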

