Archived

This topic is now archived and is closed to further replies.

Simple Texture Question


Recommended Posts

I am creating an OpenGL texture out of an array of chars. When I render the image using a quad, all I get is a white quad, not a textured one. If I use glDrawPixels instead, it works fine. Any thoughts on why the texture is not showing up? Thanks. Here are the creation & render snippets:

// generate a 2D texture image from an array
pBitmap->w = (int)array[0];  // first 2 entries hold width & height
pBitmap->h = (int)array[1];
pBitmap->pixels = (unsigned char*)&array[2];

glGenTextures( 1, &(pBitmap->texId) );
glBindTexture( GL_TEXTURE_2D, pBitmap->texId );
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, pBitmap->w, pBitmap->h, 0, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );

// draw a textured quad
glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, pBitmap->texId );
glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );
glBegin( GL_QUADS );
    glTexCoord2f( 0, 0 ); glVertex2i( x, y );      // bottom left
    glTexCoord2f( 0, 1 ); glVertex2i( x, y+h );    // top left
    glTexCoord2f( 1, 1 ); glVertex2i( x+w, y+h );  // top right
    glTexCoord2f( 1, 0 ); glVertex2i( x+w, y );    // bottom right
glEnd();
glDisable( GL_TEXTURE_2D );
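For anyone chasing the same white-quad symptom: a minimal diagnostic sketch is to drain glGetError() after each texture call, since glTexImage2D fails silently; on GL 1.x it also rejects non-power-of-two dimensions with GL_INVALID_VALUE. The helper name below is made up for illustration:

#include <stdio.h>
#include <GL/gl.h>

// hypothetical helper: print any pending GL errors
static void checkGLErrors( const char *where )
{
    GLenum err;
    while ( (err = glGetError()) != GL_NO_ERROR )
        printf( "GL error 0x%04X after %s\n", err, where );
}

// usage, right after the upload:
//   glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, ... );
//   checkGLErrors( "glTexImage2D" );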

I honestly do not mean to seem crude or patronising, but I can see from your code that the first problem may be with your C++ semantics. For example, what is

pBitmap->pixels = (unsigned char*)&array[2];

supposed to be doing? You appear to be casting the address of the array element to a pointer to unsigned char and storing that value in the pixels field of pBitmap. Is this giving you any warnings?

Regards,
Mathematix.

[edited by - mathematix on June 24, 2002 4:59:17 PM]

The (unsigned char*) cast is really unnecessary: array[] is already an array of unsigned chars that contains the pixel data (RGB), prepended with the width & height.

So:
array[0] is the width
array[1] is the height

array[2] is the first pixel's red component
array[3] is the green
array[4] is the blue
...

Casting the address of array[2] to a pointer to unsigned char is valid syntax.
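To make the layout concrete, here is a small sketch of how the fields map onto the array. The Bitmap struct is assumed from the snippets in this thread, not taken from real code:

#include <GL/gl.h>

// hypothetical struct inferred from the posted snippets
typedef struct
{
    int            w, h;    // image dimensions
    unsigned char *pixels;  // points at the first red byte
    GLuint         texId;   // GL texture object name
} Bitmap;

// array = { width, height, R0, G0, B0, R1, G1, B1, ... }
void attachPixels( Bitmap *pBitmap, unsigned char *array )
{
    pBitmap->w      = (int)array[0];  // note: a single byte caps this at 255
    pBitmap->h      = (int)array[1];
    pBitmap->pixels = &array[2];      // no cast needed: already unsigned char*
}

One thing worth noticing: if array really is unsigned char, the width and height entries can only hold values up to 255.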

The code that does work looks like this:
( x, y, and pBitmap are the passed-in arguments )

glRasterPos2f( (float)x+.5, (float)y + (float)pBitmap->h + .5);
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glDrawPixels( pBitmap->w, pBitmap->h, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );

I want to be able to scale the images, so drawing a quad is a little more flexible than drawing the pixels directly.
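As an aside, glDrawPixels output can also be scaled without a quad via glPixelZoom, though it only replicates or skips pixels (no filtering). A sketch, assuming the same pBitmap fields and hypothetical sx/sy scale factors:

// scaled blit; glPixelZoom affects all subsequent glDrawPixels calls
glRasterPos2f( (float)x + 0.5f, (float)y + (float)pBitmap->h + 0.5f );
glPixelZoom( sx, sy );  // e.g. 2.0f, 2.0f to double the size
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glDrawPixels( pBitmap->w, pBitmap->h, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );
glPixelZoom( 1.0f, 1.0f );  // restore the default

A textured quad is still the more flexible route once it works, since it also gives rotation and filtering for free.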



The images are 64x64 TGAs. I am using NeHe's TGA loading routine to load the textures. None of the OpenGL commands are returning any errors. Everything looks like it's going OK during the texture setup phase.

Are there any GL enable/disable or query commands I can try to help determine the issue?

Thanks.

By the way, I switched from the built-in arrays to the TGAs to see if that helped, but I get the same thing.
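A hedged checklist of state queries that could narrow this down; these are all stock GL 1.1 calls, but which one (if any) exposes the problem is a guess, and the function name is made up:

#include <stdio.h>
#include <GL/gl.h>

// hypothetical debug dump, to be called right before glBegin( GL_QUADS )
void dumpTextureState( GLuint texId )
{
    printf( "GL_TEXTURE_2D enabled: %d\n", glIsEnabled( GL_TEXTURE_2D ) );
    printf( "texture name valid:    %d\n", glIsTexture( texId ) );

    GLint bound = 0;
    glGetIntegerv( GL_TEXTURE_BINDING_2D, &bound );
    printf( "bound texture: %d (expected %u)\n", bound, texId );

    GLint w = 0;
    glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &w );
    printf( "level-0 width as GL sees it: %d (0 means the upload failed)\n", w );
}

If GL_TEXTURE_WIDTH comes back 0, glTexImage2D never accepted the data; if the bound name differs from texId, the draw code is binding a different (or stale) texture.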

[edited by - maxsteel on June 25, 2002 2:25:50 PM]

Guest Anonymous Poster
Are both of your snippets in the same function? More specifically, are they at the same scope with regard to pBitmap? Make sure you're not using two completely different pBitmaps, so that when it comes time to draw the quad you're pointing at a completely different spot.
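A tiny sketch of the pitfall being described; everything here is made up for illustration, reusing the Bitmap struct assumed earlier in the thread:

// BUG: a local pBitmap shadows the one the texture was created on
Bitmap g_bitmap;             // texId was filled in by glGenTextures at setup

void drawWrong( void )
{
    Bitmap  fresh = { 0 };   // never uploaded, texId == 0
    Bitmap *pBitmap = &fresh;
    glEnable( GL_TEXTURE_2D );
    glBindTexture( GL_TEXTURE_2D, pBitmap->texId );  // binds object 0
    // object 0 has no image here, so it is incomplete, texturing is
    // effectively off, and the quad is drawn flat with the current color
}

Binding the wrong (or zero) texture name leaves an incomplete texture bound, and an incomplete texture disables texturing for that draw, which is exactly the flat white quad described above.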
