#### Archived

This topic is now archived and is closed to further replies.

# Texture Mapping Not Showing

## Recommended Posts

##### Share on other sites
Btw, the "nt" in the function return type is actually "int"; it's just a typo.

##### Share on other sites
Switching the image to a 512x512 bitmap seems to have resolved this issue. Could anyone explain to me why this is the case? Just would like to know for future reference. Thanks!

##### Share on other sites
Are you using double-buffering?
If so, you need to stick SwapBuffers(wglGetCurrentDC());
as the last line in your rendering function.

##### Share on other sites
OpenGL can only use textures whose width and height are both powers of 2.

##### Share on other sites
Unless you use advanced extensions, all textures need widths and heights that are powers of two (plus possibly two pixels for a border). glTexImage2D() will raise an error (reported through glGetError()) when you try to allocate any other width or height, so saying you didn't get any errors suggests you didn't check for them correctly.

When I program OpenGL, I put this code after EACH OpenGL call:

assert( !glGetError() );

I also #include <assert.h> at the top of my source files. This ensures that, if I do something wrong, I'm told about it as soon as possible.

The width and height don't need to be the same, but each needs to come from the following set:

1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024 (possibly 2048 or 4096 for very new cards)

You can test that a width/height is OK by using

bool IsTextureWidthAndHeightOk( int width, int height )
{
    // Each dimension must be a positive power of two.
    return width > 0 && height > 0
        && !(width & (width - 1))
        && !(height & (height - 1));
}

##### Share on other sites
Thanks for the clarification as well as the error checking routines. Hopefully one day I will be able to give something back to the community. Thanks again for everything.
