jorgander

OpenGL 1D/2D/3D textures and which one "wins"


I have a problem with textures of different dimensionalities and which one OpenGL chooses to apply. At the start of my program two textures are loaded, one 1D and one 2D. The relevant OpenGL calls I make before rendering my objects are glEnable(GL_TEXTURE_1D), glEnable(GL_TEXTURE_2D), glTexCoordPointer(1, GL_FLOAT, ...), and glBindTexture(GL_TEXTURE_1D, ...). So, pseudo code for the whole thing is
LoadTexture1D();
LoadTexture2D();
...
glEnable(GL_TEXTURE_1D);
glEnable(GL_TEXTURE_2D);
...
glBindTexture(GL_TEXTURE_1D, 1Dtexture);
glTexCoordPointer(1, GL_FLOAT, vsize, offset);
...
RenderMesh();


Here is part of the code in LoadTextureND():
...
glGenTextures(1, (unsigned int *)(&(p_clsTexture.m_ptrObject)));
if ( uintHeight > 1 )
{
    p_clsTexture.m_uchrDimensions = 2;
    glBindTexture(GL_TEXTURE_2D, (unsigned int)(p_clsTexture.m_ptrObject));
    if ( glIsTexture((unsigned int)(p_clsTexture.m_ptrObject)) != GL_TRUE )
    {
        glDeleteTextures(1, (unsigned int *)(&(p_clsTexture.m_ptrObject)));
        delete [] ptrTextureData;
        return false;
    }

    GLenum enmFormats[] = {0, GL_ALPHA, 0, GL_BGR_EXT, GL_BGRA_EXT};
    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 (int)(ushtChannels),
                 uintWidth,
                 uintHeight,
                 0,
                 enmFormats[ushtChannels],
                 GL_UNSIGNED_BYTE,
                 ptrTextureData);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
else
{
    p_clsTexture.m_uchrDimensions = 1;
    glBindTexture(GL_TEXTURE_1D, (unsigned int)(p_clsTexture.m_ptrObject));
    if ( glIsTexture((unsigned int)(p_clsTexture.m_ptrObject)) != GL_TRUE )
    {
        glDeleteTextures(1, (unsigned int *)(&(p_clsTexture.m_ptrObject)));
        delete [] ptrTextureData;
        return false;
    }

    GLenum enmFormats[] = {0, GL_ALPHA, 0, GL_BGR_EXT, GL_BGRA_EXT};
    glTexImage1D(GL_TEXTURE_1D,
                 0,
                 (int)(ushtChannels),
                 uintWidth,
                 0,
                 enmFormats[ushtChannels],
                 GL_UNSIGNED_BYTE,
                 ptrTextureData);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}
...


My question is, how come the 2D texture is applied to the mesh even though I have the 1D texture bound (which implies the 2D texture is NOT bound), and the VBO has 1 texture coordinate? The only way I can get it to apply the correct texture is if I explicitly disable 2D with glDisable(GL_TEXTURE_2D). But why should I have to do that?

From Section 3.8.16 - Texture Application of the OpenGL Specs
Quote:
Texturing is enabled or disabled using the generic Enable and Disable commands,
respectively, with the symbolic constants TEXTURE_1D, TEXTURE_2D,
TEXTURE_3D, or TEXTURE_CUBE_MAP to enable the one-, two-, three-dimensional,
or cube map texture, respectively. If both two- and one-dimensional textures are
enabled, the two-dimensional texture is used. If the three-dimensional and either
of the two- or one-dimensional textures is enabled, the three-dimensional texture
is used. If the cube map texture and any of the three-, two-, or one-dimensional
textures is enabled, then cube map texturing is used.


EDIT:
Quote:
Originally posted by jorgander
My question is, how come the 2D texture is applied to the mesh even though I have the 1D texture bound (which implies the 2D texture is NOT bound)
That implies only that the 1D texture target binding was changed from the default 1D texture.

There's always a texture bound to each texture target; by default these are the default 1D, 2D, 3D, and cubemap textures (all of which are treated as texture objects with texture name 0). If you bind a new 1D texture object to the 1D target and a 2D texture object to the 2D target, the default 3D and cubemap textures are still bound to their respective targets. If you then change the 1D texture target binding again, the 2D texture target still has the same 2D texture object bound to it as it previously did.

Now, with the above quote from the specs, you can see why binding a 2D texture to the 2D target then binding a 1D texture to the 1D target, and while having both 1D and 2D texturing enabled, will still use the 2D texture object.

[Edited by - Kalidor on March 6, 2008 12:17:27 PM]

Thanks, that helps clear it up a little. It's still confusing to me, though. As I understood it, aside from multi-texturing, OpenGL maintains one active texture, the bound texture, set with glBindTexture; any time you call glBindTexture it replaces the currently bound texture. What I'm thinking now is that OpenGL maintains a currently bound texture per target, which explains why you can do things like:

glEnable(GL_TEXTURE_2D);
glEnable(GL_TEXTURE_1D);
glBindTexture(GL_TEXTURE_2D, ...);
glBindTexture(GL_TEXTURE_1D, ...);
...
RenderMesh(); // <- 2D texture is applied here

EDIT:

I was posting this reply at the same time you were editing your post. I think I understand now, thanks for the reply.

The order is cube map -> 3D -> 2D -> 1D.

Thus if you have a cube map and a 2D texture both enabled and bound on the same unit, the cube map will be used.
In GLSL you specify which texture you use via the sampler type (glEnable(GL_TEXTURE_XXX) is ignored anyway).
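For example, in a fragment shader the declared sampler type alone selects the target. A minimal sketch in GLSL 1.x style to match the era of this thread (the uniform name is made up for illustration):

```glsl
// The sampler's declared type, not glEnable, decides which
// texture target is sampled in GLSL.
uniform sampler2D colorMap;   // reads whatever is bound to the 2D target

void main()
{
    gl_FragColor = texture2D(colorMap, gl_TexCoord[0].st);
}
```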

