blackirishman

Member
  • Content Count: 6
  • Joined
  • Last visited

Community Reputation: 108 Neutral

About blackirishman

  • Rank
    Newbie
  1. blackirishman

    Texture2D Texture is not properly displaying

    I have traced my problem to my shader-loading code. I used the OpenGL SuperBible toolkit to load the same shaders and that worked, so I will update when I know more.
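    Below is a minimal sketch, not the actual loader from this project, of the kind of compile-status check (glGetShaderiv / glGetShaderInfoLog) that usually makes this sort of shader-loading problem visible; the helper name compileShader is made up for the illustration.

        /* Minimal sketch: compile one shader stage and print the driver's log on failure. */
        #include <stdio.h>
        #include <OpenGL/gl.h>   /* macOS header; <GL/gl.h> plus an extension loader elsewhere */

        static GLuint compileShader(GLenum type, const char *source)
        {
            GLuint shader = glCreateShader(type);
            glShaderSource(shader, 1, &source, NULL);   /* one NUL-terminated string */
            glCompileShader(shader);

            GLint status = GL_FALSE;
            glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
            if (status != GL_TRUE) {
                char log[1024];
                glGetShaderInfoLog(shader, sizeof(log), NULL, log);
                fprintf(stderr, "Shader compile failed:\n%s\n", log);
                glDeleteShader(shader);
                return 0;
            }
            return shader;
        }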
  2. blackirishman

    Texture2D Texture is not properly displaying

    Keeping GL_TEXTURE_2D enabled had no effect.
  3. blackirishman

    Texture2D Texture is not properly displaying

    I am passing two textures to a shader to perform some operations on the images. Right now my textures appear as flat colors. I was thinking that my texture coordinates shouldn't have been normalized, but that doesn't appear to be the case.

    Here is the code I wrote to draw the textures, followed by my texture generation code:

        // Set the current context to the one given to us.
        CGLSetCurrentContext(glContext);

        // Attach textures.
        textureWidth  = 960.0;
        textureHeight = 540.0;

        glViewport(0, 0, textureWidth, textureHeight);

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(-1, 1, -1, 1, -20., 20.);

        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        GLfloat texMatrix[16] = {0};
        GLint   saveMatrixMode;

        texMatrix[0]  = (GLfloat)textureWidth;
        texMatrix[5]  = (GLfloat)textureHeight;
        texMatrix[10] = 1.0;
        texMatrix[13] = 1.0;
        texMatrix[15] = 1.0;

        glGetIntegerv(GL_MATRIX_MODE, &saveMatrixMode);
        glMatrixMode(GL_TEXTURE);
        glPushMatrix();
        glLoadMatrixf(texMatrix);
        glMatrixMode(saveMatrixMode);

        glUseProgram(shaderProgram);

        GLint iTransform, iTextureUnit0, iTextureUnit1;

        iTransform = glGetUniformLocation(shaderProgram, "mvpMatrix");
        glUniformMatrix4fv(iTransform, 1, GL_FALSE, texMatrix);

        iTextureUnit0 = glGetUniformLocation(shaderProgram, "textureUnit0");
        glUniform1i(iTextureUnit0, 0);
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, _textureA);

        iTextureUnit1 = glGetUniformLocation(shaderProgram, "textureUnit1");
        glUniform1i(iTextureUnit1, 1);
        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, _textureB);

        // Draw textured quad.
        glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 1.0f);
        glVertex3f(-1.0f, -1.0f, 1.0f);
        glTexCoord2f(1.0f, 1.0f);
        glVertex3f( 1.0f, -1.0f, 1.0f);
        glTexCoord2f(1.0f, 0.0f);
        glVertex3f( 1.0f, 1.0f, 1.0f);
        glTexCoord2f(0.0f, 0.0f);
        glVertex3f(-1.0f, 1.0f, 1.0f);
        glEnd();

        glGetIntegerv(GL_MATRIX_MODE, &saveMatrixMode);
        glMatrixMode(GL_TEXTURE);
        glPopMatrix();
        glMatrixMode(saveMatrixMode);

    Here is my texture setup code:

        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, internalformat, (GLsizei)width, (GLsizei)height, 0, format, type, data);
        glBindTexture(GL_TEXTURE_2D, 0);
        glDisable(GL_TEXTURE_2D);
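    The shader itself is not included in the post. Purely as an illustration, a pass-through fragment shader over the two units bound above might look like the following C string; the uniform names match the glGetUniformLocation calls, but the 50/50 blend and everything else are assumptions.

        /* Illustrative only -- not the shader from the post. Samples both texture
           units named in the draw code above and blends them 50/50. */
        static const char *fragmentSrc =
            "#version 120\n"
            "uniform sampler2D textureUnit0;\n"
            "uniform sampler2D textureUnit1;\n"
            "void main()\n"
            "{\n"
            "    vec4 a = texture2D(textureUnit0, gl_TexCoord[0].st);\n"
            "    vec4 b = texture2D(textureUnit1, gl_TexCoord[0].st);\n"
            "    gl_FragColor = mix(a, b, 0.5);\n"
            "}\n";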
  4. blackirishman

    Using a 12bit texture

    I successfully loaded my 12-bit texture, but one thing is unclear. I am only using 12 bits of each 16-bit short, and I don't understand how glTexImage2D knows that I want to scale those 12 bits across 16 bits.

        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_SHORT, imageData);
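    For what it is worth, with GL_UNSIGNED_SHORT the driver knows nothing about the 12 bits; it simply divides each 16-bit component by 65535. A small sketch of that arithmetic, assuming the 12-bit sample sits in the high bits with the low 4 bits zero:

        /* Normalization GL applies to GL_UNSIGNED_SHORT components: value / 65535.
           With the 12-bit sample in the top bits (low 4 bits zero), full scale
           lands just under 1.0 rather than exactly on it. */
        unsigned short raw12   = 0x0FFF;             /* maximum 12-bit sample        */
        unsigned short stored  = raw12 << 4;         /* 0xFFF0 as it sits in memory  */
        float          sampled = stored / 65535.0f;  /* ~0.99976                     */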
  5. blackirishman

    Using a 12bit texture

    My format is packed 'XYZ 4:4:4, 36 bpp, (msb) 12X, 12Y, 12Z (lsb), the 2-byte value for each X/Y/Z is stored as little-endian, the 4 lower bits are set to 0'. As these are unsigned integers, they would be normalized values, correct?
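    Assuming that layout, here is a sketch of how one component decodes; since the data is unsigned and uploaded with GL_UNSIGNED_SHORT, it is indeed treated as a normalized value in [0, 1].

        #include <stdint.h>

        /* Sketch: two little-endian bytes form a 16-bit value whose top 12 bits
           carry the sample; GL treats the whole 16-bit value as unsigned
           normalized, i.e. value / 65535. */
        static float decodeComponent(const uint8_t *p)
        {
            uint16_t v = (uint16_t)(p[0] | (p[1] << 8));  /* little-endian read */
            return v / 65535.0f;
        }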
  6. blackirishman

    Using a 12bit texture

    I want to upload RGB data to a texture. Each component is 12 bits stored in 2 bytes, with the 4 lower bits set to 0. First, is this possible to do? I have been reading the documentation for glTexImage2D. The documentation says that I can specify GL_RGB12 as an internal format, but when I look at the types I can specify for the pixel data, I see everything from GL_UNSIGNED_BYTE to GL_UNSIGNED_INT_2_10_10_10_REV. I assume that a pixel is a complete RGB triple, so would it follow that if you specify GL_UNSIGNED_SHORT, you would be storing your RGB components across 16 bits? That doesn't sound correct.
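    A hedged sketch of the upload being described (width, height and imageData are assumed names): except for the packed GL_UNSIGNED_* types, the type parameter describes each component rather than the whole pixel, so GL_RGB plus GL_UNSIGNED_SHORT means three 16-bit shorts per texel, while GL_RGB12 only asks the driver to keep at least 12 bits per channel internally.

        glTexImage2D(GL_TEXTURE_2D,
                     0,                  /* mip level                              */
                     GL_RGB12,           /* requested internal precision           */
                     width, height,
                     0,                  /* border (must be 0)                     */
                     GL_RGB,             /* component order of the client data     */
                     GL_UNSIGNED_SHORT,  /* size of EACH component, not the pixel  */
                     imageData);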