AfroFire

OpenGL Simple Texturing Gone Awry


Hey guys, I'm using SDL with the SDL_image library, and I am attempting to texture a single quad. Initially it worked just fine with the code I am about to post; however, that was before I restructured my classes to use inheritance. For example, I have an APPLICATION class and an APP_CON class:
class APPLICATION : public APP_CON
{
public:
   APPLICATION() : APP_CON()
   {
      // blah blah
   }

   // blah blah
};

and inside of APP_CON:
class APP_CON
{
   public:
       APP_CON()
       {
             // blah blah
             loadBg();
            // blah blah
        }

       void loadBg()
       {
          bg_img[0] = IMG_Load("bg.png");
          if(!bg_img[0])
          {
             cout << "Couldn't load bg.png: " << IMG_GetError() << endl;
             exit(1);
          }

          glGenTextures(1, &texName);
          // glGetError() clears the error flag, so read it once into a variable
          GLenum err = glGetError();
          if(err == GL_INVALID_VALUE)
          {
             cout << err << ": GenTexture failed! Invalid Value." << endl;
          }
          glBindTexture(GL_TEXTURE_2D, texName);
          err = glGetError();
          if(err == GL_INVALID_ENUM)
          {
             cout << err << ": BindTexture failed! Invalid Enum - target not valid." << endl;
          }
          else if(err == GL_INVALID_OPERATION)
          {
             cout << err << ": BindTexture failed! Invalid operation - texture dimensionality doesn't match target." << endl;
          }

          cout << texName << ": Applying texture Parameters." << endl;

          glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
          glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);


          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, bg_img[0]->w, bg_img[0]->h, 0, GL_RGB, GL_UNSIGNED_BYTE, bg_img[0]->pixels);      
       }

       void con_drawFrame()
       {
         glEnable(GL_TEXTURE_2D);

         if(hidden)
         {
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();                // reset first; gluOrtho2D multiplies the current matrix
            gluOrtho2D(0, width, height, 0);
       
            glColor3f(bg_color[0], bg_color[1], bg_color[2]);
            glBindTexture(GL_TEXTURE_2D, texName);
            GLenum err = glGetError();
            if(err == GL_INVALID_ENUM)
            {
               cout << err << ": BindTexture failed! Invalid Enum - target not valid." << endl;
            }
            else if(err == GL_INVALID_OPERATION)
            {
               cout << err << ": BindTexture failed! Invalid operation - texture dimensionality doesn't match target." << endl;
            }

            glBegin(GL_QUADS);
    
              glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, 1.0f);
              glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f, 0.0f);
              glTexCoord2f(1.0f, 1.0f); glVertex2f(1.0f, 0.0f);
              glTexCoord2f(1.0f, 0.0f); glVertex2f(1.0f, 1.0f);

            glEnd();

            glMatrixMode(GL_MODELVIEW);
         }
         glDisable(GL_TEXTURE_2D);
       }
};

So what I do with this code is create a CLIENT class that inherits APPLICATION, which inherits APP_CON. When I compile and run it, all I get is a white quad. However, before, when I had APP_CON as a member of APPLICATION, it compiled and displayed my quad just fine, and I am a bit confused as to why this wouldn't work. Is it some sort of memory issue? I don't think it could be that, because OpenGL returns GL_INVALID_OPERATION on glBindTexture, and my reference manual says "GL_INVALID_OPERATION is generated if texture has a dimensionality that doesn't match that of target." It's just an unsigned int being matched against an enum; what dimensionality could I possibly be missing, especially when any information it holds is generated by glGenTextures(...)? Anyone have any ideas? I'm new to this whole thing, but I've been trying hard to figure it out on my own.

You must have a valid OpenGL context before you can perform OpenGL operations. It is likely that your APPLICATION class has a member which creates your OpenGL context in its constructor (I assume that's where the window creation code is?). If so, then when you combined APPLICATION with APP_CON using composition (APPLICATION has-a APP_CON), the APPLICATION constructor first created the object which creates your OpenGL context, and only then created the APP_CON object, which needs the OpenGL context.

When you combine APPLICATION with APP_CON using inheritance (APPLICATION is-a APP_CON), the APP_CON constructor, which requires the OpenGL context, runs before the members of APPLICATION, one of which creates the OpenGL context, are constructed.
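To make the ordering concrete, here is a minimal sketch (the GLContext class and the print statements are illustrative stand-ins, not from your code) showing that a base class constructor always runs before any of the derived class's members are constructed:

#include <iostream>

struct GLContext          // hypothetical stand-in for whatever member
{                         // creates your window + OpenGL context
   GLContext() { std::cout << "OpenGL context created" << std::endl; }
};

struct APP_CON
{
   APP_CON() { std::cout << "loadBg() runs here - needs a context!" << std::endl; }
};

struct APPLICATION : public APP_CON   // inheritance: APPLICATION is-a APP_CON
{
   GLContext context;                 // constructed AFTER the APP_CON base
};

int main()
{
   APPLICATION app;
   // prints:
   //   loadBg() runs here - needs a context!
   //   OpenGL context created
   // i.e. APP_CON's constructor fires before the context exists.
}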

Looking at the code you posted, composition seems a much more sensible approach than inheritance: an application has a background, rather than an application is a background.
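With composition the declaration order of the members controls construction order, so you just have to declare the context-creating member before the APP_CON member. A sketch, reusing the hypothetical GLContext from above:

struct APPLICATION                 // composition: APPLICATION has-a APP_CON
{
   GLContext context;              // constructed first: context now exists
   APP_CON   console;              // constructed second: loadBg() is safe
};

Note that members are constructed in the order they are declared in the class, not the order they appear in the constructor's initializer list, so the declaration order above is what matters.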

Enigma

Enigma, thank you very much.

I see now what you mean, and I was actually thinking exactly that: there is no reason for it to be inherited. But before I just gave up, I wanted to find out WHY it wasn't working; I simply didn't step back far enough to follow the program logically.

Thank you very much.

