Problem with glBindTexture

Hey man,

It's been several months since I've looked at any OpenGL code, and my code's at home while I'm in a lab. That said, lemme give it a try.

The error message says that your second parameter in that call, which is textureName, is an unsigned int *, while what it wants is a GLuint. I don't see your definition of textureName anywhere, but I'm guessing it's a global array or STL vector of unsigned ints. OpenGL provides its own number types (e.g. GLuint) that you should get used to using instead of the built-in types whenever you're interfacing with OpenGL. (Someone else may be able to explain why better than I can.)

Try defining textureName as a vector (or array) of GLuint instead of unsigned int pointers. And stop using globals! =) They cause confusion like this. <3

Good luck.
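
Something like this is what I have in mind (I can't see your definition of textureName, so the name and the element count below are just guesses on my part):

#include <vector>
#include <GL/gl.h>   // or whatever GL header your platform provides

// Store the texture names themselves, using OpenGL's own type,
// instead of a vector of unsigned int pointers.
std::vector<GLuint> textureName(3, 0);   // 3 is just a placeholder count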
Why is your vector of texture names holding GLuint* values? Just make it a std::vector<GLuint>. In your current code, you initialize the vector to hold a bunch of null pointers, then you pass those invalid pointers to glGenTextures. That's not good. glGenTextures takes a pointer to the location where it should write the texture name, and that location had better be valid. Making this change will fix your error message, too.

After you change the vector to hold plain GLuint, you can use glGenTextures(1, &textureName[x]); (the first parameter is how many texture names to generate; you currently pass 0, which generates nothing).
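
Rough sketch of how that could look once the vector holds GLuint (textureName and numTextures are placeholder names, and I'm leaving out the actual glTexParameteri/glTexImage2D calls):

#include <cstddef>
#include <vector>
#include <GL/gl.h>   // or whatever GL header your platform provides

std::vector<GLuint> textureName;   // holds GLuint values, not pointers

void createTextures(std::size_t numTextures)
{
    textureName.resize(numTextures, 0);
    for (std::size_t x = 0; x < textureName.size(); ++x)
    {
        glGenTextures(1, &textureName[x]);            // generate 1 name into a valid GLuint
        glBindTexture(GL_TEXTURE_2D, textureName[x]); // bind it before uploading image data
        // glTexParameteri(...) / glTexImage2D(...) would go here
    }
}

You could also generate them all in one call with glGenTextures((GLsizei)textureName.size(), &textureName[0]); the one-at-a-time loop just mirrors what you already have.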
