Afternoon all,
I have a class called Texture that I need to initialize once on my program's startup with a texture, and then at runtime replace that texture with one of the user's choosing from their HDD. Initializing the Texture works fine, and the initial texture displays as it should: glGenTextures gives it a name, and the pixel data is bound under that name. That happens in the following functions:
Texture::Texture(string filename)
{
    const char* fnPtr = filename.c_str(); // our image loader takes a char pointer, not a string
    lodepng::load_file(buffer, fnPtr); // load the file into a buffer
    unsigned error = lodepng::decode(image, w, h, buffer); // decode the buffer's pixel data into the image vector
    // report any decoder errors
    if(error)
    {
        cout << "\ndecoder error " << error << ": " << lodepng_error_text(error) << endl;
    }
    // these throw exceptions if the image's dimensions are unsuitable
    checkPOT(w);
    checkPOT(h);
    // image now contains our pixel data, all ready for OpenGL to do its thing
    // let's get this texture up in video memory
    texGLInit();
    Draw_From_Corner = CENTER;
}
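(checkPOT isn't shown above; it's just the guard that throws when a dimension isn't usable. A minimal sketch, assuming it takes the unsigned dimension lodepng returns and rejects anything that isn't a non-zero power of two:)

```cpp
#include <stdexcept>

// Hypothetical sketch of checkPOT: throws unless dim is a non-zero power of two.
// A non-zero value is a power of two exactly when it has a single set bit,
// i.e. (dim & (dim - 1)) == 0.
void checkPOT(unsigned dim)
{
    if (dim == 0 || (dim & (dim - 1)) != 0)
    {
        throw std::runtime_error("texture dimension is not a power of two");
    }
}
```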
void Texture::texGLInit()
{
    glGenTextures(1, &textureID[0]);
    glBindTexture(GL_TEXTURE_2D, textureID[0]); // everything below applies to this texture
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, &image[0]);
    // OpenGL now has its own copy, so we can free the image vector's memory
    image.clear();
}
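(For reference: lodepng::decode with the default settings produces tightly packed 8-bit RGBA, so after a successful decode image.size() should equal w * h * 4, which is what the glTexImage2D call above relies on. A quick sketch of that relationship, with made-up dimensions:)

```cpp
#include <cstddef>

// Expected byte count of a tightly packed 8-bit RGBA image,
// matching what lodepng::decode produces by default.
std::size_t expectedRGBASize(unsigned w, unsigned h)
{
    return static_cast<std::size_t>(w) * h * 4; // 4 bytes per pixel: R, G, B, A
}
```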
Upon calling my draw function, this texture displays just fine. However, upon clicking "import new spritesheet", an open-file dialogue appears and the user selects a new file. The following code then executes, which should (in theory) delete the texture name (stored in the GLuint array 'textureID'), making that name available for use again so we can bind new texture data under it.
void Texture::reloadTexture(string filename)
{
    // first and foremost, clear the image and buffer vectors back down to nothing so we can start afresh
    buffer.clear();
    image.clear();
    w = 0;
    h = 0;
    // also delete the texture name we were using before
    glDeleteTextures(1, &textureID[0]);
    const char* fnPtr = filename.c_str(); // our image loader takes a char pointer, not a string
    lodepng::load_file(buffer, fnPtr); // load the file into a buffer
    unsigned error = lodepng::decode(image, w, h, buffer); // decode the buffer's pixel data into the image vector
    // report any decoder errors
    if(error)
    {
        cout << "\ndecoder error " << error << ": " << lodepng_error_text(error) << endl;
    }
    // these throw exceptions if the image's dimensions are unsuitable
    checkPOT(w);
    checkPOT(h);
    // image vector now contains our pixel data, all ready for OpenGL to do its thing
    // let's get this texture up in video memory
    texGLSecondaryInit();
    Draw_From_Corner = CENTER;
}
void Texture::texGLSecondaryInit()
{
    //PFNGLBINDBUFFERARBPROC glBindBuffer = NULL; // VBO bind procedure
    glGenTextures(1, &textureID[0]);
    glBindTexture(GL_TEXTURE_2D, textureID[0]); // everything below applies to this texture
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    //glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, &image[0]);
    // OpenGL now has its own copy, so we can free the image vector's memory
    image.clear();
}
*Note the call to 'glDeleteTextures' near the start of 'reloadTexture()'.*
What should happen at this point is that the user sees their selected texture displayed. What actually happens is that the user sees the old texture (the one loaded at startup) with the new texture's dimensions (the one they selected). In other words, despite my call to glDeleteTextures, the deleted texture's pixel data still exists in one of OpenGL's buffers somewhere; I can't figure out where, nor why it would still be there after I have not only deleted it, but also attempted to bind a different file's data over the top of it.
Thanks for reading, and I'd appreciate any light that could be shed on why OpenGL would hang on to the old data.
P.S. In the function 'texGLSecondaryInit()', you may notice a commented-out call to glBindBuffer(). This was suggested to me on another board as a way to purge the pixel buffer, but it caused access-violation errors all over the shop. I've left it on display here in case it helps you understand what I'm trying, and failing, to do.