Oni Sephiroth

Textures Won't Bind


I've recently been attempting to rewrite my graphics engine to speed it up, but I'm running into an issue where my textures won't bind. This is my function for loading the image:
unsigned char * HVSTGFX::loadPNGFile(char *fileName, HVSTGFX::PNGFILE *pngfile, GLuint &texture)
{
	FREE_IMAGE_FORMAT fif = FIF_PNG;

	//pointer to the image, once loaded
	FIBITMAP *dib(0);
	//pointer to the image data
	unsigned char* bits(0);
	unsigned char tempRGB;
	GLuint tempTex = 0;

	if (FreeImage_FIFSupportsReading(fif))
		dib = FreeImage_Load(fif, fileName);

	if (!dib)
		return NULL;

	bits = FreeImage_GetBits(dib);
	pngfile->width = FreeImage_GetWidth(dib);
	pngfile->height = FreeImage_GetHeight(dib);
	//note: sizeof(bits) would be the size of the pointer, not the buffer
	pngfile->size = pngfile->width * pngfile->height * 4;
	int size = pngfile->width * pngfile->height;

	//FreeImage stores 32-bit pixels as BGRA on Windows, so swap the
	//red and blue channels to get the RGBA layout glTexImage2D expects
	//(this assumes the PNG really loaded as a 32-bit image)
	for (int imageIDx = 0; imageIDx < size * 4; imageIDx += 4)
	{
		tempRGB = bits[imageIDx];
		bits[imageIDx] = bits[imageIDx + 2];
		bits[imageIDx + 2] = tempRGB;
	}
	//can't FreeImage_Unload(dib) here: bits points into the bitmap's storage
	glGenTextures(1, &tempTex);
	glBindTexture(GL_TEXTURE_2D, tempTex);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pngfile->width, pngfile->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, bits);
	texture = tempTex;

	return bits;
}
Watching it in debug mode, I've noticed that glGenTextures isn't doing anything to tempTex, which is probably where the problem is. But just to be sure, this is my function for actually drawing the sprite:
void HVSTGFX::createSpritePNGX(float width, float height, float x, float y, CSprite sprite)
{
	glBindTexture(GL_TEXTURE_2D, sprite.texture);
	glColor4f(1.0f,1.0f,1.0f, 1.0f );
		
		glBegin(GL_QUADS);
			glTexCoord2f(0.0f, 0.0f); glVertex3f(x, y, 0.0f);
			glTexCoord2f(1.0f, 0.0f); glVertex3f(x + width, y, 0.0f);
			glTexCoord2f(1.0f, 1.0f); glVertex3f(x + width, y + height, 0.0f);
			glTexCoord2f(0.0f, 1.0f); glVertex3f(x, y + height, 0.0f);
		glEnd();
}
I've been digging through my textbooks and everywhere else on the internet, but I just can't seem to find an answer as to why this isn't working.

If tempTex is still 0 after glGenTextures, then glGenTextures must have failed, most likely because no OpenGL context is current on the calling thread. I suspect the problem is that you are loading your image before you initialize OpenGL (or you are doing it on a different thread than the one where you initialized OpenGL).
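One quick way to test that theory, assuming the usual Windows/WGL setup (a sketch; hasCurrentGLContext is a made-up name):

#include <windows.h>

//true only if an OpenGL rendering context is current on the calling thread
bool hasCurrentGLContext()
{
	return wglGetCurrentContext() != NULL;
}

If this returns false right before your glGenTextures call, that's your answer.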

Yes, GL is initialized. It's being done in WM_CREATE, as you can see from this function:

void HVSTGFX::initGL()
{
	glEnable(GL_DEPTH_TEST);
	glEnable(GL_CULL_FACE);
	glFrontFace(GL_CCW);
	glEnable(GL_TEXTURE_2D);
	glEnable(GL_BLEND);
	glEnable(GL_ALPHA_TEST);
}


Also, I should add that calling glTexImage2D before drawing each sprite works, but that is exactly what is making my current engine so slow.
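For reference, the pattern being aimed for is upload-once, bind-per-draw; a minimal sketch with placeholder names:

//(on Windows, <windows.h> must come before <GL/gl.h>)
#include <windows.h>
#include <GL/gl.h>

//upload once at load time (the expensive part)
GLuint uploadTextureRGBA(const unsigned char* bits, int width, int height)
{
	GLuint tex = 0;
	glGenTextures(1, &tex);
	glBindTexture(GL_TEXTURE_2D, tex);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, bits);
	return tex;
}

//bind per draw (cheap) -- no glTexImage2D anywhere in the frame loop
void drawWithTexture(GLuint tex)
{
	glBindTexture(GL_TEXTURE_2D, tex);
	//glBegin(GL_QUADS); ... glEnd();
}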

Calling glTexImage2D loads the data into the currently bound texture's memory, so calling it every time you draw will of course work; it is just slow.

If you do something like

int main()
{
	loadImages();        //textures are created here...
	initializeWindow();  //...but no GL context exists yet
}

It will not work, because when you call loadImages(), you have not yet created the OpenGL rendering context. I can almost guarantee that this is your problem.
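The fix is simply the reverse order (same placeholder names):

int main()
{
	initializeWindow();  //creates the window and makes the GL context current
	loadImages();        //glGenTextures now has a context to work with
}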

I have this being done in WM_CREATE:

hdc = GetDC(hwnd);
g_hdc = hdc;
HVSTGFX::SetupPixelFormat(hdc);
hrc = wglCreateContext(hdc);
wglMakeCurrent(hdc, hrc);  //the context becomes current on this thread here
HVSTGFX::initGL();
FreeImage_Initialise(TRUE);


g_hdc is a global device context.

Unless this is considered a different thread from winMain (since it's called from mainWndProc?), GL should be initialized. Unless I'm wrong and have misunderstood something, that is.
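For what it's worth, WM_CREATE should not be a threading issue here: CreateWindow sends WM_CREATE to the window procedure on the calling thread before it returns, so in a typical winMain the shape is (sketch):

hwnd = CreateWindow(/* class, title, styles, ... */);  //WM_CREATE is handled in here,
                                                       //creating the GL context
//any loadPNGFile calls made after this point run with the context current
while (GetMessage(&msg, NULL, 0, 0))
{
	TranslateMessage(&msg);
	DispatchMessage(&msg);
}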

Yes, that seems OK. But I am curious: how and where is HVSTGFX::loadPNGFile called?

You might try

GLenum errCode = glGetError();
if (errCode != GL_NO_ERROR)
{
	std::cout << gluErrorString(errCode) << std::endl;
}

after glGenTextures (which must be failing). Hopefully this will give you some useful information.
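If you end up sprinkling that check around, a tiny helper keeps it readable (a sketch; checkGL is a made-up name):

//(on Windows, <windows.h> must come before the GL/GLU headers)
#include <windows.h>
#include <GL/glu.h>
#include <iostream>

void checkGL(const char* where)
{
	//glGetError reports one error at a time, so drain the queue
	for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
		std::cout << where << ": " << gluErrorString(err) << std::endl;
}

//usage:
//  glGenTextures(1, &tempTex);
//  checkGL("glGenTextures");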

I'm calling loadPNGFile just before the main loop in winMain, like this:

newSprite.sprite = HVSTGFX::loadPNGFile("resource/sprites/HUD/hudE.png", &newSprite.pngFile, newSprite.texture);



I see no glTexParameteri( ... GL_REPEAT ) calls setting the texture wrap mode.
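Something like this, for example (a sketch; GL_CLAMP would also work for sprites):

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);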

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pngfile->width, pngfile->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, bits); <-- put a breakpoint here and check whether bits actually contains data

Failing that, try GLIntercept or gDEBugger, which will let you view any OpenGL textures.
