Textured Quads Appear White

I've got basic texturing working, but for some reason these quads are appearing white. The shader looks correct, and if I enable alpha blending, nothing appears on screen at all. The blending behavior suggests that every pixel ends up as zero in the format I've set. Here is my code:

SpriteBinRect *SpriteBinRect::Load(std::string filename, std::string path)
{
	SpriteBinRect *spriteBinRect = NULL;
	void *info = NULL, *data = NULL;
	std::string fullPath = path + filename;
	FREE_IMAGE_FORMAT fif = FIF_UNKNOWN;
	int width, height, bitsPerPixel, size;
	int imageFormat;
	int type = GL_UNSIGNED_BYTE;
	
	
	// attempt to open the file
	fif = FreeImage_GetFileType(fullPath.c_str(), 0);
	if(fif == FIF_UNKNOWN) fif = FreeImage_GetFIFFromFilename(fullPath.c_str());
	if(fif == FIF_UNKNOWN)
	{
		std::cout << "ERROR: Couldn't resolved file type for " << filename.c_str() << std::endl;
		return NULL;
	}
	
	// check that the plugin has reading capabilities and load the file
	if(FreeImage_FIFSupportsReading(fif))
		info = FreeImage_Load(fif, fullPath.c_str());
	if(!info)
	{
		std::cout << "ERROR: invalid file type for " << filename.c_str() << std::endl;
		return NULL;
	}
	
	// retrieve the image data
	FreeImage_FlipVertical((FIBITMAP*)info);
	data = FreeImage_GetBits((FIBITMAP*)info);
	width = FreeImage_GetWidth((FIBITMAP*)info);
	height = FreeImage_GetHeight((FIBITMAP*)info);
	bitsPerPixel = FreeImage_GetBPP((FIBITMAP*)info);
	FreeImage_ConvertToType((FIBITMAP*)info, FIT_UINT32);
	size = width * height * (bitsPerPixel / 8);
	SwapRedBlue32((FIBITMAP*)info);
	
	// if somehow one of these failed (they shouldn't), return failure
	if(!data || width < 2 || height < 2)
	{
		std::cout << "ERROR: invalid data for " << filename.c_str() << std::endl;
		return NULL;
	}
	
	// set the correct format
	switch(bitsPerPixel)
	{
		case 32:
			imageFormat = GL_RGBA;
			type = GL_UNSIGNED_BYTE;
			break;
			
		case 24:
			imageFormat = GL_RGB;
			type = GL_UNSIGNED_BYTE;
			break;
			
		case 16:
			imageFormat = GL_RGBA;
			type = GL_UNSIGNED_SHORT_5_5_5_1;
			break;
			
		case 8:
			imageFormat = GL_RED;
			type = GL_UNSIGNED_BYTE;
			break;
	}
	
	// allocate and setup the texture
	spriteBinRect = new SpriteBinRect((float)width, (float)height);
	glActiveTexture(GL_TEXTURE0);
	glGenTextures(1, &spriteBinRect->handle);
	glBindTexture(GL_TEXTURE_2D, spriteBinRect->handle);
	glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, width, height);
	glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, imageFormat, type, data);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	
	// release client-side texture data, and return the bin rect
	FreeImage_Unload((FIBITMAP*)info);
	return spriteBinRect;
}

I'm using FreeImage to load my images, and I'm fairly sure that part is working: I printed out the pixel data and saw a variety of values per color component, and I've verified that these images are 32-bit. I'm also making sure an active texture unit is set and that the filtering doesn't require mipmaps.
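
For reference, this is roughly how the sampler is wired up and how I check for errors after the upload. It's a sketch rather than the exact code; "program" and "uTexture" stand in for my actual shader program handle and sampler uniform name:

// sketch, not the exact code: "program" and "uTexture" stand in for the real names
glUseProgram(program);
glUniform1i(glGetUniformLocation(program, "uTexture"), 0);	// sampler reads from texture unit 0
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, spriteBinRect->handle);

// after the upload, make sure nothing errored out
GLenum error = glGetError();
if(error != GL_NO_ERROR)
	std::cout << "GL error after texture upload: " << error << std::endl;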

I solved it. It turns out the code above was fine; the problem was in my vertex buffer setup:

// create the vertex and index buffers and upload their data
glGenBuffers(2, vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vbo[1]);
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * numVertices, vertices, GL_STATIC_DRAW);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(uint16_t) * numIndices, indices, GL_STATIC_DRAW);

// describe the vertex layout: position and texture coordinates, both 2 floats
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)0);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)sizeof(Vector2));
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
// release the client-side copies now that the data is in the buffers
delete[] vertices;
delete[] indices;

My second vertex attribute had been set to 4 for the size, GL_UNSIGNED_BYTE for the type, and GL_TRUE for the normalized argument. It was a simple copy-and-paste mistake: the vertex format from another test state uses position and color, while this one uses position and texture coordinates. False alarm!
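
For context, the vertex format this state uses is just position plus texture coordinates, roughly like this (a sketch, with Vector2 assumed to be a plain two-float struct):

// rough sketch of the vertex layout: two Vector2 members, so both attributes
// are 2 floats, the stride is sizeof(Vertex), and the texture coordinate
// offset is sizeof(Vector2)
struct Vector2 { float x, y; };

struct Vertex
{
	Vector2 position;	// attribute 0
	Vector2 texCoord;	// attribute 1
};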
