Guinnie

Strange texture distortion / possible alpha or bitmap issues


'lo, all. To give you some idea of where I'm at: I need foliage in my scene, so I set about implementing alpha testing, since it seems to perform best and I can live with untidy edges (and may experiment with AA to improve those later).

Because I need pixels on the texture to be either transparent or opaque with nothing in between (anything in between was causing a white outline around the opaque parts), I wrote a program that uses auxDIBImageLoad to load the texture image and a mask (basically, a black-and-white version of the same texture). It loops through each RGB value and writes it to a new file, but it also checks the RGB value of the mask at the same location: if the mask pixel is pure white (FF FF FF), it writes white (0xFF) to the alpha value; otherwise it writes black (0x00). It also writes the width and height to the top of the file, so the result is basically an A8R8G8B8 bitmap.

After running a 24-bit bitmap and its 24-bit mask through this program, I end up with one 32-bit bitmap, which I then load using normal file-reading functions (so no auxDIBImageLoad) and pass to glTexImage2D with all the relevant arguments. With the first two or three textures I tried (basically variations of the same file with different experimental leaf effects), it worked.

But then I tried it with another copy with new leaves and differently sized planes for the leaf texture (this time elongated instead of square) but the same texture size (always 512 x 512), and I got this:

[screenshot: using glAlphaFunc(GL_GREATER, 0.99f), the original value I used]

[screenshot: using glAlphaFunc(GL_GREATER, 0.1f)]

What's weird is that when I made a test bitmap that was just red with white, green and blue in the corners (so I could easily see it was loading the whole image), ran it through the program with the same mask and then used the resulting file with the model, I get this:

[screenshot: using either 0.99 or 0.1]

That result is basically correct in terms of mapping and transparency/opacity, but as soon as I switch back to the other textures, it glitches out again. Any ideas what could be going on? I'm pretty sure that somewhere down the line the new bitmap format has become muddled up, so here's some code.

This is the "merger" program that takes the mask and creates the A8R8G8B8 wannabe. As far as I could tell, the data created by auxDIBImageLoad is just 3 bytes per pixel for the R, G and B values. This worked fine before, so I assume it's correct.
#include <cstdio>
#include <iostream>
#include <gl/glaux.h>

using namespace std;

int main()
{
	char filename[64];
	int size;
	FILE *file;

	cout << "File to merge: ";
	cin >> filename;

	AUX_RGBImageRec *Mask = auxDIBImageLoad("mask.bmp");
	AUX_RGBImageRec *Bitmap = auxDIBImageLoad(filename);

	file = fopen("product.rgba", "w+");

	  // Write width and height.

	fwrite(&Bitmap->sizeX, 4, 1, file);
	fwrite(&Bitmap->sizeY, 4, 1, file);

	  // Calculate buffer size (3 bytes per pixel).

	size = Bitmap->sizeX * Bitmap->sizeY * 3;

	  // Either opaque or transparent.

	unsigned char masks[2] = { 0x00, 0xFF };

	for(int i = 0; i < size; i += 3)
	{
		  // Write the R, G and B values.

		fwrite(&Bitmap->data[i], 3, 1, file);

		  // See if the mask RGB at this pixel is white.

		if(Mask->data[i] == 0xFF && Mask->data[i+1] == 0xFF && Mask->data[i+2] == 0xFF)
		{
			  // Write white (opaque).

			fwrite(&masks[1], 1, 1, file);
		}
		else
		{
			  // Write black (transparent).

			fwrite(&masks[0], 1, 1, file);
		}
	}

	fclose(file);

	return 0;
}
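
(For anyone poking at something similar: a quick sanity check, which only occurred to me later, is to compare the output file's size against what the header plus pixel data should add up to. This is just a sketch along those lines, not something from my actual code:)

#include <cstdio>

  // Sketch: the merged file should be exactly 8 header bytes
  // plus width * height * 4 bytes of pixel data.

bool CheckMergedFile(const char *path, long width, long height)
{
	FILE *f = fopen(path, "rb"); // binary mode, so the reported size is exact

	if(f == NULL)
		return false;

	fseek(f, 0, SEEK_END);
	long actual = ftell(f);
	fclose(f);

	  // Any mismatch means stray bytes crept in somewhere.

	return actual == 8 + width * height * 4;
}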


This is the loading code:
GLuint LoadGLTexture(const char *filename)
{
	GLuint texture = 0;
	int size;
	FILE *file;
	char *data;
	int x, y;

	  // Note: filename is currently unused; the merger always writes "product.rgba".

	file = fopen("product.rgba", "r");

	  // Read width and height (top 8 bytes).

	fread(&x, 4, 1, file);
	fread(&y, 4, 1, file);

	  // Calculate the size of the image in bytes (how much is left in the file).

	size = x * y * 4;

	data = new (std::nothrow) char [size];

	  // Only read and upload if the allocation succeeded.

	if(data != NULL)
	{
		  // Read in the rest of the file.

		fread(data, size, 1, file);

		glGenTextures(1, &texture);

		glBindTexture(GL_TEXTURE_2D, texture);

		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

		  // Create the texture as RGBA of size x * y.

		glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, x, y, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

		delete [] data;
	}

	fclose(file);

	return texture;
}
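
For completeness, the loader gets used in the usual way; the texture name below is just a placeholder rather than something lifted from my project:

  // Hypothetical usage; "leafTexture" is a placeholder name.

GLuint leafTexture = LoadGLTexture("product.rgba");

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, leafTexture);

  // ... draw the foliage planes with texture coordinates as usual ...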


And not that it makes much difference, but my alpha test setup is:
glBlendFunc(GL_SRC_ALPHA, GL_ONE);

/* One thing that might be worth noting is that
if I put glEnable(GL_BLEND); here, there's a lot more colour
in the distorted textures, but that's the only difference;
they're still messed up. */

glAlphaFunc(GL_GREATER, 0.1f); // or 0.99f
glEnable(GL_ALPHA_TEST);
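
(As I understand it, with GL_ALPHA_TEST on and GL_BLEND off, each fragment is either kept fully opaque or discarded, which is all I need here. A minimal version of that state, as a sketch, would be:)

  // Sketch: alpha-tested foliage with no blending at all.

glDisable(GL_BLEND);           // fragments are kept or discarded, never mixed
glAlphaFunc(GL_GREATER, 0.5f); // with pure 0x00/0xFF alpha, any threshold between 0 and 1 behaves the same
glEnable(GL_ALPHA_TEST);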


If there's anything else you need to see, let me know. Thanks in advance.

[edit] This has since been solved. After hours of trying out different bitmap loaders and analysing the files in a hex editor, I realised the problem was caused by not opening the file with "b" for binary writing, so the code was putting in cute little newlines and knocking everything out of line.

[Edited by - Guinnie on October 17, 2009 6:09:04 AM]
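
In case it saves someone else the hex-editor session: on Windows, text-mode streams translate "\n" to "\r\n" on write (and the reverse on read), which silently corrupts binary data. The fix is just the open-mode flags:

file = fopen("product.rgba", "wb+"); // merger: was "w+"

file = fopen("product.rgba", "rb");  // loader: was "r"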
