Rendering alpha transparency in geometry and figuring out the alpha color in a bitmap

Started by Chanz
4 comments, last by DanielWilkins 12 years, 5 months ago
This is a two-part question. I am rendering some BSP geometry, and some of the textures have an alpha color that should be invisible. The textures are in .bmp format, and I would like to avoid converting them if possible: I am dealing with emulation and want to avoid any possible legal issues.

So the first step is figuring out the alpha color so I can render it correctly. This is a preview of the bitmap (in .png format), and here is a link to the actual file in case any of you knowledgeable folk know how to figure out what the alpha color should be without simply looking at it and sampling it: http://www.mediafire...3q45ix1&thumb=4



[image: MVQJJ.png, preview of the railing texture]

As you can see, it is a railing texture where RGB(160,158,148) is the color that should be invisible. I can figure this out just by looking at it. The problem is: how do I find this value in the file?

So that is step one. The second is an OpenGL issue: I am rendering a bunch of geometry with vertex arrays. How do I account for the fact that a color needs to be removed from the texture when rendered? Do I need to render in a specific order? I should also mention that I am using SDL, if that makes any of this easier. Someone suggested http://www.opengl.org/wiki/Transparency_Sorting#Alpha_test but I want to make sure I am using the best method.

Thanks!
On the first question, two thoughts:
* Sample the top-left corner pixel: whatever color it is becomes transparent. You could also define the x-y of the sample point per texture.
* If you can modify the image, choose a less arbitrary color. Many examples I've seen use magenta (255,0,255) as the official transparency color, since it's fairly unlikely to be encountered in the image itself.
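The corner-sampling idea can be sketched as follows (function and type names are my own; this assumes the raw pixel data of an uncompressed 24-bit BMP, which stores rows bottom-up, padded to 4-byte boundaries, with pixels in BGR order):

```cpp
#include <cstddef>

struct RGB { unsigned char r, g, b; };

// Return the color of the image's top-left pixel, to use as the
// transparency key. Because BMP rows are stored bottom-up, the top row
// of the image is the last row stored in the buffer.
RGB SampleTransparencyKey(const unsigned char *pixels,
                          unsigned int width, unsigned int height)
{
    // Each row is padded up to a multiple of 4 bytes.
    std::size_t rowSize = (static_cast<std::size_t>(width) * 3 + 3) & ~static_cast<std::size_t>(3);
    std::size_t offset = static_cast<std::size_t>(height - 1) * rowSize;
    RGB key;
    key.b = pixels[offset + 0]; // BMP stores channels as B, G, R
    key.g = pixels[offset + 1];
    key.r = pixels[offset + 2];
    return key;
}
```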

On the second question:
I don't use SDL, but for OpenGL, you probably want to convert the image to RGBA when you import it. Set A=1 for all pixels unless they match your transparency color, in which case A=0.
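A minimal sketch of that conversion (function and parameter names are my own invention), assuming the source is tightly packed RGB and the transparency key has already been determined:

```cpp
#include <vector>

// Expand packed RGB pixels to RGBA, making every pixel that matches
// the transparency key fully transparent (A=0) and all others fully
// opaque (A=255).
std::vector<unsigned char> ExpandToRGBA(const unsigned char *rgb,
                                        unsigned int numPixels,
                                        unsigned char keyR,
                                        unsigned char keyG,
                                        unsigned char keyB)
{
    std::vector<unsigned char> rgba;
    rgba.reserve(static_cast<std::size_t>(numPixels) * 4);
    for (unsigned int i = 0; i < numPixels; ++i) {
        unsigned char r = rgb[i * 3 + 0];
        unsigned char g = rgb[i * 3 + 1];
        unsigned char b = rgb[i * 3 + 2];
        bool transparent = (r == keyR && g == keyG && b == keyB);
        rgba.push_back(r);
        rgba.push_back(g);
        rgba.push_back(b);
        rgba.push_back(transparent ? 0x00 : 0xFF);
    }
    return rgba;
}
```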

Using an alpha blending method is likely to be about as fast as any other technique you'll run into, and if you have hardware acceleration, there's unlikely to be any performance penalty at all.

And out of interest - why can't you change the format? I ended up using PNG for pushing assets into my game, since it supports varying alpha levels, the file size is tiny and the library is very liberally licensed, and then in-game I'm free to store the data in any way I please.

Thank you for the fantastic reply. I am emulating the rendering procedure of an old game whose assets are packed into archives. The company has a history of frowning upon emulation of its products, and if I have software that converts their files into different formats, they could potentially take issue with that. I want to recreate their engine with as little modification to their files as possible. Your answer was fantastic, and I will use the method I linked for rendering: only render alpha values greater than, say, 0.5.
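For reference, the alpha-test approach from the linked wiki page boils down to a couple of fixed-function calls (a sketch, assuming a compatibility-profile OpenGL context; fragments that fail the test are discarded outright, so no depth sorting is needed for on/off transparency):

```cpp
// Enable the legacy fixed-function alpha test.
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f); // keep only fragments with alpha > 0.5

// ... draw the BSP geometry with its RGBA textures ...

glDisable(GL_ALPHA_TEST);
```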

By the way, do you have any code examples of converting RGB to RGBA? I assume it's just recreating the header and allowing 4 bytes rather than 3 for each pixel. Am I correct?

Thanks again! :)



[quote name='Chanz']
By the way, do you have any code examples of people converting RGBs to RGBAs. I assume it's just recreating the header and allowing for 4 bytes rather than 3 for each pixel. Am I correct?
[/quote]
It's as easy as it sounds.

The conversion I use could be expressed as follows:
[source lang="cpp"]
unsigned char *ToRGBA(const unsigned char *src, unsigned int numPixels)
{
    unsigned char *dest = new unsigned char[numPixels * 4]; // 4 bytes per RGBA pixel
    unsigned char *out = dest;
    for (unsigned int pixel = 0; pixel < numPixels; pixel++) {
        *out++ = *src++; // R
        *out++ = *src++; // G
        *out++ = *src++; // B
        *out++ = 0xFF;   // A: fully opaque
    }
    return dest;
}
[/source]

Here my RGB image is assumed to be fully opaque (0xFF); you'd probably add some logic to check each RGB value against your transparency color first. Just make sure you read the channels out in the correct order for your file format. It's easy enough to verify by looking at the buffer in a debugger.


[Edited to add:]

[quote]I assume it's just recreating the header[/quote]

If you're passing it to OpenGL there's no header. It's just the buffer of pixel data.
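A sketch of what that hand-off looks like (assuming `width`, `height`, and `rgbaPixels` come from your conversion step; this needs a live OpenGL context): you describe the dimensions and format in the call itself, and OpenGL reads nothing but the raw pixel buffer.

```cpp
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// No header: just the buffer of RGBA bytes, plus the dimensions and
// format described in the arguments.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);
```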


Thank you! I had no idea the only thing passed was pixel data and the rest was not. Makes sense. Thanks again for your help.
One follow-up question, if you guys are up for it:
How would I deal with rendering a texture with, say, half transparency? Could I still use the functions above, with the alpha of those textures set to 0.5 instead?

How does that work?
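(For reference: partial transparency needs blending rather than the alpha test, since the alpha test can only keep or discard a fragment, not mix it with what's behind it. A sketch, assuming a compatibility-profile context, with the intermediate alpha, e.g. 128 for roughly 50%, written during the RGB-to-RGBA conversion:)

```cpp
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
// Semi-transparent surfaces must be drawn after all opaque geometry,
// sorted back-to-front, for correct results.
```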

Thanks.

This topic is closed to new replies.
