Blending DXT1 textures correctly (UPDATE: now about DXT5 as well!)

Hi there, I'm trying to draw a number of billboards whose textures use DXT1 compression with a 1-bit alpha channel (i.e. a transparent color key), and I can't get them to render properly. I'm not sure whether there's a way around sorting them by distance, but I've tried that as well as simply using the z-buffer normally, and neither way looks correct. These are the render states I'm using to get the transparency working:

[source lang="cpp"]
// Blend factors only take effect once alpha blending is enabled.
GetDevice ( )->SetRenderState ( D3DRS_ALPHABLENDENABLE, TRUE );
// Standard alpha blending: result = src * srcAlpha + dest * (1 - srcAlpha).
GetDevice ( )->SetRenderState ( D3DRS_SRCBLEND, D3DBLEND_SRCALPHA );
GetDevice ( )->SetRenderState ( D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA );
[/source]

Then I try any combination of using or not using the z-buffer and sorting or not sorting my std::vector of billboards:

[source lang="cpp"]
GetDevice ( )->SetRenderState ( D3DRS_ZENABLE, TRUE );   // toggle depth testing on/off
std::sort ( billboards.begin ( ), billboards.end ( ) );  // sort by distance (see operator< below)
[/source]

For reference, this is how I overload the < operator for my billboards that makes the std::sort call work:

[source lang="cpp"]
bool operator < ( const Object & rhs ) const
{
	// The billboard farther from the camera compares "less", so
	// std::sort puts it first, giving back-to-front draw order.
	D3DXVECTOR3 to_a = camera->position - GetPosition ( );
	D3DXVECTOR3 to_b = camera->position - rhs.GetPosition ( );

	// Squared lengths avoid the square root and order identically.
	return D3DXVec3LengthSq ( &to_a ) > D3DXVec3LengthSq ( &to_b );
}
[/source]

However, I always get this render output, whether or not I use the z-buffer and/or sort the std::vector by distance:

[Image: render output showing the cut-out artifact]

As you can see, the transparent parts of the billboard texture cut holes into the billboards behind them. Additionally, the draw order only comes out correct when I both use the z-buffer and sort by distance to the camera.

I'd love some clarification: what's the easiest way to render a number of DXT1 compressed textures with 1-bit alpha channel correctly on top of each other? Is manual sorting necessary? What am I doing wrong here?

Thanks ahead of time!
One trick you can use is turning on D3DRS_ALPHATESTENABLE. Alpha testing rejects failing pixels before they are written to the z-buffer, so as long as every pixel is either fully transparent or fully opaque (which is exactly what DXT1's 1-bit alpha gives you), you can get away without any depth sorting at all.
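
Roughly, the states involved look like this (pd3dDevice stands in for your device pointer; 128 is an arbitrary threshold here, and with 1-bit alpha any reference value from 1 to 255 behaves identically):

[source lang="cpp"]
// Reject pixels whose alpha fails the test, before they reach the
// blender or the z-buffer.
pd3dDevice->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
pd3dDevice->SetRenderState(D3DRS_ALPHAREF, 128);
pd3dDevice->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL);
[/source]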
That worked, I only had to set ALPHAREF and ALPHAFUNC as well, just like in the MSDN article you posted!

Thanks, I wouldn't have thought of this little trick myself!
UPDATE

Now I want to render explosions; these are DXT5 textures with an 8-bit alpha channel. Additive blending would be trivial, but I don't want that: I want to render them straight, while of course still respecting the full 8-bit alpha channel.
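
For reference, these are the two blend setups I mean (standard D3D9 states; the only difference is the destination blend factor):

[source lang="cpp"]
// Additive blending: result = src * srcAlpha + dest (only ever brightens).
GetDevice ( )->SetRenderState ( D3DRS_SRCBLEND, D3DBLEND_SRCALPHA );
GetDevice ( )->SetRenderState ( D3DRS_DESTBLEND, D3DBLEND_ONE );

// Straight alpha blending: result = src * srcAlpha + dest * (1 - srcAlpha).
GetDevice ( )->SetRenderState ( D3DRS_SRCBLEND, D3DBLEND_SRCALPHA );
GetDevice ( )->SetRenderState ( D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA );
[/source]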

Same problem: the explosions cookie-cut everything behind them. The solution of disregarding pixels based on their alpha value, which works fine for DXT1 textures with 1-bit alpha, doesn't work here anymore, for obvious reasons: I can disregard all pixels with, say, 100 or less in the alpha channel, and that reduces the cookie-cutter problem, but it throws away the 8-bit alpha information. Is there anything I can do to blend the explosions correctly in the described manner?

Any help is greatly appreciated!
Turn off depth writes but keep depth testing enabled, then draw all explosions back-to-front with whichever alpha blend you like, and with alpha testing disabled:


[source lang="cpp"]
pd3dDevice->SetRenderState(D3DRS_ZENABLE, TRUE);       // keep depth testing
pd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, FALSE); // stop writing depth
[/source]
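
Putting it together, the pass could look something like this (the explosions container and its Draw() call are stand-ins for whatever you actually have):

[source lang="cpp"]
// Depth test on, depth writes off, straight alpha blending, no alpha test.
pd3dDevice->SetRenderState(D3DRS_ZENABLE, TRUE);
pd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
pd3dDevice->SetRenderState(D3DRS_ALPHATESTENABLE, FALSE);

// Back-to-front: your operator< already sorts farthest-first.
std::sort(explosions.begin(), explosions.end());

for (size_t i = 0; i < explosions.size(); ++i)
    explosions[i].Draw(); // hypothetical per-billboard draw call
[/source]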
Works perfectly! Thanks a ton! I really need to sort out my blend mode knowledge from the looks of it! :)
No problem :) But I've thought of a good way for you to improve performance. In my last post I told you to turn off alpha testing. See how many explosions you can set off simultaneously before things start slowing down, then add this to the render state code:


[source lang="cpp"]
// Accept only pixels with alpha >= 1, i.e. skip the fully transparent ones.
pd3dDevice->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL);
pd3dDevice->SetRenderState(D3DRS_ALPHAREF, 1);
pd3dDevice->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
[/source]


and try the same test. You should see an improvement, because this stops fully transparent pixels from being processed at all; how much you gain depends on your explosion textures.
Well, that alpha-test trick to disregard transparent pixels is what the first poster suggested. It works fine for 1-bit alpha (i.e. DXT1 textures), but with an 8-bit alpha channel it doesn't work anymore: it would have to disregard all pixels with an alpha value of 254 or lower, and then you would essentially be back to a 1-bit alpha channel.

Hope this makes sense; if I'm wrong, please let me know. Of course, not having to sort by distance manually would always be preferable for performance reasons!
The code I gave rejects only pixels that are 100% transparent, which can potentially save a lot of fillrate/shader time (again, depending on the textures you use). The alphafunc and alpharef render states define the condition a pixel must pass to be accepted: GREATEREQUAL and 1 mean a pixel's alpha must be greater than or equal to 1, and remember that alpha ranges from 0 to 255. Everything from 1 to 255 still blends normally; the test only skips pixels the blend would have left unchanged anyway. Trust me, it works.
Ohhh okay, you meant doing this in addition to what you suggested earlier. That makes perfect sense, sorry for the misunderstanding!

