Problem when rendering transparent objects into a floating point target on ATI

Using an ATI Radeon 9800, Catalyst 5.9, Windows XP. When using D3DRS_ALPHABLENDENABLE while rendering into a floating point texture, transparency is disabled, which is quite a bugger if you want to render lens flares or particles in HDR.

To reproduce the bug:

- In the DirectX 9.0c SDK, open the $DXSDK\Samples\C++\Direct3D\HDRLighting project.
- In the function 'RenderScene' (around line 1305), add:

g_pd3dDevice->SetRenderState( D3DRS_ALPHABLENDENABLE, TRUE );
g_pd3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
g_pd3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ONE );

and restore the blending state at the end of the function:

g_pd3dDevice->SetRenderState( D3DRS_ALPHABLENDENABLE, FALSE );

- Recompile and run; the scene (pillars) should be transparent. On my machine, it is not.
- To see the correct behaviour, go to line 724 ('// Create the HDR scene texture') and change the CreateTexture parameter from D3DFMT_A16B16G16R16F to D3DFMT_A8R8G8B8.
- Run again; the scene is transparent as expected.

Can people reproduce this on nVidia or other platforms? Is it a limitation of rendering into floating point textures?

Cheers;

The early FP-capable cards can't do floating-point blending. The newer GeForces (7xxx) can; I can't say for sure about ATI cards.

You can check which texture formats are suitable for blending by using CheckDeviceFormat() and querying for D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING. The SDK warns that you should expect this to fail for any floating-point formats (even though it works on the new cards).
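For example, something along these lines (a minimal sketch; the adapter ordinal, display-mode format and the pD3D name are assumptions for a typical setup, not taken from your code):

// Ask whether the driver can do post-pixel-shader (fixed-function) blending
// into an A16B16G16R16F render target texture.
HRESULT hr = pD3D->CheckDeviceFormat(
    D3DADAPTER_DEFAULT,                        // adapter ordinal
    D3DDEVTYPE_HAL,                            // device type
    D3DFMT_X8R8G8B8,                           // current display-mode format
    D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
    D3DRTYPE_TEXTURE,                          // resource type you will create
    D3DFMT_A16B16G16R16F );                    // format you want to blend into
BOOL bCanBlendFP = SUCCEEDED( hr );            // FALSE on cards without fp blending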

Niko Suni

I don't think it's really hardware limited.

Rendering into a texture with OpenGL, using an internal format of GL_RGB16F_ARB and a texture type of GL_FLOAT, works with transparent objects on the same video card ...
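Roughly this kind of setup, for reference (just a sketch, using EXT_framebuffer_object for brevity; the size and filtering values here are placeholders, not the exact code I use):

// Create a float colour texture that will receive the scene.
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, 512, 512, 0, GL_RGB, GL_FLOAT, NULL);

// Attach it as the render target.
GLuint fbo;
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, tex, 0);

// Additive blending while rendering into the float texture.
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);
// ... draw the transparent geometry ...
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);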

[EDIT] :

D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING is only for alpha blending in a pixel shader. This is not my case; I don't use a pixel shader (just a plain render to a texture).

Nevertheless, I've been told it is a hardware limitation and the query I mentioned best describes it on D3D9. Do correct me if you find that I'm actually wrong though [smile]

You could contact ATI dev department about this, if you want to be absolutely sure. I've done that with NVidia.

OpenGL is generally looser about capabilities than D3D, because with extensions the card manufacturers do not need to adhere to specifications as strict as D3D's. This is both the greatest strength and the biggest failure of OpenGL, in my opinion.

Niko Suni

I've contacted ATI and am waiting for their answer. At least it's easy to reproduce.

The weird thing, as I said before, is that you can do it in OpenGL, but not in DirectX 9:

I've ported the HDR lighting DirectX 9.0 demo to OpenGL 2.0. It runs slower than the DX9 version, but transparency works in OpenGL (maybe it's emulated behind the scenes?)

Finally, running the program with the DX9 Debug Runtime at maximum debug output shows no warning or error when using ALPHABLENDENABLE during a render into an fp texture. I was expecting an error like 'unsupported' instead of it silently reporting nothing.

Cheers;
Quote: Original post by execom_rt
I've ported the HDR lighting DirectX 9.0 demo to OpenGL 2.0. It runs slower than the DX9 version, but transparency works in OpenGL (maybe it's emulated behind the scenes?)

Finally, running the program with the DX9 Debug Runtime at maximum debug output shows no warning or error when using ALPHABLENDENABLE during a render into an fp texture. I was expecting an error like 'unsupported' instead of it silently reporting nothing.


It may very well be emulated in software for OpenGL; the specification doesn't prohibit implementing it like that, unlike the D3D specification, which requires a hardware implementation (so it "becomes" a hardware issue).

The HDR blending is a relatively new feature, so they've probably just missed the warning there. I can escalate this issue to the DirectX team so they may fix it in a future core release, but as it's not exactly critical, you may have to wait for a year for it. Meanwhile, the flag I mentioned does expose the hardware's capabilities regarding fp blending.

Do post the ATI answer here, when you get it!

Niko Suni

Will do.

I'm trying to figure out how to 'circumvent' the problem.

I mean, games like 'HL2: Lost Coast' use HDR lighting, so they need to render their scene into an fp render target, but this probably doesn't work on R300 (maybe an R520 or GeForce 7 is required to run this game) when the scene has alpha-blended textures (like trees) or particles (see http://www.bit-tech.net/gaming/2005/06/14/hl2_hdr_overview/1.html).

What I can probably do (for the already old R300 hardware) is:

- Render the non-transparent objects into the FP render target.
- Render the transparent objects into an RGB render target.
- Write a pixel shader to blend the two targets (probably one for each type of blending) and write the result into a new FP render target (see the sketch below).
- Process the HDR tone mapping.
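Something like this for the blend pass, I guess (only a sketch; the names are made up and it only covers the ONE/ONE additive case, other blend modes would need their own shader variants):

// HLSL for the combine pass, embedded as a string for D3DXCompileShader.
const char* g_szCombinePS =
    "sampler sceneHDR   : register(s0);\n"   // FP target with the opaque scene
    "sampler sceneAlpha : register(s1);\n"   // RGB target with the transparent pass
    "float4 main(float2 uv : TEXCOORD0) : COLOR0\n"
    "{\n"
    "    return tex2D(sceneHDR, uv) + tex2D(sceneAlpha, uv);\n"
    "}\n";
// Compile with D3DXCompileShader(..., "main", "ps_2_0", ...), bind both source
// textures, SetRenderTarget to the new FP surface and draw a full-screen quad,
// then do the tone mapping as usual.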

Don't know if this was discussed before.

You could see what makes the SDK sample "HDRFormats" tick - it emulates HDR by using a 32-bit integer format and encoding the exponent of the color into the alpha component, so it runs on older cards as well. You can of course use another texture as the actual alpha, if you need it.
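The encoding itself boils down to something like this (plain C++ just to show the idea; the sample does the equivalent work in its shaders, and the exact bias and scale here are my own choices):

#include <algorithm>
#include <cmath>

// Shared-exponent (RGBE8) encoding into an ordinary A8R8G8B8 texel:
// the biased exponent of the brightest channel goes into alpha.
struct RGBE8 { unsigned char r, g, b, e; };

RGBE8 EncodeRGBE8(float r, float g, float b)
{
    RGBE8 out = { 0, 0, 0, 0 };
    float maxC = std::max(r, std::max(g, b));
    if (maxC > 1e-6f)
    {
        int e = (int)std::ceil(std::log(maxC) / std::log(2.0f)); // shared exponent
        float scale = 255.0f / std::pow(2.0f, (float)e);         // mantissa scale
        out.r = (unsigned char)(r * scale);
        out.g = (unsigned char)(g * scale);
        out.b = (unsigned char)(b * scale);
        out.e = (unsigned char)(e + 128);                        // biased exponent
    }
    return out;
}

void DecodeRGBE8(const RGBE8& in, float& r, float& g, float& b)
{
    float scale = std::pow(2.0f, (float)in.e - 128.0f) / 255.0f;
    r = in.r * scale;
    g = in.g * scale;
    b = in.b * scale;
}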

Niko Suni

I've done this:

if (SUCCEEDED( g_lpD3D->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,                  // adapter ordinal
        D3DDEVTYPE_HAL,                      // device type
        D3DFMT_X8R8G8B8,                     // adapter (display mode) format
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_SURFACE,                    // resource type being queried
        D3DFMT_A16B16G16R16F )))             // format checked for fp blending
{
    // Can render transparent objects into floating point buffers
}
else
{
    // Not possible
}

and it fails on mine. Don't know if this test is correct or not.

