Changing rendertarget from A8R8G8B8 to A32B32G32R32F

7 comments, last by Muhammad Haggag 18 years, 5 months ago
Hi, I pre-render something into a texture (used as a lookup in a pixel shader). So far I've used the texture format A8R8G8B8, but now I need floats and wanted to change the format to A32B32G32R32F. Unfortunately, I can't see the pre-rendered content anymore. Can anyone tell me why? Is it because the channels are ordered differently now (ARGB -> ABGR)?
_renderTexture = new Texture(renderDevice, renderDevice.Viewport.Width, renderDevice.Viewport.Height, 1, Usage.RenderTarget, Format.A8R8G8B8, Pool.Default);

// becomes:

_renderTexture = new Texture(renderDevice, renderDevice.Viewport.Width, renderDevice.Viewport.Height, 1, Usage.RenderTarget, Format.A32B32G32R32F, Pool.Default);

[Edited by - Coder on November 4, 2005 11:44:03 PM]
Have you got a graphics card that supports high dynamic range (HDR) rendering? Only the last few generations (some GeForce FXs, Radeon 9600 and up) support it, and only the current top-of-the-line cards support it at a usable/practical speed.

If not, you're flat out of luck and you can't do what you're trying to do [smile]

You really should be enumerating the drivers/hardware before making calls such as these - render targets tend to be very hardware-specific (most hardware will only allow ~6 basic formats). I have the enumeration code here for C++, but it looks like you're using MDX so it's probably not much use [headshake]
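In MDX, the C++ enumeration boils down to a single call on the Manager class. A minimal sketch (the helper name and default-adapter assumption are mine, not from the thread):

```csharp
using Microsoft.DirectX.Direct3D;

// Hypothetical helper: ask the driver whether a format is usable as a
// render-target texture before trying to create one.
static bool SupportsRenderTarget(Format displayFormat, Format targetFormat)
{
    return Manager.CheckDeviceFormat(
        0,                     // adapter 0 (assumed: the default adapter)
        DeviceType.Hardware,
        displayFormat,         // current display mode, e.g. Format.X8R8G8B8
        Usage.RenderTarget,
        ResourceType.Textures,
        targetFormat);         // e.g. Format.A32B32G32R32F
}
```

If this returns false for A32B32G32R32F, the CreateTexture call has no business succeeding in the first place.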

hth
Jack

<hr align="left" width="25%" />
Jack Hoxley <small>[</small><small> Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]</small>

Actually, I have a Mobility Radeon 9700. It's capable of float textures, even as a render target.

I think if it weren't capable, I'd get an error or exception when trying to instantiate it.
Quote:Original post by data2
Actually, I have a Mobility Radeon 9700. It's capable of float textures, even as a render target.

You're probably correct, but unless you enumerate in your code I wouldn't rely on it. I've had hardware (*cough*GeForceFX*cough*) that added/removed features depending on the driver revision. Unless the driver is exposing it to D3D you can't use it.

At the very least, fire up DXCaps and see what it's reporting.

Quote:Original post by data2
I think if it weren't capable, I'd get an error or exception when trying to instantiate it.

Have you verified this against the debug runtimes? Maybe MDX is tighter, but quite a lot of DX calls can 'silently fail' - if you don't check for it in your code, the debug spew is usually the only place it'll show up [smile]

But, that aside... say your hardware does support ARGB32F. You need to describe your error - "I can't see it anymore" doesn't give us much to go on!

Again, check for any errors on drawing calls, verify against the debug runtimes. Try saving the results of your texture to disk and inspecting them in DXTex (or similar).
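Saving the target to disk is one call in MDX via the D3DX texture loader. A sketch, assuming `_renderTexture` is the texture from the original post (the filename is just an example):

```csharp
using Microsoft.DirectX.Direct3D;

// Dump the render target to disk so it can be inspected in DXTex.
// DDS is the sensible choice here: it can hold the raw
// A32B32G32R32F data, where BMP/PNG would clamp it to 8 bits.
TextureLoader.Save("rendertarget.dds", ImageFileFormat.Dds, _renderTexture);
```

If the saved file contains the expected data, the problem is in how the texture is sampled later, not in the pre-render pass.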

hth
Jack

<hr align="left" width="25%" />
Jack Hoxley <small>[</small><small> Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]</small>

One more thing that is easily overlooked - That card doesn't support alphablending on floating point render targets. That is likely where your problem lies.

Generally, use CheckDeviceFormat for your render-target format with D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING as the usage, but the Radeon 9700 and even the Radeon X800 don't support it for floating-point targets.
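That query also goes through Manager.CheckDeviceFormat in MDX. A sketch (the X8R8G8B8 display format is an assumption; substitute your actual mode):

```csharp
using Microsoft.DirectX.Direct3D;

// Ask whether the driver can alpha-blend into a float render target.
// This maps to D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING in native D3D9.
bool canBlend = Manager.CheckDeviceFormat(
    0,                                   // assumed: default adapter
    DeviceType.Hardware,
    Format.X8R8G8B8,                     // assumed display mode format
    Usage.QueryPostPixelShaderBlending,
    ResourceType.Textures,
    Format.A32B32G32R32F);
```

On the R300-class cards discussed in this thread, `canBlend` comes back false for floating-point formats, which is exactly the failure mode described above: the target creates fine, but blending into it silently does nothing useful.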

Good luck!

[edit] Also, my card didn't give me any errors when I tried alpha blending to a floating point target. Quite frustrating really. Motion blur was supposed to be easier with floating point targets, but without alpha blending, you have to do all the blending inside the shaders. <ugh> [/edit]
Quote:Original post by Steven Hansen
One more thing that is easily overlooked - That card doesn't support alphablending on floating point render targets. That is likely where your problem lies.


Actually, I don't have the code next to me right now (different computer), but I think THAT will be the problem! I'm using alpha blending. I have several meshes that render themselves with additive color blending, so AlphaBlendEnable is true!

And, I'll check the caps the next time I'm sitting at the other machine...

The Caps Viewer says that D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING isn't supported :-(

Any suggestions on what to do? Do I have to toggle between two textures as render targets? For each mesh I'd use one as the render target, and the other would be provided as a map in the pixel shader. Then I could mix them together myself. But that's not the best way, is it?
Quote:Do I have to toggle between two textures as render targets? For each mesh I'd use one as the render target, and the other would be provided as a map in the pixel shader. Then I could mix them together myself. But that's not the best way, is it?

Really, that's about all you can do if you must use floating-point render targets and can't upgrade to a newer card. The R300 chip (which the Radeon 9500+ cards are based on) was the first to really support floating-point surfaces. Unfortunately, blending with these surfaces just wasn't ready yet. Since the X800-type cards are also based on the R300 architecture, they don't support blending on floating-point targets either.
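The ping-pong scheme described above can be sketched like this in MDX. Everything here is illustrative: `targetA`/`targetB`, `meshes`, and `DrawMesh` are hypothetical names, and the shader is assumed to read the previous pass from sampler 1 and add the new mesh's contribution itself:

```csharp
using Microsoft.DirectX.Direct3D;

// Two A32B32G32R32F render-target textures, swapped each pass.
Texture[] targets = { targetA, targetB };
int current = 0;

foreach (Mesh mesh in meshes)
{
    int previous = 1 - current;

    // Render into one target while sampling the other.
    device.SetRenderTarget(0, targets[current].GetSurfaceLevel(0));
    device.SetTexture(1, targets[previous]); // shader adds this in manually

    DrawMesh(mesh); // hypothetical: sets shaders/streams and draws

    current = previous; // swap roles for the next mesh
}
```

It costs an extra texture fetch per pixel and a render-target switch per mesh, but it sidesteps the missing post-pixel-shader blending entirely.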

neneboricua
Code lines were too wide. Replaced code tags with source tags.

This topic is closed to new replies.
