karwosts

OpenGL D3D9: Can't create a depth texture as rendertarget


I'm trying to create a depth texture to eventually use as a render target, but at the moment I can't even create a texture with any of the D3DFMT_D* texture formats. I have a reasonably recent card (an nvidia 8800, which supports DX10), so I wouldn't have thought it would be unsupported, and I can do the same thing in OpenGL without any problem.

Is there any reason why this call should fail?

_d3dDevice->CreateTexture(512, 512, 1, D3DUSAGE_RENDERTARGET, D3DFMT_D24X8, D3DPOOL_DEFAULT, &_d3dTexture, NULL);

Direct3D9: (ERROR) :Invalid format specified for texture
Direct3D9: (ERROR) :Failure trying to create a texture

(I have also tried a usage of D3DUSAGE_DEPTHSTENCIL | D3DUSAGE_RENDERTARGET, with the same result.)

It works if I change the format to D3DFMT_A8R8G8B8, but D3DFMT_D24S8, D3DFMT_D32, D3DFMT_D16, etc. all fail with "Invalid format". Is it just not legal to create a depth-texture render target? I know there are limitations on sampling one in the pixel shader, but I thought you could at least render to one.

I'm stumped, any ideas?

I thought about checking it with CheckDeviceFormat, but to be honest I don't really understand the parameters. I don't understand the distinction between AdapterFormat and CheckFormat; maybe I just don't understand what the function is trying to achieve, so I could be wrong.
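For the record, AdapterFormat is the format of the adapter's current display mode (what the desktop/back buffer is in), while CheckFormat is the resource format you want to test against it. A minimal sketch of probing depth formats this way (the `gD3D9` pointer and the candidate list are illustrative assumptions, not from the post):

```cpp
// Sketch: probe which depth formats the HAL device supports as
// depth-stencil *textures* (D3DUSAGE_DEPTHSTENCIL + D3DRTYPE_TEXTURE).
// AdapterFormat = the current display-mode format; CheckFormat = the
// format under test. Assumes gD3D9 is a valid IDirect3D9*.
#include <d3d9.h>
#include <cstdio>

void ProbeDepthTextureFormats(IDirect3D9* gD3D9)
{
    D3DDISPLAYMODE mode;
    gD3D9->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

    const D3DFORMAT candidates[] = { D3DFMT_D16, D3DFMT_D24X8,
                                     D3DFMT_D24S8, D3DFMT_D32 };
    for (D3DFORMAT fmt : candidates)
    {
        HRESULT hr = gD3D9->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            mode.Format,            // AdapterFormat: the display mode
            D3DUSAGE_DEPTHSTENCIL,  // note: NOT D3DUSAGE_RENDERTARGET
            D3DRTYPE_TEXTURE,       // a texture, not just a surface
            fmt);                   // CheckFormat: the format under test
        printf("format %d: %s\n", (int)fmt,
               SUCCEEDED(hr) ? "supported" : "unsupported");
    }
}
```

Note the usage flag: depth formats are created with D3DUSAGE_DEPTHSTENCIL, not D3DUSAGE_RENDERTARGET, which is consistent with the "Invalid format" error above.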

If it's not possible to render to a depth texture, is it possible to reasonably quickly blit a depth surface to a D3D9 texture each frame? I'm hoping to avoid having to rewrite all my shaders to support MRT just to get the depth information from a scene (as in writing depth to a generic color channel as well as the depth buffer).

Reply: here are the relevant bits from my code.

#define FOURCC_NULL ((D3DFORMAT)(MAKEFOURCC('N','U','L','L')))
...
LPDIRECT3DTEXTURE9 gDSTex = 0;
LPDIRECT3DSURFACE9 gDSSurf = 0;
LPDIRECT3DSURFACE9 gDummySurf = 0;
...
hr = gD3D9->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &DisplayMode);
...
D3DFORMAT zFormat = D3DFMT_D24X8;
hr = gD3D9->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, DisplayMode.Format, D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, FOURCC_NULL);
if(FAILED(hr))
{
//ADD YOUR ERROR MSG BOX
return false;
}

hr = gD3D9->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, DisplayMode.Format, FOURCC_NULL, zFormat);
if(FAILED(hr))
{
//ADD YOUR ERROR MSG BOX
return false;
}
hr = gD3D9Device->CreateRenderTarget(SMAP_SIZE, SMAP_SIZE, FOURCC_NULL, D3DMULTISAMPLE_NONE, 0, FALSE, &gDummySurf, 0);
if(FAILED(hr))
{
//ADD YOUR ERROR MSG BOX
return false;
}
hr = gD3D9Device->CreateTexture(SMAP_SIZE, SMAP_SIZE, 1, D3DUSAGE_DEPTHSTENCIL, zFormat, D3DPOOL_DEFAULT, &gDSTex, 0);
if(FAILED(hr))
{
//ADD YOUR ERROR MSG BOX
return false;
}
hr = gDSTex->GetSurfaceLevel(0, &gDSSurf);
if(FAILED(hr))
{
//ADD YOUR ERROR MSG BOX
return false;
}
...
hr = gD3D9Device->SetRenderTarget(0, gDummySurf);
hr = gD3D9Device->SetDepthStencilSurface(gDSSurf);
hr = gD3D9Device->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0x00FFFFFF, 1.0f, 0);
DrawDepth();

...
//and then later you set gDSTex as texture in shader and sample from it
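That last step is vendor-dependent: on NVIDIA D3D9 hardware, sampling a bound depth-format texture with projective coordinates triggers hardware shadow mapping, so you get a PCF depth-comparison result (0..1) rather than the raw depth value. A hypothetical HLSL fragment (names are illustrative, not from the code above):

```
// Hypothetical sketch: gDSTex bound to sampler s0.
// On NVIDIA D3D9 hardware, tex2Dproj on a depth-format texture performs
// a depth comparison (z/w vs. stored depth), returning a shadow factor.
sampler2D gShadowMap : register(s0);

float4 ps_main(float4 shadowCoord : TEXCOORD0) : COLOR0
{
    // shadowCoord = position projected into light space
    float lit = tex2Dproj(gShadowMap, shadowCoord).r;
    return float4(lit.xxx, 1.0f);
}
```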


and some docs to read.

Depth buffer readback as a texture isn't natively supported by D3D9. You can do it with vendor-specific "extensions". See this for ATI and this for Nvidia.
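These "extensions" are exposed as FOURCC formats. On most DX10-class GPUs (GeForce 8 series and up, and recent Radeons), the INTZ format lets you create a depth-stencil texture whose contents can later be sampled in a shader. A hedged sketch, reusing the style of the code above (`gD3D9`/`dev` are assumed valid; support must still be checked per device):

```cpp
// Sketch: create a sampleable depth texture via the INTZ FOURCC hack.
// INTZ is a vendor extension, not a core D3D9 format, so always check
// support with CheckDeviceFormat first.
#include <d3d9.h>

#define FOURCC_INTZ ((D3DFORMAT)(MAKEFOURCC('I','N','T','Z')))

bool CreateReadableDepthTexture(IDirect3D9* gD3D9, IDirect3DDevice9* dev,
                                UINT size, IDirect3DTexture9** outTex)
{
    D3DDISPLAYMODE mode;
    gD3D9->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

    // Is the hack format supported on this adapter at all?
    if (FAILED(gD3D9->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        mode.Format, D3DUSAGE_DEPTHSTENCIL,
                                        D3DRTYPE_TEXTURE, FOURCC_INTZ)))
        return false;

    // Created exactly like the depth texture in the reply above,
    // just with the FOURCC format in place of D3DFMT_D24X8.
    return SUCCEEDED(dev->CreateTexture(size, size, 1, D3DUSAGE_DEPTHSTENCIL,
                                        FOURCC_INTZ, D3DPOOL_DEFAULT,
                                        outTex, 0));
}
```

Bind the level-0 surface as the depth-stencil surface while rendering, then set the texture on a sampler and read raw depth from it in the shader.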
