32-bit Depth Buffer in DirectX 6 & 7
Hello,
I am new to DirectX and am trying to get a 32-bit depth buffer. I am using the EnumZBufferFormats method to find an appropriate Z-buffer format, but it always gives me a 16-bit Z-buffer. How can I get a 32-bit Z-buffer?
On the other hand, I am able to get a 32-bit Z-buffer using the OpenGL renderer API on the same machine.
Any help would be appreciated.
Thanks and Regards
Vinod Patel
Can we see some code? And what graphics card do you have? Most graphics cards don't support 32-bit depth buffers in D3D, only 24-bit + 8-bit stencil. It's been a long time since I used DX7, but I'm sure I never had any problems with using 24-bit depth buffers.
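If you do want to try for more than 16 bits, something like this would grab the first format with at least 24 bits of real depth (just a sketch; Enum24BitZCallback is an illustrative name). Remember that dwZBufferBitDepth includes any stencil bits, so subtract dwStencilBitDepth when DDPF_STENCILBUFFER is set:
// Sketch: accept the first enumerated format with >= 24 bits of depth.
HRESULT WINAPI Enum24BitZCallback( DDPIXELFORMAT* pddpf, VOID* pContext )
{
    if( pddpf->dwFlags & DDPF_ZBUFFER )
    {
        // dwZBufferBitDepth is the total depth, including any stencil bits
        DWORD stencilBits = (pddpf->dwFlags & DDPF_STENCILBUFFER)
                          ? pddpf->dwStencilBitDepth : 0;
        if( pddpf->dwZBufferBitDepth - stencilBits >= 24 )
        {
            memcpy( pContext, pddpf, sizeof(DDPIXELFORMAT) );
            return D3DENUMRET_CANCEL;
        }
    }
    return D3DENUMRET_OK;
}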
Hello,
Here is a snippet of the code I have written to pick a Z-buffer format from the callback:
// We take the best z buffer depth
HRESULT WINAPI EnumZBufferCallback( DDPIXELFORMAT* pddpf, VOID* _ok )
{
    DDPIXELFORMAT *ok = (DDPIXELFORMAT*) _ok;
    if( pddpf->dwFlags == DDPF_ZBUFFER )
    {
        memcpy( _ok, pddpf, sizeof(DDPIXELFORMAT) );
        return D3DENUMRET_CANCEL;
    }
    return D3DENUMRET_OK;
}
and I am calling the above callback with the code below:
DDPIXELFORMAT ddpfZBuffer;
memset( &ddpfZBuffer, 0, sizeof(ddpfZBuffer) );
m_Direct3D->EnumZBufferFormats( *curDeviceDUID, EnumZBufferCallback, (VOID*)&ddpfZBuffer );
Thanks In Advance
That code doesn't look right to me. The flags field is a bitmask, so if it contains the Z-buffer bit plus something else (DDPF_STENCILBUFFER, for example), your equality test fails and you skip over the format.
Try this instead:
HRESULT WINAPI EnumZBufferCallback( DDPIXELFORMAT* pddpf, VOID* _ok )
{
    DDPIXELFORMAT *ok = (DDPIXELFORMAT*) _ok;
    if( pddpf->dwFlags & DDPF_ZBUFFER )
    {
        memcpy( _ok, pddpf, sizeof(DDPIXELFORMAT) );
        return D3DENUMRET_CANCEL;
    }
    return D3DENUMRET_OK;
}
Sadly, just because OpenGL can do it doesn't mean Direct3D must be able to do it. If your enumeration code is correct and it only reports D16 formats then that's all you can use. You can't force it to use a format that it doesn't support even if you're sure that the hardware is actually capable.
Run some other DX6/DX7 sample code if you can. See if their code enumerates modes differently to yours - that way you can be sure your code is correct and it is in fact the driver that is saying 'no' and not just a bug in your code...
hth
Jack
Thanks, Evil Steve.
It worked! Now my method looks like this:
HRESULT WINAPI EnumZBufferCallback( DDPIXELFORMAT* pddpf, VOID* _ok )
{
    DDPIXELFORMAT *ok = (DDPIXELFORMAT*) _ok;
    if( pddpf->dwFlags & DDPF_ZBUFFER )
    {
        // Keep whichever enumerated format has the greatest bit depth
        if( ok && (ok->dwZBufferBitDepth < pddpf->dwZBufferBitDepth) )
        {
            memcpy( _ok, pddpf, sizeof(DDPIXELFORMAT) );
        }
    }
    return D3DENUMRET_OK;
}
Is it correct now?
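For completeness, here is roughly how the chosen format then gets used to create and attach the depth buffer (a sketch of the standard DX7 pattern; m_DirectDraw, m_BackBuffer, width and height are placeholder names):
// Create a Z-buffer surface with the enumerated pixel format and
// attach it to the back buffer.
DDSURFACEDESC2 ddsd;
ZeroMemory( &ddsd, sizeof(ddsd) );
ddsd.dwSize          = sizeof(ddsd);
ddsd.dwFlags         = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
ddsd.ddsCaps.dwCaps  = DDSCAPS_ZBUFFER | DDSCAPS_VIDEOMEMORY;
ddsd.dwWidth         = width;
ddsd.dwHeight        = height;
ddsd.ddpfPixelFormat = ddpfZBuffer;   // the format picked by the callback

LPDIRECTDRAWSURFACE7 pZBuffer = NULL;
if( SUCCEEDED( m_DirectDraw->CreateSurface( &ddsd, &pZBuffer, NULL ) ) )
    m_BackBuffer->AddAttachedSurface( pZBuffer );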
Quote: Original post by vinodpatel2006: Is it correct now?
Looks fine to me. Although remember that you can't use every Z-buffer format with every backbuffer format. I can't remember the process for checking compatibility in D3D7, though.
A little off topic, but why are you using D3D7? If you're targeting any OS from the past 10 years, it'll have D3D9 support...
Hello,
I have an application that was originally written against DirectX 6.0/7.0. After many years, a Z-buffer problem has appeared. The problem is minor and only shows up in some of the animations, but the client wants it fixed.
But please tell me: will the backbuffer cause a problem if the Z-buffer format isn't compatible with it?
Thanks
Quote: Original post by vinodpatel2006: Will the backbuffer cause a problem if the Z-buffer format isn't compatible with it?
In D3D9 at least, the backbuffer depth usually has to match the Z-buffer depth, so you need to check that the depth buffer is compatible with the backbuffer. In D3D9, that's done with IDirect3D9::CheckDepthStencilMatch and similar functions. I've no idea about D3D7 or earlier.
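For example (a sketch; the X8R8G8B8 and D24S8 formats are just illustrative, and pD3D9 is a placeholder IDirect3D9 pointer):
// Ask D3D9 whether a 24-bit depth / 8-bit stencil format can be used
// with an X8R8G8B8 backbuffer on the default adapter.
HRESULT hr = pD3D9->CheckDepthStencilMatch(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    D3DFMT_X8R8G8B8,   // adapter (display) format
    D3DFMT_X8R8G8B8,   // render target format
    D3DFMT_D24S8 );    // depth/stencil format to test
if( SUCCEEDED(hr) )
{
    // This combination of formats is supported.
}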
If the Z-buffer and backbuffer aren't compatible, you won't be able to create the device.