
elis-cool

This is not working :-(


This is saying that my card doesn't have 16-bit depth buffer support, when I'm pretty sure it does...
      
// Check that a 16-bit (D3DFMT_D16) depth buffer can be created against
// an A8R8G8B8 adapter format on the default HAL adapter.
if(FAILED(pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A8R8G8B8,
                                  D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE, D3DFMT_D16)))
{
	MessageBox(hMainWnd, "16 bit depth buffer unsupported. Engine cannot function without this.", "Volatile Engine: Error", MB_ICONERROR|MB_OK);
	pCSystem->Log("ERROR: 16 bit depth buffer unsupported.");
	pCSystem->Log("Engine cannot function without this.");
	return false;
}
else
	pCSystem->Log("CAPS: 16 bit depth buffer supported.");
edit: fixed comment thing [edited by - elis-cool on December 7, 2002 2:04:19 AM]

The error is that you are using D3DFMT_D16, which specifies a 16-bit depth buffer, not a stencil buffer. You need D3DFMT_D15S1, which specifies a 16-bit buffer, 15 bits for depth and 1 bit for stencil.

~CGameProgrammer( );

EDIT: But niyaw is right that you should use X8R8G8B8 instead of A8R8G8B8; many cards do not support the latter. 32-bit color depth usually doesn't use the highest byte, but the reason to use that mode over 24-bit mode is that 32-bit pixels are DWORD-aligned, so working with them is faster if you have enough video memory.
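For illustration, a minimal sketch of the checks being discussed, run against an X8R8G8B8 adapter format instead of A8R8G8B8. The pd3d pointer and HAL/default-adapter parameters follow the original post; the ProbeDepthFormats wrapper and the IDirect3D8 interface are assumptions, not from the thread (the D3D9 call has the same signature):

// Sketch: probe 16-bit depth formats against an X8R8G8B8 adapter format.
bool ProbeDepthFormats(IDirect3D8* pd3d)
{
    // Plain 16-bit depth, no stencil.
    HRESULT hrD16 = pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                                            D3DRTYPE_SURFACE, D3DFMT_D16);

    // 15-bit depth plus 1-bit stencil, as suggested above.
    HRESULT hrD15S1 = pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                              D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                                              D3DRTYPE_SURFACE, D3DFMT_D15S1);

    return SUCCEEDED(hrD16) || SUCCEEDED(hrD15S1);
}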

[edited by - CGameProgrammer on December 7, 2002 1:09:42 AM]

And BTW, what you are trying to set (D3DFMT_A8R8G8B8) is 32 bits.

16-bit is D3DFMT_R5G6B5.
If you want 32 bits, use D3DFMT_X8R8G8B8.

Some video cards do not support 32-bit color for 3D, such as 3dfx and Intel cards, but all support 16-bit.
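As a rough sketch of that fallback (the ChooseBackBufferFormat name and the fullscreen assumption are illustrative, not from the thread):

// Sketch: prefer a 32-bit back buffer, fall back to 16-bit if the HAL can't do it.
// Assumes a fullscreen mode where the display format matches the back buffer format.
D3DFORMAT ChooseBackBufferFormat(IDirect3D8* pd3d)
{
    if(SUCCEEDED(pd3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                       D3DFMT_X8R8G8B8, D3DFMT_X8R8G8B8, FALSE)))
        return D3DFMT_X8R8G8B8;   // 32-bit color

    return D3DFMT_R5G6B5;         // 16-bit color; per the post above, everything supports this
}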

______________________________
Oooh, you found the horadric cube!

quote:
Original post by billybob
uhh, the check is commented off. that would make it display the error whether your card supported it or not

It's not like that in the actual code; it just got mangled by the formatting in here.

CGameProgrammer: I don't want stencil, but you have to use D3DUSAGE_DEPTHSTENCIL if you're checking for either depth or stencil support; the only other option is D3DUSAGE_RENDERTARGET.

Coincoin & niyaw: But I need 32-bit color for alpha textures and stuff... I thought the depth buffer bit count was completely unrelated to the render target bit depth...


Surely you need alpha for textures, but you don't (generally) need it in the back buffer.
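To illustrate that distinction (a sketch only; the SupportsAlphaTextures name and the X8R8G8B8 adapter format are assumptions, not from the thread), alpha can be checked as a property of the texture format while the back buffer stays alpha-less:

// Sketch: alpha-capable textures can be checked independently of the back buffer.
bool SupportsAlphaTextures(IDirect3D8* pd3d)
{
    // Usage 0 and D3DRTYPE_TEXTURE: this asks about texture support only,
    // so the render target can stay X8R8G8B8 (no alpha channel).
    return SUCCEEDED(pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                             D3DFMT_X8R8G8B8, 0,
                                             D3DRTYPE_TEXTURE, D3DFMT_A8R8G8B8));
}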

edit: what makes you think the call is failing because of an invalid depth/stencil format? What does the debug runtime say?

[edited by - niyaw on December 7, 2002 2:20:19 AM]

quote:
Original post by CGameProgrammer
the depth buffer format is unrelated to the screen bit depth.


However, NVIDIA recommends matching sizes of the render target and depth/stencil buffer for optimal performance.

[edited by - niyaw on December 7, 2002 7:10:26 AM]

quote:
However, NVIDIA recommends matching sizes of the render target and depth/stencil buffer for optimal performance


They actually *require* the bit depths to be the same; every chip since the TNT has had that requirement. Newer drivers have an option to force the depths to be the same if they don't match, but this might be set to off on some people's machines...

They also *recommend* that the width & height of the depth buffer for render targets be the same as the target width & height. Other IHVs recommend the same.


The Z-buffer depth == frame buffer depth requirement is the reason for the CheckDepthStencilMatch() function in the API. CheckDeviceFormat() only tells you whether a particular format is available on the chip, not that it will actually work. Some drivers may also perform frame buffer depth checks in there as another place to trap bad combinations.

Take a look at how the sample framework handles this.
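A minimal sketch of that two-step check, reusing the pd3d name from the original post (the DepthFormatMatches wrapper and the IDirect3D8 interface are assumptions; the D3D9 calls have the same signatures):

// Sketch: verify the depth/stencil format actually works with the chosen
// adapter and render-target formats, not just that the chip exposes it.
bool DepthFormatMatches(IDirect3D8* pd3d, D3DFORMAT adapterFmt,
                        D3DFORMAT renderTargetFmt, D3DFORMAT depthFmt)
{
    // First: is the depth format available on this device at all?
    if(FAILED(pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, adapterFmt,
                                      D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE, depthFmt)))
        return false;

    // Second: is it compatible with this particular render-target format?
    return SUCCEEDED(pd3d->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                  adapterFmt, renderTargetFmt, depthFmt));
}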


--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

