
This is not working :-(


This is saying that my card doesn't have 16-bit depth buffer support, when I'm pretty sure it does...

// 16 bit depth buffer
if(FAILED(pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A8R8G8B8,
                                  D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE, D3DFMT_D16)))
{
    MessageBox(hMainWnd, "16 bit depth buffer unsupported. Engine cannot function without this.", "Volatile Engine: Error", MB_ICONERROR|MB_OK);
    pCSystem->Log("ERROR: 16 bit depth buffer unsupported.");
    pCSystem->Log("Engine cannot function without this.");
    return false;
}
else pCSystem->Log("CAPS: 16 bit depth buffer supported.");
Edit: fixed the comment formatting. [edited by - elis-cool on December 7, 2002 2:04:19 AM]

Uhh, the check is commented out. That would make it display the error whether your card supports it or not.

The error is that you are using D3DFMT_D16, which specifies a 16-bit depth buffer, not a stencil buffer. You need D3DFMT_D15S1, which specifies a 16-bit buffer, 15 bits for depth and 1 bit for stencil.

~CGameProgrammer( );

EDIT: But niyaw is right that you should use X8R8G8B8 instead of A8R8G8B8; many cards do not support the latter. 32-bit color depth usually doesn't use the highest byte, but the reason to use that mode over 24-bit mode is that 32-bit mode is DWORD-aligned, so working with it is faster if you have enough video memory.

[edited by - CGameProgrammer on December 7, 2002 1:09:42 AM]
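
(For illustration only: a minimal sketch, assuming the D3D8 interface and reusing the pd3d pointer from the snippet above, that probes both 16-bit depth formats against an X8R8G8B8 display mode. Check16BitDepthFormats is a hypothetical helper name, not code from any post.)

#include <d3d8.h>

// Probe both 16-bit depth formats against an X8R8G8B8 display mode.
bool Check16BitDepthFormats(IDirect3D8* pd3d)
{
    // Plain 16-bit depth, no stencil.
    HRESULT hrD16 = pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                                            D3DRTYPE_SURFACE, D3DFMT_D16);

    // 15 bits of depth plus 1 bit of stencil, as suggested above.
    HRESULT hrD15S1 = pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                              D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                                              D3DRTYPE_SURFACE, D3DFMT_D15S1);

    // Either one is enough for a 16-bit depth buffer.
    return SUCCEEDED(hrD16) || SUCCEEDED(hrD15S1);
}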

And BTW, what you are trying to set is 32 bits.

16-bit is D3DFMT_R5G6B5.
If you want 32 bits, use D3DFMT_X8R8G8B8.

Some video cards do not support 32 bits for 3D, such as 3DFX and Intel cards, but all support 16 bits.

______________________________
Oooh, you found the horadric cube!

quote:
Original post by billybob
Uhh, the check is commented out. That would make it display the error whether your card supports it or not.

It's not like that in the actual code; it just got mangled by the formatting in here.

CGameProgrammer: I don't want stencil, but you have to use D3DUSAGE_DEPTHSTENCIL if you're checking for either depth or stencil support; the only other option is D3DUSAGE_RENDERTARGET.

Coincoin & niyaw: But I need 32-bit color for things like alpha textures... I thought the depth buffer bit count was completely unrelated to the render target bit depth...


Surely you need alpha for textures, but you don't (generally) need it in the backbuffer.
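
(A minimal sketch of that split, assuming the D3D8 interface; pd3d and CheckAlphaTextureSupport are hypothetical names here, not from the post. The alpha channel lives in the texture format, checked against an X8R8G8B8 display mode:)

#include <d3d8.h>

// Check for A8R8G8B8 *textures* while the display/backbuffer stays X8R8G8B8.
bool CheckAlphaTextureSupport(IDirect3D8* pd3d)
{
    return SUCCEEDED(pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                             D3DFMT_X8R8G8B8,    // display/adapter format
                                             0,                  // no special usage
                                             D3DRTYPE_TEXTURE,
                                             D3DFMT_A8R8G8B8));  // texture format with alpha
}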

Edit: what makes you think the call is failing because of an invalid depth/stencil format? What does the debug runtime say?

[edited by - niyaw on December 7, 2002 2:20:19 AM]

quote:
Original post by CGameProgrammer
the depth buffer format is unrelated to the screen bit depth.


However, NVIDIA recommends matching the sizes of the render target and depth/stencil buffer for optimal performance.

[edited by - niyaw on December 7, 2002 7:10:26 AM]

quote:
However, NVIDIA recommends matching the sizes of the render target and depth/stencil buffer for optimal performance


They actually *require* the bit depths to be the same; every chip since the TNT has had that requirement. Newer drivers have an option to force the depths to be the same if they don't match, but this might be set to off on some people's machines...

They also *recommend* that the width & height of the depth buffer for render targets be the same as the target width & height. Other IHVs recommend the same.


The Z-buffer depth == frame buffer depth requirement is the reason for the CheckDepthStencilMatch() function in the API. CheckDeviceFormat() only tells you whether a particular format is available on the chip, not that it will actually work. Some drivers may also perform frame buffer depth checks in there too, as another place to trap bad combinations.
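
(A rough sketch of that two-stage check, assuming the D3D8 interface; CheckDepthFormat and pd3d are hypothetical names, and this is not the sample framework's actual code:)

#include <d3d8.h>

// First ask whether the depth format exists at all, then ask whether this
// particular render target / depth-stencil combination is actually allowed.
bool CheckDepthFormat(IDirect3D8* pd3d, D3DFORMAT adapterFmt,
                      D3DFORMAT renderTargetFmt, D3DFORMAT depthFmt)
{
    if(FAILED(pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      adapterFmt, D3DUSAGE_DEPTHSTENCIL,
                                      D3DRTYPE_SURFACE, depthFmt)))
        return false;   // format not available on this chip at all

    // The format exists, but will it work with this render target format?
    return SUCCEEDED(pd3d->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                  adapterFmt, renderTargetFmt, depthFmt));
}

e.g. CheckDepthFormat(pd3d, D3DFMT_X8R8G8B8, D3DFMT_X8R8G8B8, D3DFMT_D16) to test a 16-bit depth buffer against a 32-bit frame buffer.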

Take a look at how the sample framework handles this.


--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

That's the driver stuff I was talking about. The sample asked D3D for a 16-bit depth buffer and a 32-bit frame buffer. D3D then passed that request to the driver, which internally changed the depth format because it didn't match! D3D just got an OK code back from the driver, so it is none the wiser.

Check the documentation at the nVidia site or chat to your friendly devrel rep if you don't believe me.

[http://www.nvidia.com/view.asp?IO=pacman - it was certainly the case when I wrote the engine for that back in 1999, and still is as far as I'm aware]

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

OT:

quote:
Original post by S1CA
[http://www.nvidia.com/view.asp?IO=pacman - it was certainly the case when I wrote the engine for that back in 1999, and still is as far as I'm aware]



It was a lot of fun to play, BTW.


quote:
It was a lot of fun to play, BTW.


Glad you enjoyed it. Thanks!

Just a shame it didn't get the European marketing it deserved. Hopefully Infogrames will see sense and re-release it at a budget price.

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

OK, thanks guys. So I was now wondering: what is the minimum-spec video card that has 32-bit depth buffer support? Because I know my nVidia Vanta 8MB doesn't, so it would be good to know what cards will be the minimum I need to develop for...

quote:
Original post by elis-cool
OK, thanks guys. So I was now wondering: what is the minimum-spec video card that has 32-bit depth buffer support? Because I know my nVidia Vanta 8MB doesn't, so it would be good to know what cards will be the minimum I need to develop for...


I'm sure the TNT Vanta *does* support 32-bit depth buffers, just with the restriction that the frame buffer must also be 32-bit.

BTW: when I say 32-bit depth buffer I actually mean depth + stencil = 32.

Actually, looking at your code snippet again, I think I see why that call fails, as Coincoin mentioned: D3DFMT_A8R8G8B8.

Most older chips DO NOT support alpha channels in the frame buffer - use the caps viewer which comes with the SDK to check which frame buffer (/back buffer) formats your graphics card ACTUALLY supports. I seriously doubt D3DFMT_A8R8G8B8 is one of them (very early nVidia unified drivers used to report that wrongly, though). D3DFMT_X8R8G8B8 is much more likely to be supported.

Of course, with 32-bit modes you need to be sure there is actually the video memory to support what you want to do, and that the doubled memory usage doesn't kill your framerate.
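
(A rough sketch of that kind of check in code, assuming the D3D8 interface; PickBackBufferFormat, pd3d and the windowed flag are hypothetical names, and the caps viewer remains the easy way to eyeball what your card reports:)

#include <d3d8.h>

// Prefer a 32-bit X8R8G8B8 backbuffer; fall back to 16-bit R5G6B5.
// CheckDeviceType validates the display format / backbuffer format pair.
D3DFORMAT PickBackBufferFormat(IDirect3D8* pd3d, BOOL windowed)
{
    if(SUCCEEDED(pd3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                       D3DFMT_X8R8G8B8, D3DFMT_X8R8G8B8, windowed)))
        return D3DFMT_X8R8G8B8;

    // 16-bit is supported practically everywhere.
    return D3DFMT_R5G6B5;
}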

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

OK, so I checked the caps... it doesn't support D3DFMT_A8R8G8B8...
But I know other games and stuff I've played have used alpha textures for sprites and stuff... what the hell??

I think there is a 16-bit A4R4G4B4 format or something like that, and a 16-bit A1R5G5B5 one too. I'm not sure about the 4444 one, though.
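
(The same CheckDeviceFormat probe settles it; a sketch assuming the D3D8 interface, with Check16BitAlphaTextures and pd3d as hypothetical names:)

#include <d3d8.h>

// Probe the two 16-bit alpha texture formats against an X8R8G8B8 display mode.
bool Check16BitAlphaTextures(IDirect3D8* pd3d)
{
    HRESULT hr4444 = pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                             D3DFMT_X8R8G8B8, 0,
                                             D3DRTYPE_TEXTURE, D3DFMT_A4R4G4B4);
    HRESULT hr1555 = pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                             D3DFMT_X8R8G8B8, 0,
                                             D3DRTYPE_TEXTURE, D3DFMT_A1R5G5B5);
    // True only if both formats are available.
    return SUCCEEDED(hr4444) && SUCCEEDED(hr1555);
}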

quote:

16-bit depth buffer support



Oops... sorry, I thought you were trying to get a 16-bit color buffer; I didn't read it correctly.

______________________________
Oooh, you found the horadric cube!

quote:
Original post by elis-cool
OK, so I checked the caps... it doesn't support D3DFMT_A8R8G8B8...
But I know other games and stuff I've played have used alpha textures for sprites and stuff... what the hell??

You use alpha in the texture format, but not in the backbuffer format.

Hmm, but doesn't the backbuffer need alpha so you can blend colors? If not, then what's the point of alpha in the backbuffer?

quote:
Original post by elis-cool
Hmm, but doesn't the backbuffer need alpha so you can blend colors?


No. The common SRCALPHA/INVSRCALPHA blending, for instance, only requires alpha to be present in the texture. ONE/ONE additive blending doesn't use destination alpha either.
quote:

If not, then what's the point of alpha in the backbuffer?

There is none.

Unless, of course, you're doing something really special.
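
(A minimal sketch of that common setup, assuming a D3D8 device; pDevice and EnableAlphaBlending are hypothetical names, not from the post:)

#include <d3d8.h>

// Standard SRCALPHA/INVSRCALPHA blending: only the source (texture) alpha
// is read, so the backbuffer's own alpha channel is never needed.
void EnableAlphaBlending(IDirect3DDevice8* pDevice)
{
    pDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    pDevice->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
    pDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}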
