Vertex Shader vs Graphics Driver


Hi, I'm having trouble getting my app to run with a specific graphics driver. I wouldn't normally be worried about this, it's happened before, but this driver is WHQL certified, which means it should be perfect (right?). Anyway, the D3D8 caps state that the vertex shader support is less than version 1.0, while the previous driver reported v2.0. The graphics card is an NVIDIA GeForce2, driver v21.32. Here is the initialisation code ...
	D3DPRESENT_PARAMETERS d3dpp;
	ZeroMemory(&d3dpp, sizeof(d3dpp));

	d3dpp.BackBufferWidth				= 640;
	d3dpp.BackBufferHeight				= 480;
	d3dpp.BackBufferFormat				= d3ddm.Format;
	d3dpp.BackBufferCount				= 0;
	d3dpp.MultiSampleType				= D3DMULTISAMPLE_NONE;
	d3dpp.SwapEffect					= D3DSWAPEFFECT_FLIP;
	d3dpp.hDeviceWindow					= g_hWnd;
	d3dpp.Windowed						= false;
	d3dpp.AutoDepthStencilFormat		= D3DFMT_D16;
	d3dpp.EnableAutoDepthStencil		= true;
	d3dpp.FullScreen_RefreshRateInHz	= D3DPRESENT_RATE_DEFAULT;
	d3dpp.FullScreen_PresentationInterval = D3DPRESENT_INTERVAL_DEFAULT;

	// CreateDevice arguments restored to a typical call (they were cut off above)
	if (FAILED(g_pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, g_hWnd,
	                                D3DCREATE_HARDWARE_VERTEXPROCESSING,
	                                &d3dpp, &g_pDevice)))
	{
		Log(szFun, "FAILED CreateDevice()");
		return false;
	}
	else Log(szFun, "SUCCESS CreateDevice()");

Also, I could only get a 16-bit depth buffer. Methinks NVIDIA is moving the goalposts?? Thanks, TLS

1. The GeForce2 **DOES NOT** support vertex shaders in hardware (and never has) - you must use software vertex processing or mixed vertex processing if you want to use shaders on a GF2.

I've never seen a GeForce driver report shader version 2.0 - ever! The old ones around the time of the DX8 beta used to report 0.5, but that was related to something which got dropped from the DX8 feature set (partial shaders).

2.0 vertex shaders will be for DirectX 9 and above only - and considering the beta for that hasn't even fully started yet, they seem unlikely to be in production drivers at the moment. Unless you've seen this in *PRE-PRODUCTION*, *leaked* drivers.

2. All nVidia chips since the TNT have a well-documented hardware restriction:

The depth in bits of the depth buffer plus the stencil buffer **MUST** be the same as the depth of the frame buffer or render target.

I assume that d3ddm.Format is the format of the desktop - your desktop is probably in 16bit.

To test for this you should be using CheckDepthStencilMatch() and CheckDeviceFormat(). It was for cards such as this that these calls were added to the API.

BTW: the debug spew from the debug runtime should have told you about that restriction too.

Simon O'Connor
Creative Asylum Ltd

Thanks very much Simon,

You are of course correct

I have been away for a while and had forgotten the VS version numbers - I think it's v1.1(?) now.

Also correct about the depth buffer.

I'm about as close to pre-release drivers as I am to Alpha Centauri.

Now if I can just work out how to synchronize the exact time between two networked computers, I'll be moving...


