
Index Buffer and Rendering question


Dookie    290
Hey guys, here's another one that's got me scratching a bald spot on my head. First of all, here's how I'm creating my Direct3D device:
if( FAILED( g_pD3D->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
		D3DCREATE_HARDWARE_VERTEXPROCESSING,
		&d3dpp, &g_pd3dDevice)))
{
	return;
}
or
if( FAILED( g_pD3D->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
		D3DCREATE_SOFTWARE_VERTEXPROCESSING,
		&d3dpp, &g_pd3dDevice)))
{
	return;
}
(The only difference between the two is the vertex-processing flag: D3DCREATE_HARDWARE_VERTEXPROCESSING versus D3DCREATE_SOFTWARE_VERTEXPROCESSING.) I'm using this to create a vertex buffer and an index buffer for my 2D game:
if( FAILED( g_pd3dDevice->CreateVertexBuffer( (SP_END*4)*sizeof(CUSTOMVERTEX),
			0, D3DFVF_CUSTOMVERTEX,
			D3DPOOL_MANAGED, &g_QuadsVB, NULL ) ) )
{
	return E_FAIL;
}
if( FAILED( g_pd3dDevice->CreateIndexBuffer ((SP_END*2*3)*sizeof(WORD), 
			DX_VertexHandlingMode, D3DFMT_INDEX16,
			D3DPOOL_MANAGED, &g_QuadsIB, NULL)))
{
	return E_FAIL;
}
g_pd3dDevice->SetIndices(g_QuadsIB);
FillIndexBuffer();
if( FAILED( g_QuadsVB->Lock( 0, sizeof(quadVerts), (void**)&pVertices, 0 ) ) )
{
	return E_FAIL;
}
memcpy( pVertices, quadVerts, sizeof(quadVerts) );
g_QuadsVB->Unlock();

where SP_END is the last quad in the game (1100). The FillIndexBuffer() function called in the code above looks like this:
void FillIndexBuffer()
{
	int index = 0;
	WORD* indices = NULL;	// 16-bit unsigned, to match D3DFMT_INDEX16

	// All ingame entities...
	g_QuadsIB->Lock(0, (SP_END*2*3)*sizeof(WORD), (void**) &indices, 0);
	for (int vertex = 0; vertex < SP_END*4; vertex += 4)
	{
		indices[index] = vertex;
		indices[index + 1] = vertex + 2;
		indices[index + 2] = vertex + 3;
		indices[index + 3] = vertex;
		indices[index + 4] = vertex + 1;
		indices[index + 5] = vertex + 2;
		index += 6;
	}
	g_QuadsIB->Unlock();
}
After I get the buffers created and filled, I render parts of the vertex buffer within my render loop like so:
void DrawMSQuads()
{
	UINT vNum = (MS_END-MS_START) * 6;
	UINT pNum = vNum / 3;

	// Set up the texture...
	g_pd3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
	g_pd3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ZERO );
	g_pd3dDevice->SetTexture( 0, Texture );

	// Now draw the primitives from the MasterGroup struct
	g_pd3dDevice->SetStreamSource( 0, g_QuadsVB, 0, sizeof(CUSTOMVERTEX) );
	g_pd3dDevice->SetIndices(g_QuadsIB);

	g_pd3dDevice->DrawIndexedPrimitive (D3DPT_TRIANGLELIST,
			(MS_START)*4,
			0,
			vNum,
			0,
			pNum);
}
where MS_START is 923 and MS_END is 987. This works for the most part, but some ranges of vertices won't render under certain situations. For example, the above function renders all the time. But the following won't render on a GeForce 6800 video card when the Direct3D device is created using the D3DCREATE_SOFTWARE_VERTEXPROCESSING flag:
void DrawMenuTextQuads()
{
	UINT vNum = (MT_END-MT_START) * 6;
	UINT pNum = vNum / 3;

	// Set up the texture...
	g_pd3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
	g_pd3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ZERO );
	g_pd3dDevice->SetTexture( 0, Texture );

	// Now draw the primitives from the MasterGroup struct
	g_pd3dDevice->SetStreamSource( 0, g_QuadsVB, 0, sizeof(CUSTOMVERTEX) );
	g_pd3dDevice->SetIndices(g_QuadsIB);

	g_pd3dDevice->DrawIndexedPrimitive (D3DPT_TRIANGLELIST,
			(MT_START)*4,	// Starts at index (MT_START * 4)
			0,
			vNum,
			0,
			pNum);
}
MT_START is 603 and MT_END is 923. This function WILL render correctly on NVidia cards, however, if the Direct3D device is created using the D3DCREATE_HARDWARE_VERTEXPROCESSING flag. Any ideas why DrawMenuTextQuads() won't render on an NVidia video card when the device is created with D3DCREATE_SOFTWARE_VERTEXPROCESSING? I hope you can help me out on this, because I don't want to scratch my bald spot any bigger!

sirob    1181
Try using the Debug Runtimes to determine what the issue is.

Most likely, one of the parameters passed to DrawIndexedPrimitive is incorrect. The hardware VP device happens to tolerate the bad value, but software VP does not, so the draw call fails.

Using the Debug Runtimes introduces extra checks that validate every parameter, and will make the call fail with an appropriate error message if any of them is wrong.

I also noticed you don't test the return codes from any of your drawing calls, which isn't recommended. One of the earlier D3D calls might be failing silently, causing the Draw call to fail as well.

Hope this helps.

Dookie    290
Thanks for the info, sirob!

I don't have debug on the NVidia computer, only my dev computer (ATI-based). And DrawMenuTextQuads() works with either flag on my dev computer...

I'm probably going to leave it as-is, since it works fine when using the hardware flag. But it's still going to bug me to no end why this error is happening... It's weird - I can usually test this problem by lowering the number going into DrawIndexedPrimitive(), like this:
// UINT vNum = (MT_END-MT_START) * 6;

UINT vNum = 16 * 6; // Only draws sixteen text quads

That would take care of a 'Not Enough Vertices' error, but I guess this is a different problem since the error happens on the NVidia computer regardless of this change. Pretty weird stuff!

Dookie    290
OK, now I'm really confused...

void DrawMenuText()
{
HRESULT hr;
char lStr[64];

UINT vNum = (MT_END-MT_START) * 6;
UINT pNum = vNum / 3;

// Set up the texture...
g_pd3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
g_pd3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ZERO );
g_pd3dDevice->SetTexture( 0, Texture );

// Now draw the primitives from the MasterGroup struct
g_pd3dDevice->SetStreamSource( 0, g_QuadsVB, 0, sizeof(CUSTOMVERTEX) );
g_pd3dDevice->SetIndices(g_QuadsIB);

hr = g_pd3dDevice->DrawIndexedPrimitive (D3DPT_TRIANGLELIST,
(MT_START)*4, // Starts at index (MT_START * 4)
0,
vNum,
0,
pNum);

if (FAILED(hr))	// test the failure bit; plain "if (hr)" also trips on success codes like S_FALSE
{
wsprintf(lStr, "%s", " Error in DrawMenuText():");
WriteLogInfo(lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(lStr);
PostQuitMessage( 0 );
}
}


This returns no errors, but nothing renders on an NVidia card with the software vertex processing flag. The call succeeds, but still no text. Weird! I'm gonna do more tests and see what happens; I'll be back with my results...

Dookie    290
More thorough tests, and still no results...

void DrawMenuText()
{
HRESULT hr;
char lStr[64];

UINT vNum = (MT_END-MT_START) * 6;
UINT pNum = vNum / 3;

// Set up the texture...
hr = g_pd3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
if (FAILED(hr))
{
wsprintf(lStr, "%s", " Error in DrawMenuText (SetRenderState1):");
WriteLogInfo(DXStuff, lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(DXStuff, lStr);
PostQuitMessage( 0 );
}


hr = g_pd3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ZERO );
if (FAILED(hr))
{
wsprintf(lStr, "%s", " Error in DrawMenuText (SetRenderState2):");
WriteLogInfo(DXStuff, lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(DXStuff, lStr);
PostQuitMessage( 0 );
}


hr = g_pd3dDevice->SetTexture( 0, Texture );
if (FAILED(hr))
{
wsprintf(lStr, "%s", " Error in DrawMenuText (SetTexture):");
WriteLogInfo(DXStuff, lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(DXStuff, lStr);
PostQuitMessage( 0 );
}


// Now draw the primitives from the MasterGroup struct
hr = g_pd3dDevice->SetStreamSource( 0, g_QuadsVB, 0, sizeof(CUSTOMVERTEX) );
if (FAILED(hr))
{
wsprintf(lStr, "%s", " Error in DrawMenuText (SetStreamSource):");
WriteLogInfo(DXStuff, lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(DXStuff, lStr);
PostQuitMessage( 0 );
}


hr = g_pd3dDevice->SetIndices(g_QuadsIB);
if (FAILED(hr))
{
wsprintf(lStr, "%s", " Error in DrawMenuText (SetIndices):");
WriteLogInfo(DXStuff, lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(DXStuff, lStr);
PostQuitMessage( 0 );
}


hr = g_pd3dDevice->DrawIndexedPrimitive (D3DPT_TRIANGLELIST,
(MT_START)*4, // Starts at index (MT_START * 4)
0,
vNum,
0,
pNum);

if (FAILED(hr))
{
wsprintf(lStr, "%s", " Error in DrawMenuText (DrawIndexedPrimitive):");
WriteLogInfo(DXStuff, lStr);
wsprintf(lStr, "%s%i", " Error number: ", hr);
WriteLogInfo(DXStuff, lStr);
PostQuitMessage( 0 );
}
}


The above function runs without errors on an NVidia machine, but doesn't render anything when the software vertex processing flag is used to create the Direct3D device. All other similar functions (there's about ten of them) work perfectly. Weird!

Namethatnobodyelsetook
vNum is not quads*6, it's quads*4: NumVertices counts the vertices referenced, and each quad uses 4 (the 6 is the index count per quad). pNum is vNum/2, or quads*2 - two triangles per quad.

nVidia cards ignore MinIndex and NumVertices, which is why it rendered fine on nVidia with hardware processing. Software processing, REF, and ATI all want these values to be right.

Dookie    290
Wow Namethatnobodyelsetook, that worked pretty well on my ATI-based dev computer! I can't test it on an NVidia computer tonight, but I'll give it a try tomorrow. I sorta figured it had something to do with how the verts were handled in the Draw...() functions, and I'm really glad you spotted the error!

I'll let you know how it worked when I try it out tomorrow. Thanks again! [smile]

Evil Steve    2017
Either NVidia or ATI - I think ATI ignores a couple of parameters to DrawIndexedPrimitive (the two which refer to the range of vertices to transform - NumVertices and something else, I don't have the docs handy here), while the other checks them. That means the code will work on one card - even though it's incorrect - and fail on the other. It should also fail when using software vertex processing, since D3D will no doubt use those parameters for performance reasons.

Not sure if it's related here though.

Dookie    290
Well crud, DrawMenuTextQuads() still doesn't render anything on the nVidia computer when software vertex processing is enabled... But changing the code did enhance the performance of the game a little bit! [smile] Is this what you meant, Namethatnobodyelsetook?

void DrawMenuText()
{
// UINT vNum = (MT_END-MT_START) * 6;
// UINT pNum = vNum / 3;
UINT vNum = (MT_END-MT_START) * 4;
UINT pNum = vNum / 2;

// Set up the texture...
g_pd3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
g_pd3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ZERO );
g_pd3dDevice->SetTexture( 0, Texture );

// Now draw the primitives from the MasterGroup struct
g_pd3dDevice->SetStreamSource( 0, g_QuadsVB, 0, sizeof(CUSTOMVERTEX) );
g_pd3dDevice->SetIndices(g_QuadsIB);

g_pd3dDevice->DrawIndexedPrimitive (D3DPT_TRIANGLELIST,
(MT_START)*4, // Starts at index (MT_START * 4)
0,
vNum,
0,
pNum);
}

Wonder what else could be wrong? What's making this a really tough problem to crack is that it's not throwing any errors, it's just not rendering anything. This is really strange!

Do my CreateVertexBuffer() and CreateIndexBuffer() look good? By the way, "DX_VertexHandlingMode = D3DUSAGE_SOFTWAREPROCESSING" when the Direct3D device is set to use software vertex processing, and "DX_VertexHandlingMode = D3DUSAGE_WRITEONLY" when the Direct3D device is set to use hardware vertex processing. How about my FillIndexBuffer() function, do all the variables and math look good?

Dookie    290
Dunno if this will help, but here's the ranges for my vertex/index buffer...

#define PT_START	0	// <<=== Should always be FIRST
#define PT_END		256	// Particles...
//
#define TX_START	256	// Text...
#define TX_END		416
//
#define SP_START	416	// Sprites...
#define SP_LAST		544
//
#define BG_QUAD		545	// One background quad (starfield, non-moving)
#define FD_QUAD		546	// One fade quad
//
#define BG_STAR_F	547	// Four background quads (starfield, non-moving) BG_STARS
#define BG_STAR_L	550	// Four background quads (starfield, non-moving)
#define BG_NEBSTART	551	// Nebulae particles Start (moves and rotates)
#define BG_NEBEND	583	// Nebulae particles End (moves and rotates)
#define BG_MOON		584	// Background moon (moves behind Planet)
#define BG_PLANET	585	// Background planet (rotates)
#define BG_PSHADOW	586	// Planet shadow (non-moving)
#define BG_START	587	// Background sprites Start (moves and rotates)
#define BG_END		602	// Background sprites End (moves and rotates)
//
#define MT_START	603	// Menu text (320)
#define MT_END		923
//
#define MS_START	923	// Menu sprites, normal (64)
#define MS_END		987
//
#define MP_START	987	// Menu sprites, transparent (48)
#define MP_END		1035
//
#define QD_TRANSITION	1036	// Used to transition between Menu and Game
//
#define SP_END		1037	// <<=== Should always be LAST

The only range that won't render on a nVidia vid card while the Direct3D device is created using the D3DCREATE_SOFTWARE_VERTEXPROCESSING flag is this:
#define MT_START	603	// Menu text (320)
#define MT_END		923

Everything else renders perfectly - everything before and after the MT_START-MT_END range. I tried changing the count (64 instead of 320), moving it to a different range (swapping the SP_ range for the MT_ range), copying the DrawSprites() code into DrawMenuText() and changing only the MT_ variables, and changing the order in which the menu text gets rendered - the MT_ range still won't render. VERY strange indeed! I hope one of you can catch something in my code that I just can't see.

Rompa    307
Quote:
Original post by Evil Steve
Either NVidia or ATI - I think ATI ignores a couple of parameters to DrawIndexedPrimitive (The two which refer to the range of vertices to transform - NumVertices and something else - don't have the docs handy here), where the other checks them.
You are indeed correct that one vendor seems to ignore the number of vertices - except in my experience it's nVidia that ignores it, whereas ATI requires the number of vertices to be correct. The poster should definitely run the debug runtime with max-1 error level reporting - it will pretty much tell him what's going wrong; at least it did for me a while back.
