Archived

This topic is now archived and is closed to further replies.

krad7

confused regarding dx Performance

Recommended Posts

krad7    122
I am really confused about the performance of my terrain engine. My terrain is of fixed size (NxN patches, each patch made of 33x33 vertices). I create a vertex buffer for each patch (so I have a total of N*N VBs) and one common IB, and I am using 2 textures for my terrain (texture splatting). The code to create my VB and IB is:
g_Device9->CreateVertexBuffer(
    NUM_PATCH_VERTICES * sizeof(TerrainVertex),
    D3DUSAGE_WRITEONLY, TERRAIN_PATCH_FVF_VERTEX,
    D3DPOOL_DEFAULT, &m_VB, 0);

g_Device9->CreateIndexBuffer(
    m_uiNumberOfIndices * sizeof(WORD),
    D3DUSAGE_WRITEONLY, D3DFMT_INDEX16,
    D3DPOOL_DEFAULT, &m_IB, 0);
So both are static! Now I create a terrain of size 16x16 patches (not vertices; 16x16 patches, each patch made of 33x33 vertices). Initially, before frustum culling, I just used brute force to display my terrain, i.e. I display every patch:
for (i = 0; i < m_uiLandscapeLength; i++)
{
    for (j = 0; j < m_uiLandscapeLength; j++)
    {
        terrainPatch[i][j].Render();
    }
}
I got around 75-80 FPS. Then I increased m_uiLandscapeLength to 32 (so it's 32x32 patches). I thought I'd take a huge performance hit, but no, the FPS was still around 75-80! Then I reduced the number of patches to 5x5, and it was still the same! (For m_uiLandscapeLength = 1 to 4, the performance did increase.) Now I am wondering if frustum culling would make any difference at all! My system specs are a P4 1.9 GHz, GeForce 2 MX 64MB, and 512MB RAM. I don't know why this would happen. Is it because DX automatically culls the unseen terrain using the viewport and projection settings? (There has to be some performance hit because of the for loop, right?) Thanks!
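For reference, frustum culling of patches is usually done by testing each patch's bounding box against the six view-frustum planes before calling Render(). A minimal sketch of that test (the Plane/AABB types and names are illustrative, not from the original post; plane normals are assumed to point into the frustum):

```cpp
#include <cassert>

// A plane in the form ax + by + cz + d = 0, with (a,b,c) pointing inside the frustum.
struct Plane { float a, b, c, d; };

// Axis-aligned bounding box of one terrain patch (illustrative).
struct AABB { float minX, minY, minZ, maxX, maxY, maxZ; };

// True if the box lies entirely on the negative side of the plane,
// i.e. completely outside that frustum plane.
bool OutsidePlane(const Plane& p, const AABB& b) {
    // Pick the box corner farthest along the plane normal (the "positive vertex").
    float x = p.a >= 0 ? b.maxX : b.minX;
    float y = p.b >= 0 ? b.maxY : b.minY;
    float z = p.c >= 0 ? b.maxZ : b.minZ;
    return p.a * x + p.b * y + p.c * z + p.d < 0;
}

// A patch is culled only if it is fully outside at least one of the six planes.
bool PatchVisible(const Plane planes[6], const AABB& box) {
    for (int i = 0; i < 6; ++i)
        if (OutsidePlane(planes[i], box)) return false;
    return true;
}
```

The double loop would then call Render() only when PatchVisible() returns true, which cuts the number of DrawIndexedPrimitive calls rather than the loop overhead itself.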

krad7    122
quote:
Original post by Erzengeldeslichtes
It sounds like your monitor's refresh rate is 80Hz and you set your device to sync (check the FAQ about that). You should be getting more in the range of 800 FPS with the small terrains.

I doubt that my device is set to sync with the refresh rate, because when I try to display 3x3 patches the FPS is around 110-120. These are my D3DPRESENT_PARAMETERS:

d3dPP.BackBufferCount = 1;
D3DDISPLAYMODE d3ddm;
g_D3D9->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &d3ddm);
d3dPP.BackBufferFormat = d3ddm.Format; // GetColor(m_bpp);
d3dPP.BackBufferWidth = m_width;
d3dPP.BackBufferHeight = m_height;
d3dPP.hDeviceWindow = g_HMainWnd;
d3dPP.SwapEffect = D3DSWAPEFFECT_DISCARD;
d3dPP.Flags = D3DPRESENTFLAG_DISCARD_DEPTHSTENCIL;
d3dPP.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;

Thanks
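One way to rule vsync in or out is to measure the frame rate independently of whatever counter is currently being used. A minimal frame-rate counter sketch (not from the original post; in a real loop you would feed it timestamps from QueryPerformanceCounter or std::chrono::steady_clock):

```cpp
// Minimal FPS counter: feed it one timestamp (in seconds) per frame.
// It reports the measured FPS once per elapsed second, otherwise -1.
class FpsCounter {
public:
    double Tick(double nowSeconds) {
        ++m_frames;
        if (nowSeconds - m_last >= 1.0) {
            double fps = m_frames / (nowSeconds - m_last);
            m_last = nowSeconds;
            m_frames = 0;
            return fps;
        }
        return -1.0;
    }
private:
    double m_last = 0.0;
    int m_frames = 0;
};
```

If this reports ~75-80 regardless of scene size and the driver's control panel forces vsync on (drivers can override D3DPRESENT_INTERVAL_IMMEDIATE), that would explain the cap.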

circlesoft    1178
terrainPatch[i][j].Render();

How many vertices are being rendered each time through this function? Also, how exactly are you rendering it? Making a lot of calls to DrawPrimitive() is definitely going to slow you down.
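To illustrate why many small draw calls hurt: each call pays a roughly fixed CPU overhead on top of the per-triangle GPU cost, so the same triangle count split across more calls takes longer. A toy cost model (the constants are made up for illustration, not measured values):

```cpp
// Toy cost model: frame time = per-call CPU overhead + per-triangle GPU cost.
// msPerCall and msPerMillionTris are illustrative constants, not real measurements.
double FrameTimeMs(int drawCalls, long long triangles,
                   double msPerCall = 0.05, double msPerMillionTris = 10.0) {
    return drawCalls * msPerCall + (triangles / 1.0e6) * msPerMillionTris;
}
```

With these made-up constants, drawing 2M triangles in 1024 calls costs noticeably more frame time than drawing the same 2M triangles in 25 calls, purely from call overhead.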

All in all, I wouldn't worry about it too much, considering that you have a GF2 MX 64MB graphics card. Put it on a newer card, and you'll probably get the results that Erzengeldeslichtes is talking about.


Dustin Franklin
Microsoft DirectX MVP

[edited by - circlesoft on March 27, 2004 12:14:06 AM]

krad7    122
My question is that I am not quite able to understand the performance of my terrain engine. Irrespective of how many DIPs I use (either 25 or 1024 DrawIndexedPrimitive calls to render 2048 tris), the FPS seems constant at around 75-80. So, like Erzengeldeslichtes said, my device is in sync with vsync, and hence I am not able to get higher FPS values.
Thanks

JohnBolton    1372
You are rendering 1024 patches of 2048 triangles each at 75 frames per second on a GF2 MX? That is roughly 157M triangles/second, but the GF2 MX only does 20M-25M triangles/second.
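The arithmetic behind this objection can be checked directly: patches × triangles per patch × frames per second gives the implied throughput. A one-line sketch:

```cpp
// Implied triangle throughput: patches * triangles-per-patch * frames-per-second.
long long TrianglesPerSecond(long long patches, long long trisPerPatch, long long fps) {
    return patches * trisPerPatch * fps;
}
```

1024 × 2048 × 75 = 157,286,400, far beyond a GF2 MX's rated throughput, which is another hint that the 75-80 FPS figure is a vsync cap rather than a real measure of rendering cost.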
