Starcide

Problem with DrawIndexedPrimitive on ATI cards


Hi. This is my first post here, and I apologize now for it being so long! I just want to explain the situation as best I can.

I've been developing a game demo for a while now and have accomplished a few things like quadtree frustum culling and collision detection. But a thorn in my side has been getting DrawIndexedPrimitive to work properly on ATI cards! Seeing that I only have easy access to nVidia hardware, this is somewhat of a problem to debug!

Just as some quick background (this part works great): I have a heightmap for my terrain. I use a quadtree to split up the terrain, and I test the view frustum planes against it to get back a list of square areas (defined by a min point and a max point) that are inside the frustum. I then loop through the resulting list and draw the parts of the terrain heightmap that correspond to those squares. This is where the problem comes in.

My index buffer is set up with each cell repeating this pattern (I am using triangle lists):
    v3---v4
     | / |
    v1---v2
And the vertex buffer, e.g. for a 2 x 2 cell grid:
		6---7---8
		|   |   |
		3---4---5
		|   |   |
		0---1---2
This terrain is set up as WidthCells x HeightCells cells (the above is 2x2), with CellStep being the size of a square cell.

My original problem was that my DIP call would either fail on ATI hardware or run at 5fps, whereas on my card it runs at 50-80fps depending on how much of the terrain is visible. Since reading the part of this forum's FAQ (and the DIP Demystified article) about how DIP really works, and about the checks ATI hardware does that nVidia doesn't, I have managed to get it running at 50fps. But now I've been told that the screen flashes a lot of lines everywhere, which sounds like the effect you get when polygons aren't joined up properly.

I was hoping someone could have a look at my code. I have spent a long time checking that I am feeding the right values into DIP, and as I understand it they should be right; I see no artifacts when I run the game on my GeForce 3 Ti 200. I am guessing the problem lies in my calculation of MinIndex and NumVertices, as these are all I have been changing. My understanding is that MinIndex is the index of the lowest used vertex in the vertex buffer, so in my case that is the index of the lower-left corner of the current quad being drawn (MinIndex in my code), and that NumVertices is the span of vertices used, so MinIndex + NumVertices should equal the highest vertex location used.

Here is the code I am currently using.
int items = CulledTerrainList->GetCount();
	for(int i=0;i<items;i++)
	{
		CurrentNode = CulledTerrainList->Remove();
		//works out the coord of the cell
		int LxPos = (CurrentNode->lx - Terrain.StartX)/Terrain.CellStep;
		int LzPos = (CurrentNode->lz - Terrain.StartZ)/Terrain.CellStep;
		int HxPos = (CurrentNode->hx - Terrain.StartX)/Terrain.CellStep;
		int HzPos = (CurrentNode->hz - Terrain.StartZ)/Terrain.CellStep;
	
		//TERRAIN
		//  *--------------------(WidthCells,HeightCells)
		//  |                            |
		//  |      xSpan--->             |
		//  |                            |
		//  |    NODE              ^     |
		//  |       *------Hx,Hz   |     |
		//  |       |      |       |     |
		//  |       |      |     zSpan   |
		//  |   Lx,Lz------*             |
		//  |                            |
		//  |                            |
		// (0,0)-------------------------*
	
		int xSpan = HxPos - LxPos;
		int zSpan = HzPos - LzPos;
	
		int IndicesPerCell = 6;//number of indices each cell has in the index buffer
		int VerticesPerCell = 4;

		int VerticesUsed = (Terrain.WidthCells+1)+1;
		int MinVertex = ((LxPos)+((Terrain.WidthCells+1)*LzPos));

		int MinIndex,NumVertices,StartIndex;

		//BOTTOM LEFT FOR REFERENCE
		//d3d_device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,0,0,0,0,2);

		WORD temp;

		for(int j=0;j<zSpan;j++)
		{
			for(int i=0;i<xSpan;i++)
			{
				MinIndex = (MinVertex+i)+(j*(Terrain.WidthCells+1));
				NumVertices = VerticesUsed;
				StartIndex = (((LzPos+j)*Terrain.WidthCells)//z
									+(LxPos+i)) //x
									*IndicesPerCell;
				temp = Terrain.Floor_Index[StartIndex];
				
				assert(Terrain.Floor_Index[StartIndex] >= MinIndex);
				assert(Terrain.Floor_Index[StartIndex] < (MinIndex + NumVertices));
				
				d3d_device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
									0,
									MinIndex,//index in vertex buffer of first vertex								
									NumVertices,//SPAN OF VERTICES USED //could just set as ((Terrain.WidthCells+1)*(Terrain.HeightCells+1)) but not optimal?
									//((Terrain.WidthCells+1)*(Terrain.HeightCells+1)),
									StartIndex,
									2);//2 triangles per cell
			}
		};
	}; 
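
For reference, here is a minimal sketch of how MinIndex and NumVertices can be derived for a single cell, assuming the documented D3D9 rule that every index referenced by the call must fall inside [MinIndex, MinIndex + NumVertices - 1] (the helper is purely illustrative, not part of my engine):

//sketch only: vertex range for the cell whose lower-left vertex sits at
//grid position (x, z) in a (widthCells + 1) x (heightCells + 1) vertex grid
void CellVertexRange(int x, int z, int widthCells,
                     int &minIndex, int &numVertices)
{
	int lowest  = z * (widthCells + 1) + x;             //lower-left corner
	int highest = (z + 1) * (widthCells + 1) + (x + 1); //upper-right corner

	minIndex    = lowest;
	numVertices = highest - lowest + 1; //inclusive span; widthCells + 3 for a single cell
}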

I haven't read your code, but have you tried turning on Direct3D full debug mode (in the DirectX control panel) to see if there are any error/warning messages in the output window?

Open the DirectX properties panel from Start -> Control Panel.
Go to Direct3D, switch to the debug runtime, and set the debug output level to full.
The warning messages (if any) will show up in the output window of your IDE.
In my experience, most DrawIndexedPrimitive misuses are caught there.

There isn't a DirectX control panel link in my Control Panel, nor in my Start menu entry for the DX9 SDK :/

DirectX works fine in Visual Studio, so it's installed properly as far as I know.

I am also currently facing this exact same problem. I haven't reviewed your code yet, but I believe it has something to do with the number of vertices and polygons passed in the DIP call. I'm assuming you're using triangle strips.

From what I understand, on nVidia cards (which is what I've developed my terrain engine on), each time you call DIP it remembers the last two vertices it rendered from the previous DIP call and connects the first vertex you pass in to those two points. ATI cards don't do this. Therefore, on nVidia cards the number of primitives drawn is equal to the number of vertices you pass in, provided you've already rendered at least two vertices, while on ATI cards the number of polygons is NumVertices - 2, as it should be.

This behaviour also only shows up on newer nVidia cards. A friend of mine ran a demo of my terrain engine on an old GF2 card, and it performs the same as an ATI card.

I haven't figured out how to solve the problem yet, but I figured if I gave you an idea of exactly what was happening, maybe one of us will be able to find the answer.

Huh, you have no directx control panel? O_O

Btw, one thing I've heard: MinIndex and NumVertices are meaningful on ATI cards, but not on nVidia cards. On ATI cards, every time you issue a DIP call the full range of vertices between MinIndex and MinIndex + NumVertices is transformed, which guarantees that every vertex in that range is transformed at most once regardless of the indices. nVidia cards don't work this way (not sure if this is still true on newer cards).

Looking back at your code, you are only drawing 2 triangles per DIP call, so most likely your app is CPU limited (just busy submitting batches to the GPU). Worse still, each DIP call uses a NumVertices equal to the width of the terrain. On ATI cards this hurts even more (for the reason above).

I think you'd be better off batching multiple draw calls into one. Using degenerate tris, you could draw the whole terrain in just one call.
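
For example (a rough, untested sketch reusing the variable names from your code): with your triangle-list index buffer, the 6-index blocks for the cells in one row of a node sit back to back, so you could at least submit a whole row per call instead of one call per cell:

for (int j = 0; j < zSpan; j++)
{
	int rowZ = LzPos + j;

	//lowest vertex used by this row of cells is the lower-left corner of the
	//first cell; the highest is the upper-right corner of the last cell
	int minIndex    = rowZ * (Terrain.WidthCells + 1) + LxPos;
	int maxIndex    = (rowZ + 1) * (Terrain.WidthCells + 1) + LxPos + xSpan;
	int numVertices = maxIndex - minIndex + 1;

	//index blocks for the xSpan cells of this row are contiguous in the index buffer
	int startIndex  = ((rowZ * Terrain.WidthCells) + LxPos) * IndicesPerCell;

	d3d_device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
	                                 0,
	                                 minIndex,
	                                 numVertices,
	                                 startIndex,
	                                 2 * xSpan); //two triangles per cell
}

That alone cuts the call count from xSpan*zSpan down to zSpan, and you could go further by reordering the index buffer so a whole node is contiguous, or by moving to strips with degenerate tris.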

Quote:
Original post by tjsmith
I'm assuming you're using triangle strips.


I'm using triangle lists, as I said in my post. I know this isn't the most efficient way, but I was having issues getting triangle strips to join together into a grid for the terrain; I just wanted to get something working, so I moved on using tri lists. That's outside the scope of this thread, though.

I'm not so sure that our problems are the same; my StartIndex and primitive count values are correct as far as I can see. I'm just repeatedly drawing a quad made of two triangles from my buffers, joining these together to form the square patches of terrain that my quadtree traversal returns.

Quote:
Original post by gamelife
Huh, you have no directx control panel? O_O

Nope :-( something is amiss somewhere!

Quote:
Btw, one thing I've heard: MinIndex and NumVertices are meaningful on ATI cards, but not on nVidia cards. On ATI cards, every time you issue a DIP call the full range of vertices between MinIndex and MinIndex + NumVertices is transformed, which guarantees that every vertex in that range is transformed at most once regardless of the indices. nVidia cards don't work this way (not sure if this is still true on newer cards).

Yeah, I know this is the problem; as I mentioned in my first post, my values for MinIndex and NumVertices are the issue.

Quote:

Looking back at your code, you are only drawing 2 triangles per DIP call, so most likely your app is CPU limited (just busy submitting batches to the GPU). Worse still, each DIP call uses a NumVertices equal to the width of the terrain. On ATI cards this hurts even more (for the reason above).

It runs at 50-80fps on my GeForce 3 Ti 200, so surely I should be able to make it run at that speed on ATI cards given the correct values for MinIndex and NumVertices?


Edit: OK, ignore the above. I don't think I understood what you meant at first, but now it's clicked. While I understood that MinIndex and NumVertices matter on ATI and not on nVidia, I'd been missing the fact that, as you said, ATI cards transform all the vertices between MinIndex and MinIndex + NumVertices on each DIP call while nVidia cards don't, so with each call to DIP on ATI cards I'm being very wasteful.

Maybe I should just take an in-depth look at how I'm setting up my vertex and index arrays to make them more ATI-friendly, as I think that's the problem, and maybe use triangle strips too. I think this calls for an entirely new post!
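
For future reference (a rough sketch of the usual approach, not code from my project, so treat it as unverified): the trick for joining the rows of a grid into one strip is to repeat two indices between rows, so the extra triangles are degenerate and never get drawn:

#include <vector>

//build one long triangle strip over a widthCells x heightCells grid,
//stitching the rows together with degenerate triangles so the whole grid
//can be drawn with a single DrawIndexedPrimitive call
std::vector<WORD> BuildGridStrip(int widthCells, int heightCells)
{
	const int stride = widthCells + 1; //vertices per row
	std::vector<WORD> indices;

	for (int z = 0; z < heightCells; z++)
	{
		//walk one row, alternating between the upper and lower vertex rows
		//(winding may need flipping depending on the cull mode)
		for (int x = 0; x <= widthCells; x++)
		{
			indices.push_back((WORD)((z + 1) * stride + x)); //upper vertex
			indices.push_back((WORD)( z      * stride + x)); //lower vertex
		}

		//repeat the last index of this row and the first index of the next
		//row; the triangles they produce have zero area, so nothing is drawn
		if (z < heightCells - 1)
		{
			indices.push_back((WORD)( z      * stride + widthCells));
			indices.push_back((WORD)((z + 2) * stride));
		}
	}
	return indices; //PrimitiveCount for D3DPT_TRIANGLESTRIP is indices.size() - 2
}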

thanks for the help

DirectX only appears in the Control Panel if you chose to install the debug runtimes (I'm not sure it's even an option not to install them anymore...), and if your SDK is as new as your retail runtimes. For example, if you installed the DX8.1 SDK and then installed a game requiring DX9.0, the game would install newer retail runtimes and your control panel icon would go away.

Grab the latest DXSDK, and if asked, choose to install the debug runtimes, and you should be set.
