godsenddeath

Triangle Grid Problem


I'm generating terrain using a triangle grid, which I've done before with the exact same code, but for some reason, when the grid gets to around 280 x 280, dozens of extra lines appear. I've manually done the math for the vertices and indices and compared it against what the grid generates, and it matches exactly. Does anyone have an idea of what could be happening? Here are screenshots of the correct behaviour, the incorrect behaviour, and the vertex/index generation code.
[Screenshot: correct behaviour]
[Screenshot: incorrect behaviour]

Generate vertex grid:
	// m_Size = distance between points
	// m_Width, m_Height = grid dimensions in vertices
	float StartX = (float)-m_Size * (m_Width / 2.0f);
	float StartZ = (float)m_Size * (m_Height / 2.0f);

	float EndX = -StartX;
	float EndZ = -StartZ;

	// rows
	for(float i = StartZ; i > EndZ; i -= m_Size)
	{
		// columns
		for(float j = StartX; j < EndX; j += m_Size)
		{
			vertexList.push_back(PCVertex(j, 0, i, D3DCOLOR_XRGB(255,0,0), device));
		}
	}


Generate indices:

	for(int i = 0; i < height; i++)
	{
		for(int j = 0; j < width; j++)
		{
			indexList.push_back((i * m_Width) + j);
			indexList.push_back((i * m_Width) + (j + 1));
			indexList.push_back(((i + 1) * m_Width) + j);

			indexList.push_back((i * m_Width) + (j + 1));
			indexList.push_back(((i + 1) * m_Width) + (j + 1));
			indexList.push_back(((i + 1) * m_Width) + j);
		}
	}


I'm not very well versed in D3D, but it could be that you're seeing problems at 280x280 because you're overflowing the index list with more than 65536 indices.

I'm not sure whether D3D itself is limited in this way (really doubtful, given the right settings), but perhaps the data type that you're passing to the D3D rendering functions is.

I could be totally off here though, given my lack of knowledge of D3D. (More of an OpenGL guy here.)


	for(int i = 0; i < height; i++)
	{
		for(int j = 0; j < width; j++)
		{
			indexList.push_back((i * m_Width) + j);
			indexList.push_back((i * m_Width) + (j + 1));       // out of bounds when j == width-1
			indexList.push_back(((i + 1) * m_Width) + j);       // out of bounds when i == height-1

			indexList.push_back((i * m_Width) + (j + 1));       // out of bounds when j == width-1
			indexList.push_back(((i + 1) * m_Width) + (j + 1)); // out of bounds when i == height-1 or j == width-1
			indexList.push_back(((i + 1) * m_Width) + j);       // out of bounds when i == height-1
		}
	}





In many of those cases you go out of bounds when i and j are at their maximum values: you create indices pointing to a vertex at [height][width].

[EDIT: I should clarify that you're only technically "out of bounds of the array" when i == height - 1 && j == width - 1. In the other cases where just one of those is true, you're still in the memory bounds of your array but you are most likely pointing to the wrong vertex.]

-me

Quote:
Original post by Palidine
*** Source Snippet Removed ***

In many of those cases you go out of bounds when i and j are at their maximum values: you create indices pointing to a vertex at [height][width].

[EDIT: I should clarify that you're only technically "out of bounds of the array" when i == height - 1 && j == width - 1. In the other cases where just one of those is true, you're still in the memory bounds of your array but you are most likely pointing to the wrong vertex.]

-me


Sorry, I should have clarified that height and width are equal to m_Height - 1 and m_Width - 1 respectively, so it never gets to m_Width - 1.

After generating your vertices, throw in:

	assert( vertexList.size() == m_Width * m_Height );

Run it compiled in debug mode through your debugger. If you hit the assert, debug from there.

-me

It's possible the problem is what the above poster mentioned; he was correct in thinking that DirectX has different index buffer formats. When you create the index buffer via device->CreateIndexBuffer(...), the format parameter can be either D3DFMT_INDEX32 or D3DFMT_INDEX16. The latter is too small to hold index values past the point mentioned. Take note that you should check the device caps for the ability to use D3DFMT_INDEX32, as the documentation states some video cards do not support it.

Hope this helps

I verified that:

width * height * 6 == numIndices (width and height are the number of squares wide and high, passed as parameters to the GenerateTerrain() function)

and

m_Width * m_Height == numVerts (m_Width and m_Height are width + 1 and height + 1, and represent the width and height in vertices; a line of 10 squares would be 11 verts across)

I've continually crunched the numbers and the math seems sound, which is frustrating.

Quote:
Original post by livengood
It's possible the problem is what the above poster mentioned; he was correct in thinking that DirectX has different index buffer formats. When you create the index buffer via device->CreateIndexBuffer(...), the format parameter can be either D3DFMT_INDEX32 or D3DFMT_INDEX16. The latter is too small to hold index values past the point mentioned. Take note that you should check the device caps for the ability to use D3DFMT_INDEX32, as the documentation states some video cards do not support it.

Hope this helps


That's exactly what it was; I switched to 32-bit indices and it works fine.

Thanks a lot for the help, guys.

