Terrain rendering using DirectX

Hello!

I'm getting a stack overflow runtime error when my vertex/index buffers exceed a certain size. What I'm trying to do is to render a huge terrain generated using noise functions.

How can I avoid that problem? I'm using native C++ and DirectX 11.

Here's the render function:

void Render()
{
    // Clear the back buffer
    float color[4] = {0.2f, 0.3f, 0.4f, 1.0f};
    g_pImmediateContext->ClearRenderTargetView(g_pRenderTargetView, color);

    // Update the constant buffer with the transposed matrices
    ConstantBuffer cb;
    cb.mWorld = XMMatrixTranspose(g_World);
    cb.mView = XMMatrixTranspose(g_View);
    cb.mProjection = XMMatrixTranspose(g_Projection);
    g_pImmediateContext->UpdateSubresource(g_pConstantBuffer, 0, NULL, &cb, 0, 0);

    // Set the shaders and draw the indexed terrain grid
    g_pImmediateContext->VSSetShader(g_pVertexShader, NULL, 0);
    g_pImmediateContext->VSSetConstantBuffers(0, 1, &g_pConstantBuffer);
    g_pImmediateContext->PSSetShader(g_pPixelShader, NULL, 0);
    g_pImmediateContext->DrawIndexed(WIDTH * HEIGHT * 6, 0, 0);

    // Present the back buffer to the screen
    g_pSwapChain->Present(0, 0);
}


A grid of 128 * 128 (128 * 128 * 2 triangles) works fine, but 200 * 200 (and above, of course) causes a stack overflow error.

Thanks in advance!
How large is the vertex buffer you are using to do the drawing?
Vertex buffer:

SimpleVertex vertices[(WIDTH + 1) * (HEIGHT + 1)];
// Here all the vertices are added
D3D11_BUFFER_DESC bd;
ZeroMemory(&bd, sizeof(bd));
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(SimpleVertex) * (WIDTH + 1) * (HEIGHT + 1);
bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
bd.CPUAccessFlags = 0;

Index buffer:

WORD indices[WIDTH * HEIGHT * 6];
// ...
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(WORD) * WIDTH * HEIGHT * 6;
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
bd.CPUAccessFlags = 0;
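
Both excerpts stop at the buffer description. For context, a minimal sketch of the creation step they leave out, assuming a g_pd3dDevice and buffer pointers as in the SDK tutorial framework (the names are illustrative, not from the original code):

// Sketch only: g_pd3dDevice and g_pVertexBuffer are assumed to exist as in the tutorial framework.
D3D11_SUBRESOURCE_DATA initData;
ZeroMemory(&initData, sizeof(initData));
initData.pSysMem = vertices;   // or indices, with the matching bd filled in

HRESULT hr = g_pd3dDevice->CreateBuffer(&bd, &initData, &g_pVertexBuffer);
// check FAILED(hr) and bail out on error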


After reading this thread I suspect that the index buffer is the problem, because the maximum index with WORD indices is 65535.
Exactly, you have pinned down the origin of your problem yourself: a WORD is a 16-bit integer, so an unsigned index can't go over 65535.

If you use a DWORD (double word) instead, you have up to 32 bits to store each index.
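
A minimal sketch of what that change could look like, reusing the bd descriptor and globals from the earlier snippets (illustrative only); besides the index type itself, the format passed to the input assembler has to match:

// Sketch: 32-bit indices. UINT (DWORD) is 32 bits wide, so values above 65535 become possible.
std::vector<UINT> indices(WIDTH * HEIGHT * 6);   // needs <vector>
// ... fill the indices ...

bd.ByteWidth = sizeof(UINT) * WIDTH * HEIGHT * 6;   // sizeof(UINT), not sizeof(WORD)
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
// ... create the buffer as before ...

// Tell the input assembler the indices are now 32 bits wide:
g_pImmediateContext->IASetIndexBuffer(g_pIndexBuffer, DXGI_FORMAT_R32_UINT, 0);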
I doubt it. Even with the big grid he only gets 201*201 = 40401 vertices. Also, if that were really the problem, I would rather expect wrong rendering or a failing DX API call than a stack overflow.

I suspect the error comes from this:


SimpleVertex vertices[(WIDTH + 1) * (HEIGHT + 1)];


This big array will be created on the stack. Does it exceed your currently set stack size (or the default size, which I think is 1 MB)?
Rather create the temporary array on the heap with new (don't forget to delete[] it afterwards). Edit: the same applies to the indices, of course.

It would help to know when the stack overflow is thrown.
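
To illustrate that suggestion, a minimal sketch using the names from the earlier snippets (error handling omitted):

// Allocate the temporary arrays on the heap instead of the stack,
// so they no longer count against the ~1 MB default stack size.
SimpleVertex* vertices = new SimpleVertex[(WIDTH + 1) * (HEIGHT + 1)];
WORD* indices = new WORD[WIDTH * HEIGHT * 6];

// ... fill the arrays and create the D3D buffers exactly as before ...

// CreateBuffer copies the data into the GPU resource, so the temporaries
// can be freed right after the buffers are created.
delete[] vertices;
delete[] indices;

A std::vector would do the same job and release the memory automatically.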
You are right, unbird. It's working now, thanks for your help!

I should probably learn some more C++ before messing around with DirectX.

16-bit indices should also have been a problem - he is using 6 indices for each quad, which means 200*200*6 = 240,000 indices. Although I am sure that the stack overflow is caused by the problem you described...

But the maximum index value would still be 40k-something, since he's only got ~40k vertices, so it still fits into an unsigned short. If he wanted to use a WORD for indexing into the index array itself he would have a problem, but that indexing is only done by the GPU...
Anyway, if your vertex array has fewer than 2^16 entries (65536), WORD is a suitable index size, no matter how many indices are in the index buffer.
sorry for being a smartass :)

cheers,
tasche
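
In other words (an illustrative snippet, not from the thread): pick the index format from the vertex count, not from the number of indices:

// Illustrative only: the format depends on how many vertices must be addressable.
// The buffer itself must of course be filled with the matching index type.
const UINT vertexCount = (WIDTH + 1) * (HEIGHT + 1);
const DXGI_FORMAT indexFormat = (vertexCount <= 65536)
    ? DXGI_FORMAT_R16_UINT    // 16-bit indices can address vertices 0..65535
    : DXGI_FORMAT_R32_UINT;   // larger meshes need 32-bit indices
g_pImmediateContext->IASetIndexBuffer(g_pIndexBuffer, indexFormat, 0);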


That is right :) I missed the fine point there...

