Primitive vertex order

2 comments, last by Erik Rufelt 13 years, 3 months ago
Hi! This might be quite easy to solve, but I couldn't figure it out at all. This is one of my first Direct3D applications; I've been following the C++ SDK tutorial.
I've managed to draw a spinning octahedron using DrawPrimitive, and now I've duplicated the shape so the result would be two touching, spinning octahedrons. The problem is that one of them is always drawn first, so I get this awful effect during the rotation: http://d.imagehost.org/0010/d3d.jpg
Is there any way Direct3D can automatically "sort" the drawing order so the octahedrons occlude each other correctly?
This is my render procedure:
g_lpD3DDev->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
g_lpD3DDev->BeginScene();

g_lpD3DDev->SetStreamSource(0, g_lpD3DVB, 0, sizeof(VERTEX));
g_lpD3DDev->SetFVF(D3DFVF_XYZ | D3DFVF_DIFFUSE);

// Camera orbiting the origin (view/projection set before drawing so the
// current frame uses the current camera)
DWORD tc = GetTickCount();
D3DXVECTOR3 vEye(sin((float)tc / 1000) * 5, 0, cos((float)tc / 1000) * 5);
D3DXVECTOR3 vLookAt(0.0f, 0.0f, 0.0f);
D3DXVECTOR3 vUp(0.0f, 1.0f, 0.0f);
D3DXMATRIXA16 m2;
D3DXMatrixLookAtLH(&m2, &vEye, &vLookAt, &vUp);
g_lpD3DDev->SetTransform(D3DTS_VIEW, &m2);

D3DXMATRIX m3;
D3DXMatrixPerspectiveFovLH(&m3, D3DX_PI / 4, 1.0f, 1.0f, 100.0f);
g_lpD3DDev->SetTransform(D3DTS_PROJECTION, &m3);

// First octahedron, translated right (D3DXMatrixTranslation overwrites the
// whole matrix, so no separate D3DXMatrixIdentity call is needed)
D3DXMATRIX m1;
D3DXMatrixTranslation(&m1, 1, 0, 0);
g_lpD3DDev->SetTransform(D3DTS_WORLD, &m1);
g_lpD3DDev->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 8);

// Second octahedron, translated left
D3DXMatrixTranslation(&m1, -1, 0, 0);
g_lpD3DDev->SetTransform(D3DTS_WORLD, &m1);
g_lpD3DDev->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 8);

g_lpD3DDev->EndScene();
g_lpD3DDev->Present(NULL, NULL, NULL, NULL);

Thanks!
This is best handled by the Z-buffer (depth buffer): it stores the depth of the nearest pixel drawn so far, so nearer triangles occlude farther ones regardless of draw order. Set the EnableAutoDepthStencil and AutoDepthStencilFormat members of your D3DPRESENT_PARAMETERS structure to TRUE and D3DFMT_D24X8 respectively to set it up, and then use SetRenderState to set D3DRS_ZENABLE to TRUE to enable depth testing.
Thanks very much! It works perfectly now.
Here's what the code looks like now:

1. Device initialization:
D3DPRESENT_PARAMETERS d3dp;
memset(&d3dp, 0, sizeof(D3DPRESENT_PARAMETERS));
d3dp.Windowed = TRUE;
d3dp.EnableAutoDepthStencil = TRUE;    // create a depth buffer automatically
d3dp.AutoDepthStencilFormat = D3DFMT_D16;
d3dp.SwapEffect = D3DSWAPEFFECT_DISCARD;
d3dp.BackBufferFormat = D3DFMT_UNKNOWN;

g_lpD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &d3dp, &g_lpD3DDev);

// Enable depth testing
g_lpD3DDev->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);


2. Render procedure:
g_lpD3DDev->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                  D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
g_lpD3DDev->BeginScene();
/* ...further code unmodified... */


More info at: http://www.toymaker.info/Games/html/z_buffer.html
Great. =)
I wouldn't agree with the reasons for using a 16-bit Z-buffer given on the linked page, however. That may be true for very old systems, but even my old Mac from 1998 supports a 24-bit depth buffer, and anything reasonably modern will. It's pretty common to see threads from people wondering why they get rendering artifacts, and the answer is often that they need to switch to a 24-bit Z-buffer.
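If you want to prefer 24-bit but still fall back gracefully, here's a minimal sketch of how you might pick a supported depth format before creating the device. CheckDeviceFormat and CheckDepthStencilMatch are real IDirect3D9 methods; the PickDepthFormat helper and the fallback order are just illustrative assumptions:

// Prefer a 24-bit depth format, falling back to 16-bit only if necessary.
// displayFormat is the current display-mode format (e.g. D3DFMT_X8R8G8B8).
D3DFORMAT PickDepthFormat(IDirect3D9 *d3d, D3DFORMAT displayFormat)
{
    const D3DFORMAT candidates[] = { D3DFMT_D24S8, D3DFMT_D24X8, D3DFMT_D16 };
    for (int i = 0; i < 3; ++i) {
        // Usable as a depth/stencil surface on this adapter?
        if (SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                displayFormat, D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE,
                candidates[i])) &&
            // Compatible with this render-target format?
            SUCCEEDED(d3d->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                displayFormat, displayFormat, candidates[i]))) {
            return candidates[i];
        }
    }
    return D3DFMT_D16; // last resort
}

The result would then go into d3dp.AutoDepthStencilFormat before the CreateDevice call.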
