LockIndexBuffer from .X file, WORD or DWORD index?



I've modified the X parser code from the book by Jim Adams. The code reads many .X files correctly, but sometimes it only works with WORD indices, and sometimes only with DWORD:

WORD* SrcIBptr = NULL;   // or: DWORD* SrcIBptr = NULL;
if(FAILED(m_pMeshCon->MeshData.pMesh->LockIndexBuffer(D3DLOCK_READONLY, (VOID**)&SrcIBptr)))
    return E_FAIL;

I have to test manually to see whether it's WORD or DWORD. Does anyone know a way to tell whether the .X file has WORD or DWORD indices? Many thanks!

You can use GetIndexBuffer() to get the IDirect3DIndexBuffer9 interface, then use IDirect3DIndexBuffer9::GetDesc() to get a D3DINDEXBUFFER_DESC, which contains a Format member. If it's D3DFMT_INDEX16 then your indices are 16-bit WORDs, if it's D3DFMT_INDEX32 then they're 32-bit DWORDs. And if it's anything else, it's an invalid format.
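A minimal sketch of that check (assuming a valid ID3DXMesh* obtained elsewhere, e.g. from D3DXLoadMeshFromX; the function name is illustrative and error handling is trimmed):

```cpp
// Sketch: query the index format of an ID3DXMesh (DirectX 9).
// Assumes pMesh is a valid ID3DXMesh*; requires the DirectX 9 SDK headers.
#include <d3dx9.h>

bool UsesDwordIndices(ID3DXMesh* pMesh)
{
    IDirect3DIndexBuffer9* pIB = NULL;
    if (FAILED(pMesh->GetIndexBuffer(&pIB)))
        return false;                       // query failed; assume 16-bit

    D3DINDEXBUFFER_DESC desc;
    pIB->GetDesc(&desc);
    pIB->Release();                         // GetIndexBuffer AddRef'd the buffer

    return desc.Format == D3DFMT_INDEX32;   // else D3DFMT_INDEX16 -> WORDs
}
```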

Hi,

I think it's related to the index size of the mesh.

I mean, if the mesh uses 16-bit indices, you can say "it uses WORD indices", because a WORD is two bytes = 8 + 8 bits = 16 bits. But if it uses 32-bit indices, you can say "it uses DWORD indices", because a DWORD is a double word = 16 + 16 bits = 32 bits.

I hope this code snippet helps you retrieve the index buffer size:

//...
LPDIRECT3DINDEXBUFFER9 ib;
pMesh->GetIndexBuffer(&ib);
D3DINDEXBUFFER_DESC ibdesc;
ib->GetDesc(&ibdesc);
UINT IndexSize = ibdesc.Size;
//...

If I'm wrong, please correct me ;)

Hope I could explain :)

Regards.

Quote:
 Original post by programci_84
 I think it's related to index size of the mesh. [...] I wish this code snippet can help you to retrieve the index size [...]
The Size member of the D3DINDEXBUFFER_DESC struct is the size in bytes of the buffer, not the size of one index.
Also, you could have a mesh with < 65536 vertices, but may still use a 32-bit IB, or one with > 65536 vertices that uses a 16-bit one, so checking the number of vertices isn't viable either.

Quote:
 The Size member of the D3DINDEXBUFFER_DESC struct is the size in bytes of the buffer, not the size of one index. [...]

Thanks for correcting.

Quote:
 Original post by Evil Steve
 You can use GetIndexBuffer() to get the IDirect3DIndexBuffer9 interface, then use IDirect3DIndexBuffer9::GetDesc() to get a D3DINDEXBUFFER_DESC, which contains a Format member. [...]

Thanks Evil Steve, I've tried GetDesc and that works fine.

Another question,
can I know the index format from the mesh template in the .X file?

template MeshFace {
<3d82ab5f-62da-11cf-ab39-0020af71e433>
DWORD nFaceVertexIndices;
array DWORD faceVertexIndices[nFaceVertexIndices];
}

template Mesh {
<3d82ab44-62da-11cf-ab39-0020af71e433>
DWORD nVertices;
array Vector vertices[nVertices];
DWORD nFaces;
array MeshFace faces[nFaces];
[...]
}

I can't find where it is noted.
It's always "DWORD" in all .X files,
but sometimes the mesh actually has 16-bit indices.

A .X file doesn't specify whether it has 16-bit or 32-bit indices; you specify that through the Options parameter of D3DXLoadMeshFromX.

Quote:
 Original post by MJP
 A .X file doesn't specify whether it has 32-bit or 16-bit vertices, you specify that through the Options parameter of D3DXLoadMeshFromX.

hi MJP,

I don't think so; some .X files produce 16-bit indices and some produce 32-bit,
but the parser code is always the same.

And also, how can I specify it?

if(FAILED(D3DXLoadSkinMeshFromXof(pDXData, D3DXMESH_SYSTEMMEM, pD3DDevice,
        &AdjacencyBuffer, &MaterialBuffer, NULL,
        &NumMaterials, &pSkin, &pLoadMesh)))
    assert(0);

Quote:
 Original post by Quaid Tseng
 I don't think so, some .X files generate index-16 and some generate index-32, the parser codes is always the same. And also, how can I specify it? *** Source Snippet Removed ***
You pass D3DXMESH_32BIT as the second parameter to D3DXLoadSkinMeshFromXof(). I've not used ID3DXMesh, but I wouldn't be surprised if the mesh automatically loads with 32-bit indices when it has to (when it needs access to more than 65536 vertices in one draw call), and the D3DXMESH_32BIT flag just forces 32-bit indices where it would normally use 16-bit.
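As a sketch, forcing 32-bit indices would just OR the flag into the options of the call quoted above (all variable names are taken from that snippet and assumed to be declared elsewhere; this is not runnable on its own):

```cpp
// Sketch: same load call as above, with D3DXMESH_32BIT OR'd into the
// options so the mesh is created with a 32-bit index buffer. pDXData,
// pD3DDevice, and the out-parameters are assumed to exist as in the
// original snippet; error handling kept as the original assert.
if (FAILED(D3DXLoadSkinMeshFromXof(pDXData,
        D3DXMESH_SYSTEMMEM | D3DXMESH_32BIT,   // force DWORD indices
        pD3DDevice, &AdjacencyBuffer, &MaterialBuffer, NULL,
        &NumMaterials, &pSkin, &pLoadMesh)))
    assert(0);
```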

Hmm, I agree with you.
Now I've attached D3DXMESH_32BIT to
pMeshCon->MeshData.pMesh->CloneMesh()

But for some .X files it still turns out to be 16-bit indices, which is weird.

Thank you, guys.
At least my code can now automatically tell whether the indices are 16-bit or 32-bit
by using GetDesc().
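Putting the thread together, here is a hedged sketch of a lock that branches on the detected format instead of hard-coding WORD* or DWORD* (assuming a valid ID3DXMesh* as in the earlier snippets; the function name is illustrative):

```cpp
// Sketch: lock an ID3DXMesh index buffer and read each index as WORD or
// DWORD depending on D3DINDEXBUFFER_DESC::Format (DirectX 9).
// pMesh is assumed valid; the loop body shows where per-index work goes.
#include <d3dx9.h>

HRESULT ReadIndices(ID3DXMesh* pMesh)
{
    IDirect3DIndexBuffer9* pIB = NULL;
    if (FAILED(pMesh->GetIndexBuffer(&pIB))) return E_FAIL;

    D3DINDEXBUFFER_DESC desc;
    pIB->GetDesc(&desc);
    pIB->Release();

    DWORD numIndices = pMesh->GetNumFaces() * 3;   // triangle list
    VOID* pData = NULL;
    if (FAILED(pMesh->LockIndexBuffer(D3DLOCK_READONLY, &pData)))
        return E_FAIL;

    for (DWORD i = 0; i < numIndices; ++i)
    {
        DWORD index = (desc.Format == D3DFMT_INDEX32)
                        ? ((DWORD*)pData)[i]       // 32-bit indices
                        : ((WORD*)pData)[i];       // 16-bit indices
        (void)index;  // ...use the index here...
    }

    pMesh->UnlockIndexBuffer();
    return S_OK;
}
```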
