Index buffer that's bigger than WORD

9 comments, last by Ataru 22 years, 11 months ago
I've been having major bugs with my DX8 code (I'm an OpenGL guy learning DX), and my index buffer worked great until I got over 66-thousand-odd indices. I realised that, since my indices were of type WORD, I had gone out of bounds, so I wanted to change to unsigned long. Well, I can't seem to do it; if I change my index buffer to anything but WORD, all my geometry is messed up. Right now my index buffer creation looks like this (trying with plain old ints right now):

hr = pID3DDevice->CreateIndexBuffer(sizeof(int) * num_elems, // size in bytes, not indices
                                    D3DUSAGE_WRITEONLY,
                                    D3DFMT_INDEX32,
                                    D3DPOOL_DEFAULT,
                                    &pIndexBuffer);

int *pIndex;
hr = pIndexBuffer->Lock(0, 0, (BYTE **)&pIndex, 0);
if(FAILED(hr))
{
#ifdef DEBUG_ON
    fprintf(pLOG, "DX8 D3D index buffer Lock HAS FAILED, ERROR:\n");
    errorLog(hr);
#endif
    closeDINPUT();
    shutdownD3D();
    exit(0);
}

for(ii = 0; ii < num_elems; ii++)
{
    pIndex[ii] = (int)indices[ii];
}

hr = pIndexBuffer->Unlock();

If anyone sees something wrong, or if anyone can just post their index buffer initialization, that would be great. Thanks in advance.
Have you tried using DWORD?
Joseph Fernald, Software Engineer, Red Storm Entertainment.
------------------------
The opinions expressed are those of the person posting and not those of Red Storm Entertainment.
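For reference, a minimal sketch of what a DWORD-based (32-bit) index buffer setup could look like under DX8. This only works if the device actually supports D3DFMT_INDEX32, and the variable names (pID3DDevice, num_elems, indices) just mirror the original post:

// Hypothetical sketch: 32-bit index buffer filled from a DWORD-sized source.
// Only valid if the card/driver reports support for indices above 0xFFFF.
IDirect3DIndexBuffer8 *pIndexBuffer = NULL;
HRESULT hr = pID3DDevice->CreateIndexBuffer(sizeof(DWORD) * num_elems, // size in bytes
                                            D3DUSAGE_WRITEONLY,
                                            D3DFMT_INDEX32,            // 32-bit indices
                                            D3DPOOL_DEFAULT,
                                            &pIndexBuffer);
if(SUCCEEDED(hr))
{
    DWORD *pIndex = NULL;
    hr = pIndexBuffer->Lock(0, 0, (BYTE **)&pIndex, 0);
    if(SUCCEEDED(hr))
    {
        for(UINT ii = 0; ii < num_elems; ii++)
            pIndex[ii] = (DWORD)indices[ii];   // widen each source index to 32 bits
        pIndexBuffer->Unlock();
    }
}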
Hello,

I have a GeForce2, and although it supports 32-bit indices under OpenGL, it only supports 16-bit WORD indices under Direct3D 8. You get a nice "scrambled" effect when using ints!

Do you mean it stopped working when you got over 66 indices, or over 66-thousand-odd indices (i.e. 65535)?

It should support more than 66; maybe your indices array is broken?

Hope that helps.

No, it's 65535. So there's no way to support more? Damn.

So is the only thing I can do to make multiple DrawPrimitive calls, then? Or is there a way to draw more than 65535 vertices?
What you need to do is use the GetCaps/GetDeviceCaps function (at work, so I'm not able to look up the exact name), then check dwMaxVertices or some such member; again, not sure of the exact name. That's the maximum number of vertices the card can render in one call. Once you know that, you have to divide up any of your buffers that are bigger.
I had exactly the same problem with a GeForce (1).

SeanHowe is right: call GetDeviceCaps() and check D3DCAPS8::MaxVertexIndex.

The not-so-funny thing is that 32-bit index buffers are only supported if MaxVertexIndex > 0xFFFF, so you can't render even a single triangle with a 32-bit index buffer if MaxVertexIndex <= 0xFFFF.
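A rough sketch of that caps check under D3D8 (assuming pID3DDevice is the device from the original post):

// Query the device caps and decide which index format is safe to use.
D3DCAPS8 caps;
if(SUCCEEDED(pID3DDevice->GetDeviceCaps(&caps)))
{
    // 32-bit index buffers are only usable when the card can address
    // vertex indices above 0xFFFF.
    D3DFORMAT indexFormat = (caps.MaxVertexIndex > 0xFFFF) ? D3DFMT_INDEX32
                                                           : D3DFMT_INDEX16;

    // caps.MaxPrimitiveCount limits how many primitives a single
    // DrawIndexedPrimitive call may draw, so very large meshes may
    // still need to be split across several calls.
}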
I haven't checked yet, but I assume it's not supported. So what are my options?

I'm new to DX8, like I said, so do I have to

a) Just use DrawPrimitive?
or
b) Swap buffers? (In that case, how would it be done efficiently? I can't see creating a new vertex buffer for every draw being very efficient.)

How do games that use DX8 do this? Do they have different vertex/index buffers and use portals? (They probably have to, I guess.)
Break your object up as much as needed, or use triangle strips... the latter is probably better.

-----------------------------------------------------------
"People who usualy use the word pedantic usualy are pedantic!"-me
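One way to "break the object up" without creating a new vertex buffer for every draw is to keep a single large vertex buffer and issue several small draw calls into it, offsetting via the BaseVertexIndex argument of SetIndices. A hedged sketch; the chunk sizes and names (pBigVB, pChunkIB, MyVertex, etc.) are invented for illustration, and whether this beats separate vertex buffers depends on the driver:

// Assumption: pBigVB holds all the vertices, pChunkIB holds 16-bit indices
// that are local to one chunk (0 .. verticesPerChunk-1), and every chunk
// has the same triangle count. All names here are made up for the example.
pID3DDevice->SetStreamSource(0, pBigVB, sizeof(MyVertex));

for(UINT chunk = 0; chunk < numChunks; chunk++)
{
    // BaseVertexIndex shifts every index in pChunkIB by this offset,
    // so the 16-bit index values themselves never have to exceed 0xFFFF.
    pID3DDevice->SetIndices(pChunkIB, chunk * verticesPerChunk);

    pID3DDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                                      0,                  // MinIndex within the chunk
                                      verticesPerChunk,   // vertices referenced
                                      0,                  // StartIndex into the IB
                                      trianglesPerChunk); // primitives to draw
}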
If your card supports it, try D3DFMT_INDEX32? That's the format flag for larger (32-bit) indices.

What the hells!
Well, actually I'm rendering 5 height fields at 129x129 resolution (this is a 4th-year project on fractally constrained fractal generation, so I don't care about speed, I need to render it all).

Do I just use DrawPrimitive for that, or is there a way to have multiple vertex buffers?

(And I do use triangle strips)

PS. I did check: my GeForce 2 only supports 16-bit indices. That's the dumbest thing I've ever heard of, but what can you do.

Edited by - Ataru on April 29, 2001 9:41:27 AM
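For what it's worth, a 129x129 height field is only 16,641 vertices, which fits comfortably under the 16-bit limit, so one option is simply to give each of the 5 fields its own draw calls (and its own buffers). A rough sketch, assuming a per-field vertex buffer and a per-field 16-bit index buffer laid out as one triangle strip per grid row; names like pFieldIB, pFieldVB and HeightVertex are invented for the example:

// Each field: 129x129 = 16,641 vertices, so WORD indices are enough.
// The IB needs (GRID-1) * 2 * GRID = 33,024 WORD entries for one field.
const UINT GRID = 129;

WORD *pIdx = NULL;
pFieldIB->Lock(0, 0, (BYTE **)&pIdx, 0);
UINT n = 0;
for(UINT row = 0; row < GRID - 1; row++)
{
    for(UINT col = 0; col < GRID; col++)
    {
        pIdx[n++] = (WORD)( row      * GRID + col);  // vertex on this row
        pIdx[n++] = (WORD)((row + 1) * GRID + col);  // vertex on the next row
    }
}
pFieldIB->Unlock();

// Later, per field: draw one strip per row, 2*(GRID-1) triangles each.
pID3DDevice->SetStreamSource(0, pFieldVB, sizeof(HeightVertex));
pID3DDevice->SetIndices(pFieldIB, 0);
for(UINT row = 0; row < GRID - 1; row++)
{
    pID3DDevice->DrawIndexedPrimitive(D3DPT_TRIANGLESTRIP,
                                      0, GRID * GRID,   // conservative vertex range
                                      row * GRID * 2,   // start index of this row's strip
                                      (GRID - 1) * 2);  // triangles in one strip
}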

This topic is closed to new replies.
