Different behaviour when erroneously binding an index buffer in D3D11

1 comment, last by Freakill 10 years, 5 months ago

Hi everybody!

First of all I would like to say "Hello :)", as this is my first message on GameDev (despite having been reading it for a while).

I'm doing a project with C++ + DX11 + Kinect SDK (and learning while doing it) for university research. For character animation I'm using the Cal3D libraries, which, despite being old, the artist I'm working with is familiar with from previous projects. When loading a 3D model, Cal3D gives you the index buffer data as an array of CalIndex, that is:

typedef int CalIndex;

Once the vertex and index buffers were created, when binding the index buffer with IASetIndexBuffer I had mistakenly set the format to DXGI_FORMAT_R16_UINT (it was left over from manually creating basic 3D shapes for testing purposes). The funny thing is that on my desktop computer (with an ATI HD 7790), the model geometry rendered correctly:
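Roughly what the relevant code looks like (simplified for the post; device, context and the variable names are just placeholders, assuming the usual <d3d11.h> setup):

// Cal3D hands back 32-bit indices (CalIndex is a typedef of int)
std::vector<CalIndex> indices; // filled from the Cal3D mesh

D3D11_BUFFER_DESC ibDesc = {};
ibDesc.ByteWidth = (UINT)(indices.size() * sizeof(CalIndex)); // 4 bytes per index
ibDesc.Usage = D3D11_USAGE_DEFAULT;
ibDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;

D3D11_SUBRESOURCE_DATA ibData = {};
ibData.pSysMem = indices.data();

ID3D11Buffer* indexBuffer = nullptr;
device->CreateBuffer(&ibDesc, &ibData, &indexBuffer);

// The mistake: the buffer holds 32-bit indices, but this tells the input assembler they are 16-bit
context->IASetIndexBuffer(indexBuffer, DXGI_FORMAT_R16_UINT, 0);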

[Image: goodModel.png]

Please ignore the diffuse lighting glitches; that's something with the texture I'll have to figure out with the artist :)

But the really funny thing is that I discovered that, when running the same application on my laptop (with an NVIDIA GT520M), the model rendering was obviously incorrect:

[Image: badModel.png]

After a while I found the problem and solved it, but not in the way I expected. Despite CalIndex being a typedef of int, the format I had to specify is DXGI_FORMAT_R32_UINT. Shouldn't I have had to specify a 32-bit format, but as SINT? And also, why the hell was my ATI rendering the model correctly while my laptop's NVIDIA wasn't? Might it be something related to bus bandwidth, memory, or even how the graphics cards work internally?
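For reference, the one-line change that fixed it (same simplified names as above):

// Correct: the format now matches the actual 32-bit unsigned index data coming from Cal3D
context->IASetIndexBuffer(indexBuffer, DXGI_FORMAT_R32_UINT, 0);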

I must say that although I'm really proud of having solved this stupid issue, I now feel sad that I can't figure out what was really happening, and that I had to solve it more by trial and error than by really knowing what I'm doing :/

Thanks for your attention people!

Cheers!


The only allowed index buffer formats are DXGI_FORMAT_R32_UINT and DXGI_FORMAT_R16_UINT; you can't use SINT.

Setting R16 when the buffer actually holds R32 data should cause errors on all cards; I don't know why it works on AMD - maybe a driver bug.

Thanks N.I.B. for your reply. I didn't know I couldn't use anything other than UINT... Anyway, apart from that, I'm still trying to discover why it was working properly when it shouldn't have. If I find something I will write it here :)

