Hello! I am hoping someone can help me with this; I have had no luck searching the internet. I have been reading Frank Luna's book "Introduction to 3D Game Programming with DirectX 11". All was going well until I reached the point where a skull mesh is loaded from file, and the vertex buffer seemed to become too large. In case someone reading this is not familiar with the book, I have narrowed the problem down to its bare bones.

I follow the book's algorithm for generating hills: a flat grid whose y-coordinates are a function of cosines of x and z. If I make the grid, say, 100 by 100 (10,000 points), it renders fine. If I increase the detail to, say, 300 by 300, the grid ends up, for want of a better way of putting it, completely messed up: the triangles look jumbled, some apparently taking coordinates from almost random places. If I also load the skull mesh, it almost looks like bits of skull are connecting to bits of hill.

I have noticed that if I make my vertex structure a bit smaller, for example by removing the Color member, then the hills and skull draw correctly; but if I then increase the detail further, the corruption returns. So I am inclined to believe the problem has something to do with the amount of memory taken up by the vertex buffer. (A stripped-down sketch of what the relevant code does is at the end of this post.)

I have a small C++ Visual Studio solution that reproduces the problem. I tried getting this answered on stackoverflow.com, but got firmly knocked back, because apparently it is a great sin to offer to supply code; I wanted to supply the whole solution because there is no obvious ten lines of code that stand out as wrong to me. The solution is attached; to reproduce the bug, search for "showBug" and set it to true (or false to hide it!).[attachment=27650:BackgammonDesktop.zip]
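For concreteness, here is a stripped-down sketch of the kind of grid generation I am describing. The names (Vertex, GetHeight, BuildHillGrid) and the exact height function are illustrative rather than lifted from my project, which follows the book's version:

[code]
#include <d3d11.h>
#include <DirectXMath.h>
#include <vector>
#include <cmath>

using namespace DirectX;

// Position + colour vertex, as in the book's coloured hills demo.
struct Vertex
{
    XMFLOAT3 Pos;
    XMFLOAT4 Color;
};

// Hill height as a function of x and z (cosine-style, as in the book).
float GetHeight(float x, float z)
{
    return 0.3f * (z * std::sin(0.1f * x) + x * std::cos(0.1f * z));
}

// Builds an m x n grid of vertices spanning width x depth, plus the
// index list (two triangles per quad, 32-bit indices).
void BuildHillGrid(UINT m, UINT n, float width, float depth,
                   std::vector<Vertex>& vertices,
                   std::vector<UINT>& indices)
{
    vertices.resize(m * n);
    const float dx = width / (n - 1);
    const float dz = depth / (m - 1);

    for (UINT i = 0; i < m; ++i)
    {
        const float z = 0.5f * depth - i * dz;
        for (UINT j = 0; j < n; ++j)
        {
            const float x = -0.5f * width + j * dx;
            vertices[i * n + j].Pos   = XMFLOAT3(x, GetHeight(x, z), z);
            vertices[i * n + j].Color = XMFLOAT4(0.4f, 0.7f, 0.3f, 1.0f);
        }
    }

    indices.reserve(6 * (m - 1) * (n - 1));
    for (UINT i = 0; i < m - 1; ++i)
    {
        for (UINT j = 0; j < n - 1; ++j)
        {
            // First triangle of the quad.
            indices.push_back(i * n + j);
            indices.push_back(i * n + j + 1);
            indices.push_back((i + 1) * n + j);
            // Second triangle of the quad.
            indices.push_back((i + 1) * n + j);
            indices.push_back(i * n + j + 1);
            indices.push_back((i + 1) * n + j + 1);
        }
    }
}
[/code]

Note that the sketch uses 32-bit indices; whichever index type is actually used has to match the DXGI format passed to IASetIndexBuffer (DXGI_FORMAT_R32_UINT for UINT, DXGI_FORMAT_R16_UINT for 16-bit indices).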
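And this is roughly how the vertex buffer gets created (again a sketch, assuming an ID3D11Device* called device and the Vertex struct above):

[code]
// Immutable vertex buffer filled from the CPU-side vertex array.
D3D11_BUFFER_DESC vbd = {};
vbd.Usage     = D3D11_USAGE_IMMUTABLE;
vbd.ByteWidth = static_cast<UINT>(sizeof(Vertex) * vertices.size());
vbd.BindFlags = D3D11_BIND_VERTEX_BUFFER;

D3D11_SUBRESOURCE_DATA vinit = {};
vinit.pSysMem = vertices.data();

ID3D11Buffer* vertexBuffer = nullptr;
HRESULT hr = device->CreateBuffer(&vbd, &vinit, &vertexBuffer);
// hr is checked in my real code; an over-large buffer would fail here
// with an error HRESULT rather than silently draw garbage.
[/code]

At 300 by 300 that is 90,000 vertices of 28 bytes each (an XMFLOAT3 plus an XMFLOAT4), about 2.4 MB, which is small by D3D11 standards; still, the fact that shrinking the vertex struct changes the behaviour makes me think the total size matters somehow.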
Cheers!