Indices All Jacked Up Between Mac/iOS Build of the Same Code?

Attached below are screenshots of two scenes: one from a Mac game and one from an iOS game I'm working on. I'm using the same code to load and render the same OBJ file in both programs, but for some odd reason the indices come out VERY different in the Mac build. Could my graphics card be processing the data differently?

I'm currently storing my indices as an array of unsigned longs, with a VBO and IBO in both builds, but I get the same results even when I don't use the VBO/IBO.

I'm not sure where the problem lies, so rather than dumping all of the code, here's a stripped-down sketch of the relevant path below. If you need to look at anything else, I'll post it.
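This is roughly the index upload and draw call, heavily simplified; UploadIndices, DrawMesh, and the header paths here are stand-ins rather than my exact code:

#include <vector>
#include <OpenGL/gl.h>      // Mac build; the iOS build uses <OpenGLES/ES2/gl.h>

std::vector<GLuint> indices;   // filled in by the OBJ loader
GLuint ibo = 0;

void UploadIndices()
{
    glGenBuffers(1, &ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    // GLuint is always 4 bytes; a plain unsigned long is 8 bytes in a 64-bit
    // Mac build but 4 bytes on 32-bit iOS, so an explicitly sized type keeps
    // the buffer layout identical across both builds.
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indices.size() * sizeof(GLuint),
                 indices.data(),
                 GL_STATIC_DRAW);
}

void DrawMesh()
{
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glDrawElements(GL_TRIANGLES,
                   (GLsizei)indices.size(),
                   GL_UNSIGNED_INT,   // must match the element type stored in the buffer
                   0);                // byte offset into the bound IBO
}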

EDIT: I output the index list for both the Mac and iOS builds, and the data is loaded identically on both. I also use GL_UNSIGNED_INT for the 'type' parameter in glDrawElements(). Would that cause any issues? From some googling, it appears to be fine.
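If it matters, here are a couple of checks I can add (sketch only; SupportsUintIndices and CheckDrawError are new helpers, not existing code). Desktop GL always accepts GL_UNSIGNED_INT indices, while OpenGL ES 2.0 only does if GL_OES_element_index_uint is advertised, and glGetError() right after the draw would flag a bad 'type' argument as GL_INVALID_ENUM:

#include <cstdio>
#include <cstring>
#include <OpenGL/gl.h>      // or the ES2 header on iOS

// Returns true if 32-bit indices are usable on an ES 2.0 context.
bool SupportsUintIndices()
{
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, "GL_OES_element_index_uint") != NULL;
}

// Call immediately after glDrawElements().
void CheckDrawError()
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        printf("glDrawElements error: 0x%04X\n", err);   // e.g. GL_INVALID_ENUM
}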

EDIT2: False alarm. I switched my indices back to unsigned short and used GL_UNSIGNED_SHORT as the type when passing index data to glDrawElements(), and that works. It appears my laptop's graphics card does not support GL_UNSIGNED_INT indices.
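For anyone hitting the same thing, the working variant looks roughly like this (again just a sketch with placeholder names); the trade-off is that 16-bit indices cap each mesh at 65,536 addressable vertices:

#include <vector>
#include <OpenGL/gl.h>      // or the ES2 header on iOS

std::vector<GLushort> indices;   // GLushort instead of unsigned long

void UploadIndices16(GLuint ibo)
{
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indices.size() * sizeof(GLushort),   // 2 bytes per index
                 indices.data(),
                 GL_STATIC_DRAW);
}

void DrawMesh16()   // assumes the IBO is still bound
{
    glDrawElements(GL_TRIANGLES,
                   (GLsizei)indices.size(),
                   GL_UNSIGNED_SHORT,   // matches the 2-byte element type
                   0);
}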
