Vincent_M

Indices All Jacked Up Between Mac/iOS Build of the Same Code?


Attached below, you'll see screenshots of two scenes: one from a Mac game and one from an iOS game I'm working on. I'm using the same code to load and render the same OBJ file in both programs, but for some odd reason, the indices come out VERY different in the Mac build of the app. Could my graphics card be processing the data differently?

I'm currently storing my indices as an array of unsigned longs. I use a VBO and IBO in both builds, but I get the same results even when I render without the VBO/IBO.
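For context, here's roughly what my index upload looks like (a trimmed sketch, not my exact code; uploadIndices and the parameter names are placeholders). One thing I noted in the comments: GLuint is guaranteed to be 32 bits, while unsigned long can be 64 bits in a 64-bit Mac build, so the element size handed to glBufferData has to match the type later passed to glDrawElements.

    #include <OpenGL/gl.h> // <OpenGLES/ES2/gl.h> on iOS

    // Upload 32-bit indices into an IBO. GLuint is guaranteed 32 bits;
    // unsigned long is 64 bits on 64-bit macOS, so sizing the buffer off
    // the wrong type would scramble the index data.
    static GLuint uploadIndices(const GLuint *indices, GLsizei indexCount)
    {
        GLuint ibo;
        glGenBuffers(1, &ibo);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                     indexCount * sizeof(GLuint), // 4 bytes per index
                     indices, GL_STATIC_DRAW);
        return ibo;
    }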

I'm not sure where the problem lies, so I didn't post any code. If you need to look at something, I'll post it.

EDIT: I output the index list for both the Mac and iOS builds, and the data is loaded identically. I also pass GL_UNSIGNED_INT as the 'type' parameter to glDrawElements(). Would that cause any issues? From some googling, it appears to be fine.
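For reference, the draw call itself is just the usual glDrawElements() with the IBO bound (sketch below, placeholder names again). One way to sanity-check 32-bit index support: desktop GL accepts GL_UNSIGNED_INT in core, but OpenGL ES 2.0 only allows it when the GL_OES_element_index_uint extension is present, so the iOS side can be probed like this:

    #include <string.h>

    // Returns nonzero if 32-bit element indices are advertised.
    // (On desktop GL they are core; this check matters on ES 2.0.)
    static int supportsUintIndices(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext && strstr(ext, "GL_OES_element_index_uint") != NULL;
    }

    static void drawMesh(GLuint ibo, GLsizei indexCount)
    {
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
        // Last argument is a byte offset into the bound IBO.
        glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, 0);
    }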

EDIT2: False alarm. I switched my indices back to unsigned short, used GL_UNSIGNED_SHORT as the type I pass to glDrawElements(), and this seems to work. It appears that my laptop's graphics card does not support GL_UNSIGNED_INT indices. Edited by Vincent_M
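In case anyone hits the same thing, the working version looks roughly like this (a sketch with placeholder names; it assumes no mesh needs more than 65,535 vertices, the ceiling for 16-bit indices):

    #include <OpenGL/gl.h> // <OpenGLES/ES2/gl.h> on iOS

    // Upload and draw with 16-bit indices, which every GL and GL ES
    // implementation supports.
    static void drawMesh16(const GLushort *indices, GLsizei indexCount)
    {
        GLuint ibo;
        glGenBuffers(1, &ibo);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                     indexCount * sizeof(GLushort), // 2 bytes per index now
                     indices, GL_STATIC_DRAW);
        glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, 0);
    }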
