Indices All Jacked Up Between Mac/iOS Build of the Same Code?

This topic is 2049 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

Attached below are screenshots of two scenes: one from a Mac game and one from an iOS game I'm working on. I'm using the same code to load and render the same OBJ file in both programs, but for some reason the indices seem to be interpreted very differently in the Mac build. Could my graphics card be processing the data differently?

I'm currently storing my indices as an array of unsigned longs. I use a VBO and IBO in both builds, but I get the same results even when I don't use the VBO/IBO.

I'm not sure where the problem lies, so I didn't post any code. If you need to look at something, I'll post it.

EDIT: I printed the index list from both the Mac and iOS builds, and the data is loaded identically. I also pass GL_UNSIGNED_INT as the 'type' argument to glDrawElements(). Could that cause any issues? From some googling, it appears to be fine.
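For what it's worth, on OpenGL ES 2.0 devices GL_UNSIGNED_INT index support is not core; it requires the OES_element_index_uint extension. A small helper like this (hypothetical, not from the original post) can test for it against the space-separated list returned by glGetString(GL_EXTENSIONS):

```c
#include <string.h>

/* Returns 1 if `name` appears as a whole token in the space-separated
 * GL extension string, 0 otherwise. Exact-token matching avoids false
 * positives when one extension name is a prefix of another. */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        if ((p == extensions || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}

/* Usage (needs a current GL context):
 *   if (!has_extension((const char *)glGetString(GL_EXTENSIONS),
 *                      "GL_OES_element_index_uint"))
 *       // fall back to GL_UNSIGNED_SHORT indices
 */
```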

EDIT2: False alarm. I switched my indices back to unsigned short, passed GL_UNSIGNED_SHORT as the type to glDrawElements(), and that works. It appears my laptop's graphics card does not support GL_UNSIGNED_INT indices. Edited by Vincent_M
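The fix above amounts to narrowing 32-bit indices to 16-bit so glDrawElements() can always be called with GL_UNSIGNED_SHORT, which every implementation supports. A minimal sketch of that conversion (the function name and error convention are illustrative, not from the post) with a range check, since meshes with more than 65536 vertices can't be indexed this way:

```c
#include <stdint.h>
#include <stddef.h>

/* Narrow 32-bit indices to 16-bit for use with GL_UNSIGNED_SHORT.
 * Returns 0 on success, or -1 if any index exceeds 0xFFFF (meaning
 * the mesh is too large for 16-bit indices and must be split). */
int indices_to_ushort(const uint32_t *src, uint16_t *dst, size_t count)
{
    for (size_t i = 0; i < count; ++i) {
        if (src[i] > 0xFFFF)
            return -1; /* index won't fit in 16 bits */
        dst[i] = (uint16_t)src[i];
    }
    return 0;
}
```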

