ChugginWindex

Practicality of index buffers


Recommended Posts


Position, UV coords, normals etc. are in floating-point precision. How could a memcmp() possibly hold up under those circumstances? Even seemingly identical values will be slightly different and therefore have different binary values.


Floating-point values that are assigned via the assignment operator / constructor will be perfect bitwise representations of the source they were assigned from, so memcmp() works (so long as you don't perform any arithmetic on the value).

Of course, if you want a "close enough to weld" test, then you have to do it manually.
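For illustration, a minimal sketch of both kinds of test, assuming a simple interleaved, all-float vertex struct (the layout and names are just an example, not anyone's actual format):

```cpp
#include <cmath>
#include <cstring>

// Hypothetical interleaved vertex layout: all floats, no padding.
struct Vertex {
    float position[3];
    float normal[3];
    float uv[2];
};

// Exact bitwise comparison. Reliable when both vertices were assigned straight
// from the parsed source values and no arithmetic has touched them since.
// (Note it treats +0.0f and -0.0f as different, unlike operator==.)
bool SameVertexExact(const Vertex &a, const Vertex &b) {
    return std::memcmp(&a, &b, sizeof(Vertex)) == 0;
}

// A "close enough to weld" test has to be written by hand, and the tolerance
// is a choice you have to make yourself.
bool SameVertexWelded(const Vertex &a, const Vertex &b, float epsilon) {
    for (int i = 0; i < 3; ++i) {
        if (std::fabs(a.position[i] - b.position[i]) > epsilon) { return false; }
        if (std::fabs(a.normal[i]   - b.normal[i])   > epsilon) { return false; }
    }
    for (int i = 0; i < 2; ++i) {
        if (std::fabs(a.uv[i] - b.uv[i]) > epsilon) { return false; }
    }
    return true;
}
```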


All of these problems can obviously be solved by taking the index buffer generation process offline as we've already discussed, but my curiosity is getting the better of me and I'm starting to wonder if there aren't some key tricks to this entire thing that I'm missing that go beyond simple index buffer construction. Stuff like dynamic vertex attribute specifications as opposed to static ones (I've got a fairly simple dynamic system implemented at the moment, but it's the exact reason I'm having trouble with a lot of this fast comparison stuff) and type identification (are these values being sent to the GPU floats or ints?). Is there some sort of advanced book on best practices somewhere that I can pick up for this? I've got a copy of the superbible and a few others, but none of them do much more than skim the surface on how to apply this well in a decent scope.

Moving it offline does not solve anything.
Here, I am discussing how to make your processing of vertices faster, whether online or offline.
As mentioned above, you want an exact comparison. This is actually important for a few reasons but I will not go into that now.

You mentioned calculating normals. Why would you need to do this? This should be part of the file data that you read and store into your buffers.
In fact it must be part of the data you read from the file, because you can't otherwise know whether a given edge should have soft normals or hard normals. If you have a cube, its edges will be hard: each face's vertices simply use that face's triangle normal.
From the sounds of it (you have flat shading at the moment), you are doing this for every normal, which is a flaw in any case when you want the surface to be smooth/rounded.
To get these normals, at each vertex, you have to average the normals of all of the triangles that share that vertex.
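A rough sketch of that averaging, assuming a position array plus a 32-bit triangle index list where shared vertices really are shared (the types and names here are only illustrative):

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 Sub(const Vec3 &a, const Vec3 &b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 Cross(const Vec3 &a, const Vec3 &b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

// For every triangle, add its (unnormalized, hence area-weighted) face normal
// to each of its three vertices, then normalize the accumulated sums.
std::vector<Vec3> SmoothNormals(const std::vector<Vec3> &positions,
                                const std::vector<uint32_t> &indices) {
    std::vector<Vec3> normals(positions.size(), Vec3{ 0.0f, 0.0f, 0.0f });
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3) {
        const Vec3 &p0 = positions[indices[i]];
        const Vec3 &p1 = positions[indices[i + 1]];
        const Vec3 &p2 = positions[indices[i + 2]];
        Vec3 n = Cross(Sub(p1, p0), Sub(p2, p0));   // face normal of this triangle
        for (std::size_t k = 0; k < 3; ++k) {
            Vec3 &dst = normals[indices[i + k]];
            dst.x += n.x; dst.y += n.y; dst.z += n.z;
        }
    }
    for (Vec3 &n : normals) {
        float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
        if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
    }
    return normals;
}
```

Note that this only works once you have an index buffer; with a raw, un-indexed triangle soup every corner looks unique and nothing gets averaged.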

But again, the 3D software used to make the model will do this for you, because the author of the 3D model may specify that this edge is hard (use the normal of the triangle as the normal of the vertex) or soft (average the normals of neighboring triangles).

Normals should always be read from the file data.


L. Spiro


From the sounds of it (you have flat shading at the moment), you are doing this for every normal, which is a flaw in any case when you want the surface to be smooth/rounded.
To get these normals, at each vertex, you have to average the normals of all of the triangles that share that vertex.

But again, the 3D software used to make the model will do this for you, because the author of the 3D model may specify that this edge is hard (use the normal of the triangle as the normal of the vertex) or soft (average the normals of neighboring triangles).

Normals should always be read from the file data.


That's what I want to do right now: average the normals of all the triangles that share a vertex. I can't do that because I don't HAVE the vertices indexed at the moment, so there's no notion of sharing, which is why I was under the impression that I needed an "almost" comparison instead of an exact one.

The reason the model is not providing normals is that I'm currently working with the Stanford OBJ files, which don't provide them (bunny, buddha, dragon, etc.). It's just raw vertex data. If you can point me to versions of these OBJ files that contain the normals, along with an assurance that the scenario I'm dealing with (calculating normals myself) is something I shouldn't ever have to worry about, I'd be happy to drop this and go my merry way :)

I can’t point you to normal-included versions of those models because I never used those models.
I know they are helpful for getting a fast result, but in reality every model you load will either have normals included or have no need for normals.

I would suggest dropping this line of work and not wasting more time parsing OBJ files.
FBX and COLLADA are what you should be parsing; everything can be converted to and from FBX and COLLADA.

You should also get real-world model data, with textures and normals.
http://lspiroengine.com/?p=73: all of the high-quality models shown there came from http://www.gfx-3d-model.com/ and are all free.
If you insist on sticking to OBJ, they usually have OBJ files for each model as well.


L. Spiro


I would suggest dropping this line of work and not wasting more time parsing OBJ files.
FBX and COLLADA are what you should be parsing; everything can be converted to and from FBX and COLLADA.


That's my end goal. I've just never done any model loading before, so I started with the basics. COLLADA and FBX are great formats that I looked at as well, but ultimately the amount of extra information they contain made manually loading them a bit of a nightmare to start with. I don't want to use something like ASSIMP because I don't like pulling a monolithic codebase into a project for a single use like that. I'll move to one of those formats soon enough, however, as the OBJ stuff is starting to annoy me.

Your site is impressive as well, and I hadn't found that website you linked to when looking for free models. Thanks for that!


As mentioned above, you want an exact comparison. This is actually important for a few reasons but I will not go into that now.

Please do go into that.


If I had to guess, one of the main reasons is the exact opposite of why I initially thought I didn't want an exact match: anything but an exact match is going to create vertex sharing where there should be none. You'll lose information such as proper UV coordinates, or normals, or both, which can be a real problem on discrete boundaries like the faces of a cube. If there's more to it than that, I'd be interested in hearing the justifications as well :)


Anything but an exact match is going to create vertex sharing where there should be none. You'll lose information such as proper UV coordinates, or normals, or both, which can be a real problem on discrete boundaries like the faces of a cube. If there's more to it than that, I'd be interested in hearing the justifications as well :)

I don't understand that. If the vertices are virtually identical, why wouldn't I want to share them?


I don't understand that. If the vertices are virtually identical, why wouldn't I want to share them?


Because your notion of "virtually identical" and my notion of it can be different, and more importantly yours may not be the same as the one the model was exported with. If the model specifically states that the three vertices at a corner of a cube share the same position but have distinctly different normals or UV coordinates, you don't want your model loader / builder making assumptions such as "normals can be different as long as the positions match" when determining indexable vertices, or else you'll lose that information.


I don't understand that. If the vertices are virtually identical, why wouldn't I want to share them?


How do you define the tolerance for "close enough"? What happens if you're dealing with geometry that's on the order of 10 km, vs 10 mm?
With normals, an artist can specify a hard edge with a minuscule variation of the normal on either side, but it's still a hard edge.

If you are worried about floating-point precision issues in the results of parsing the values from the ASCII, then use the ASCII strings themselves to determine whether two vertices are identical.
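For instance, a minimal sketch of that idea, keying each unique vertex on the exact text it was parsed from (the struct and function names here are just placeholders):

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// Each unique vertex is identified by its source text (for OBJ, the
// "v/vt/vn" token of a face corner, or the concatenated attribute strings).
// Identical text always parses to identical floats, so no tolerance is needed.
struct IndexedMesh {
    std::vector<std::string> uniqueCorners;  // parse these into the vertex buffer afterwards
    std::vector<uint32_t>    indices;
};

void AddCorner(IndexedMesh &mesh,
               std::unordered_map<std::string, uint32_t> &lookup,
               const std::string &cornerText) {
    auto it = lookup.find(cornerText);
    if (it == lookup.end()) {
        // First time this exact corner has been seen: emit a new vertex.
        uint32_t newIndex = static_cast<uint32_t>(mesh.uniqueCorners.size());
        mesh.uniqueCorners.push_back(cornerText);
        lookup.emplace(cornerText, newIndex);
        mesh.indices.push_back(newIndex);
    } else {
        // Seen before: reuse the existing vertex index.
        mesh.indices.push_back(it->second);
    }
}
```

Because the key is the literal text, two corners only share an index when every attribute reference is identical, which also preserves the hard-edge cases discussed above.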
