Steve_Segreto

Posted 15 March 2013 - 01:06 AM

First, let's talk about the resources, both on disk and as represented in memory by your engine. You have three distinct pieces: polygonal meshes, skeletons (hierarchies of bones), and keyframes (rotations, translations, and scales).

 

I suggest a superset of all possible bones. Thus, if you support both bipedal and quadrupedal skeletons, your master bone list is a superset of both skeletons. The keyframe data then simply indexes into this 0-based bone superset to specify which bone(s) each keyframe operates on. This allows a single keyframe animation to work on several different characters' skeletons. A minimal sketch of the idea follows (all names here are hypothetical, not from an existing engine):

#include <cstdint>
#include <string>
#include <vector>

// The one shared, 0-based master list: a superset of every bone that
// appears in any skeleton (biped + quadruped + ...).
struct MasterBoneList {
    std::vector<std::string> names;
};

struct Skeleton {
    // For each bone this skeleton actually has, its index in the master list.
    std::vector<uint32_t> masterIndices;
    std::vector<int32_t>  parents;        // hierarchy; -1 marks the root
};

struct Keyframe {
    float    time;
    uint32_t masterBoneIndex;             // references the superset, not one skeleton
    float    rotation[4];                 // quaternion
    float    translation[3];
    float    scale[3];
};

// A skeleton simply ignores keyframes whose masterBoneIndex it does not
// contain, which is what lets one animation drive several skeletons.

 

It may or may not be possible to share a skeleton across several polygonal meshes. If it is possible, do it, as it will cut back on disk resources. It will definitely be possible to specify multiple different textures per section of each polygonal mesh. This is a cheap and effective form of 3D palette swapping and can multiply your visual asset variety considerably.
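To make the palette-swap idea concrete, here is one hedged way the data could be laid out; the structs below are assumptions for illustration, not a prescribed format:

#include <cstdint>
#include <string>
#include <vector>

struct MeshSection {
    uint32_t firstIndex;                  // range of the index buffer this section draws
    uint32_t indexCount;
};

struct Mesh {
    std::vector<MeshSection> sections;
    // One entry per palette: a texture name for every section of the mesh.
    std::vector<std::vector<std::string>> texturePalettes;
};

// An actor instance then needs only the mesh pointer and a palette index,
// so one set of polygons yields many visually distinct characters.
struct ActorInstance {
    const Mesh* mesh;
    uint32_t    paletteIndex;             // the cheap 3D palette swap
};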

 

Now, when it comes to loading this stuff into memory, we have concerns beyond conserving it. Consider this: it's attractive to think of saving some memory per actor instance by having similar models share the same skeleton (i.e., every horse shares the horse skeleton, so there is only one copy of the horse skeleton's bones, one copy of the horse polygonal mesh, and one copy of each unique horse texture). This scheme would require serialized rendering of all horses, as you would need to pose each horse's shared set of bones prior to rendering it. Furthermore, you would have only one copy of the polygons for the horse mesh and would need to set the correct horse texture prior to rendering it.
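A sketch of why the sharing forces serialization, assuming hypothetical helpers (PoseSkeleton, SetTexture, DrawMesh) rather than any real API: because every horse aliases the same bone array, each instance must pose, bind, and draw before the next instance may touch the shared data.

#include <vector>

struct Skeleton;                          // the single shared copy of the bones
struct Mesh;                              // the single shared copy of the polygons

struct HorseInstance {
    int   animationId;                    // which keyframe set this horse plays
    float animationTime;
    int   paletteIndex;                   // which texture variant to bind
};

// Hypothetical engine calls, declared only to make the loop read clearly.
void PoseSkeleton(Skeleton& bones, int animationId, float time);
void SetTexture(int paletteIndex);
void DrawMesh(const Mesh& mesh, const Skeleton& posedBones);

void RenderHorses(const std::vector<HorseInstance>& horses,
                  Skeleton& sharedSkeleton, const Mesh& sharedMesh)
{
    for (const HorseInstance& horse : horses) {
        // 1. Overwrite the one shared bone set with this horse's pose.
        PoseSkeleton(sharedSkeleton, horse.animationId, horse.animationTime);

        // 2. Bind this horse's texture variant (the 3D palette swap).
        SetTexture(horse.paletteIndex);

        // 3. Draw now; the next iteration clobbers the shared bones,
        //    so the horses cannot be batched or rendered in parallel.
        DrawMesh(sharedMesh, sharedSkeleton);
    }
}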

 

Additionally, since you have chosen GPU skinning, you must be careful not to exceed the number of constant registers by producing skin partitions that contain too many bones. A skin partition is a sub-section of a polygonal mesh (possibly the whole mesh) that is influenced by a subset of bones or textured with a different texture. Therefore your assets must not have skin partitions with more than roughly 50 to 60 bones per partition (for the 256 constant registers of vs_2_0). If your model's skin partitions approach this limit, you are again forced to serialize rendering in order to load each skin partition's bone palette into the constant registers prior to rendering.
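The back-of-the-envelope budget behind the 50-60 figure looks like this; the number of reserved registers is an assumption and varies per shader:

// vs_2_0 guarantees 256 float4 constant registers.
const int kConstantRegisters = 256;
// A 4x3 skinning matrix packs into 3 float4 registers.
const int kRegistersPerBone  = 3;
// Assumption: registers reserved for world-view-projection, lights, material.
const int kReservedRegisters = 32;

const int kMaxBonesPerPartition =
    (kConstantRegisters - kReservedRegisters) / kRegistersPerBone;  // ~74 hard cap

// Targeting 50-60 bones leaves headroom for shaders that reserve more
// constants, which is why the asset pipeline should split ("partition")
// any mesh whose bone influences exceed that budget.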

 

I won't hazard any suggestions on how you can remove these obstacles to parallelization, but I think you can see the ramifications of this choice; it will define the scalability of your engine to a great degree.

