Game Framework: Interaction between Renderer, Effects, Objects & Models

Currently I'm working on a framework for my next project, and I'm having trouble finding an elegant solution for the rendering loop. My current loop works well, but it feels like it's going in the wrong direction, and I won't know for sure until I hit a dead end. I'm curious how some of you handle the interactions between the effect manager, renderer, objects, and models, and what you think of my current setup. I haven't implemented a scene graph yet, because right now I'm just trying to render at least 100 objects while maintaining a good frame rate.

Here is how my interactions work so far. When I create a new object and add it to the object manager, that object gets the model id of the mesh it wants to use from the model manager (static or dynamic model), and it gets the effect id from the effect manager for the effect it wants to use to render the object and its model. I use model/effect ids so that multiple objects can share the same model and effect.

Once the object has determined which model and effect it is going to use, it binds all its data to a parameter map in that effect. By that I mean the object passes references to all of its variables (world matrix, etc.) into that effect's parameter map. The object also tells the model to bind all of its subset texture ids to the parameter map. So now each object has an effect id, a model id, and a parameter map id. Because everything is bound by reference, when the object needs to be drawn the effect already knows where that object's variables are in memory, and where all the textures used to render the mesh's subsets are.

On a side note, I added the ability for an object to use a different texture map (model subset textures). This lets the player create a custom texture map (UV map) for all the subsets of a mesh and use it instead of the texture map that is loaded with the mesh at creation time. Since every texture is loaded through the texture manager, every object now has a texture map id, a model id, a parameter map id, and an effect id.

Each effect in the effect manager also has a list of constant parameters (lights, fog, etc.) that the effect file uses and that don't change per object. They can still change every frame if needed, simply by updating the variable that is bound (referenced) to the effect.

After updating (player movement, AI, collision) every object in the object manager, it's time to render. The renderer tells the effect manager to render all the current parameter maps for each of its effects (multiple passes, multiple subsets, however the parameter map is laid out).

To me it's starting to look like all the rendering is really left to the effect manager, because it contains the parameter maps, which hold references to the memory locations of all the active objects'/models' variables. And if an object is using a model, the parameter map knows which textures (UV maps) belong to which subset of that model. So the effect just loops through all the subsets, sets each subset's texture, then tells the model to call its mesh->DrawSubset(i).

Sorry for the massive wall of text. I was just hoping someone could review my interactions, or explain how they handle the interactions between these systems. This stuff is really starting to make my brain explode.
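To make that less abstract, here's a boiled-down C++ sketch of the binding and the render loop I described. The names (ParameterMap, Effect, EffectManager, etc.) are simplified stand-ins and the D3D calls are stubbed out, so this is just the shape of the idea, not my real code:

```cpp
#include <cstdio>
#include <vector>

struct Matrix { float m[16]; };

// One entry per mesh subset: which texture (by texture-manager id) to set for it.
struct SubsetBinding {
    int subsetIndex;
    int textureId;
};

// The parameter map: references into the owning object's data, plus per-subset textures.
// The effect reads through these pointers every frame, so nothing is copied.
struct ParameterMap {
    const Matrix* worldMatrix = nullptr;   // points at the object's world matrix
    std::vector<SubsetBinding> subsets;    // filled in by the model when the object binds
    int modelId = -1;
    bool visible = true;                   // cleared when the object is culled this frame
    bool alive   = true;                   // cleared when the object unbinds itself
};

// Stand-in for the model manager / mesh->DrawSubset(i) call.
struct Model {
    int subsetCount = 0;
    void DrawSubset(int i) const { std::printf("  draw subset %d\n", i); }
};

// One effect (shader) plus the parameter maps of every object rendering with it.
// Constant parameters (lights, fog, ...) would live here once, not per object.
struct Effect {
    std::vector<ParameterMap> boundObjects;

    void Render(const std::vector<Model>& models) {
        // Begin/EndPass and the actual SetMatrix/SetTexture calls are omitted here.
        for (const ParameterMap& pm : boundObjects) {
            if (!pm.alive || !pm.visible) continue;     // skipped this frame
            const Model& model = models[pm.modelId];
            // upload *pm.worldMatrix to the shader here
            for (const SubsetBinding& sb : pm.subsets) {
                // set texture sb.textureId via the texture manager here
                model.DrawSubset(sb.subsetIndex);
            }
        }
    }
};

// The "renderer" really just walks the effects; the effects do the per-object work.
struct EffectManager {
    std::vector<Effect> effects;
    void RenderAll(const std::vector<Model>& models) {
        for (Effect& fx : effects) fx.Render(models);
    }
};
```

The point is that the effect never copies object data; it only walks the references the objects bound at creation time.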
Thanks for taking the time to read and help :) PS: each parameter map also lets me tell the effect manager which objects' parameter maps are not going to be rendered this frame, due to culling or any other reason. And if an object dies, it unbinds itself from its parameter map so it can safely be removed.
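And roughly how that culling flag and the unbind-on-death step fit into the same placeholder types from the sketch above (again, simplified names, not my actual classes):

```cpp
// Each object remembers which effect and which slot in that effect's list
// its parameter map lives in, so it can flip the flags directly.
struct GameObject {
    int effectId = -1;
    int paramMapIndex = -1;
    Matrix world{};

    // Culling pass: the map stays bound, it just isn't rendered this frame.
    void SetVisible(EffectManager& fx, bool v) const {
        fx.effects[effectId].boundObjects[paramMapIndex].visible = v;
    }

    // On death: flag the map so the effect skips it and the slot can be reclaimed
    // safely, since the effect only holds references into this object's data.
    void Unbind(EffectManager& fx) const {
        fx.effects[effectId].boundObjects[paramMapIndex].alive = false;
    }
};
```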
