Updating mesh representation when mesh changes in rendering system

Started by LargeJ
8 comments, last by Shaarigan 6 years, 6 months ago

I am wrapping my head around dynamic meshes. At the moment I want my mesh to be able to change its geometry during execution, and let all dependent systems know they have to update their state based on the mesh data (primarily the rendering system). I have an event system in place through which I can send mesh-changed events, but I'm wondering if sending this event from inside my mesh class makes sense. It feels like a mixing of responsibilities, because I'd like my mesh to be as elementary as possible.

How can I cleanly update the mesh representations inside the rendering system?


I'm not sure why you want to send events. You just manipulate this dynamic mesh and render it each frame "as is": if it changes, it will be rendered as changed. Why should the rendering system update its state based on some mesh that changed? It just gets a task to render the mesh and does so; it doesn't care whether the mesh never changes or just changed in the last frame. Or am I missing something here?


Where are we and when are we and who are we?
How many people in how many places at how many times?

The mesh has a collection of vertices, triangles and assigned materials, and an interface for changing the geometry in a convenient way. It is not stored in a form that is efficient to render immediately. For rendering I am using OpenGL, so I must create a vertex buffer object and store the data on the GPU. In other words, I have decoupled the mesh representation from how the rendering system represents the object.
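
For illustration, a CPU-side mesh along these lines might look roughly like this (a minimal sketch; the type and member names are hypothetical, not from any particular engine):

    #include <cstdint>
    #include <vector>

    struct Vertex {
        float position[3];
        float normal[3];
        float uv[2];
    };

    struct Triangle {
        std::uint32_t indices[3];    // indices into the vertex array
        std::uint32_t materialId;    // material assigned to this triangle
    };

    // CPU-side mesh: convenient to edit, but not yet in a GPU-friendly layout.
    struct Mesh {
        std::vector<Vertex>   vertices;
        std::vector<Triangle> triangles;
    };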

When the mesh geometry changes I can directly instruct the GPU to update its representation, but I thought an event system would be much more convenient, giving less coupling between the systems.

So at this moment, when I update the vertices or triangles, the mesh geometry has changed and I send a "mesh geometry changed" event. These events are sent from inside the Mesh class, but I am not sure if this is a proper way to solve the problem.
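
Concretely, the pattern in question looks roughly like this (a sketch; EventBus and MeshGeometryChanged are hypothetical stand-ins for whatever the event system provides, and Mesh is the sketch from above):

    // Hypothetical event payload emitted whenever the geometry changes.
    struct MeshGeometryChanged {
        const Mesh* mesh;
    };

    void Mesh::setVertexPosition(std::size_t i, float x, float y, float z) {
        vertices[i].position[0] = x;
        vertices[i].position[1] = y;
        vertices[i].position[2] = z;
        // The mesh itself notifies all subscribers (e.g. the rendering system).
        eventBus->publish(MeshGeometryChanged{ this });
    }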

If I understand correctly, your mesh just shoots out "Mesh Changed" events whenever it needs to, and other systems are essentially 'subscribed' to these events? It sounds like this isn't a bad way to solve the problem. The mesh doesn't know/care about what happens to other systems when it updates, which seems like a good separation of concerns to me. Maybe someone more experienced with graphics programming can chime in with their experience.

Considering this method doesn't seem obviously bad, I think what's most important is to keep working but pay close attention to how this decision affects the rest of your project. If your code starts getting messy because of it, make the call and refactor. Best practices can only take you so far; often the "best" way to solve a problem depends a lot on the specifics of your project. Developing a good sense of objectivity and awareness of your architectural decisions is invaluable :) The best way to do that is to bite the bullet and make some decisions!

Normally, the one that changed the mesh should also trigger a VBO update, so that none of your systems except the one that made the changes is responsible for pushing the GPU update.
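
In other words, something along these lines (a sketch; the names are made up):

    // The system that edits the mesh is also the one that pushes the GPU update;
    // no other system needs to be notified.
    void TerrainEditor::raiseVertices(Mesh& mesh, GpuMeshCache& gpuCache) {
        applyHeightChanges(mesh);   // mutate the CPU-side vertices
        gpuCache.upload(mesh);      // immediately refresh the VBO for this mesh
    }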

Seconding noizex: Assuming we're discussing a game setup here, it is usually not necessary to send events at all. Please notice that a game's run loop is a well-defined sequence of processing steps. Mesh generation, for example, happens before graphical rendering. Mesh deformer calculation (skeleton animation, ...), in turn, happens before mesh generation. Moreover, each processing step should not do tasks it is not responsible for (a.k.a. the single responsibility principle). The graphical rendering should not know about the structure of the manipulated mesh, because its job is to "just" feed the GPU.
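
Expressed as code, the point is that a fixed per-frame sequence makes change notifications unnecessary (a sketch; the system names are hypothetical):

    void Game::runFrame(float dt) {
        input.update();
        animation.updateDeformers(dt);   // e.g. skeleton animation
        meshGeneration.update();         // (re)build the CPU-side meshes
        gpuUpload.update();              // push any changed meshes into their VBOs
        renderer.render();               // "just" feeds the GPU, knows nothing else
    }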

On 10/16/2017 at 3:58 AM, LargeJ said:

 I have an event system in place in which I can send mesh-changed events, but Im wondering if sending this event from inside my mesh class makes sense.

This doesn't make much sense. The Mesh class is responsible for storing the mesh data; it should be the listener for this event, not the sender. Whatever is manipulating the mesh should send the event. The Mesh class should listen for it, then update its buffers/data, and then maybe send a "render state changed" event if needed (I'm assuming you're trying to load a different model or change the way the data is stored in the buffer). Then the renderer can listen for this and do whatever it needs to. However, I don't think you'll really have to change the render state if the geometry is deformed/changed. You might have to change the current shader instead, depending on what you're trying to achieve. I don't know how you're changing the geometry, so you may need to use glBufferSubData().
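
For reference, glBufferSubData() updates a region of an already-allocated buffer in place; assuming a vbo handle and a vertices array like the ones discussed above, it would look like this:

    // Re-upload the changed vertex data into the existing VBO.
    // The buffer must already have been allocated with glBufferData,
    // ideally with GL_DYNAMIC_DRAW as the usage hint.
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferSubData(GL_ARRAY_BUFFER,
                    0,                                  // byte offset into the buffer
                    vertices.size() * sizeof(Vertex),   // number of bytes to replace
                    vertices.data());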

Thanks for all the input. I am actually looking for an easy and efficient way to update mesh data, because I like to experiment with applying shaders, changing geometries, instancing and such, all on the fly, without having to worry too much about calling the render system to update the representation.

I was thinking about representing a mesh as a pure data structure, containing only triangles, vertices, etc., plus some useful methods to manipulate the mesh data. This mesh must then be transformed into a 'my-engine-specific' vertex buffer description that the render system consumes (and stores in an OpenGL vertex buffer object).
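
That transformation could be as simple as flattening the mesh into an interleaved float array plus an index list (a sketch; VertexBufferDescription is a placeholder for the engine-specific type):

    // Hypothetical engine-specific description that the render system consumes.
    struct VertexBufferDescription {
        std::vector<float>         interleaved;  // position/normal/uv per vertex
        std::vector<std::uint32_t> indices;      // three per triangle
    };

    VertexBufferDescription buildDescription(const Mesh& mesh) {
        VertexBufferDescription desc;
        for (const Vertex& v : mesh.vertices) {
            desc.interleaved.insert(desc.interleaved.end(), v.position, v.position + 3);
            desc.interleaved.insert(desc.interleaved.end(), v.normal,   v.normal + 3);
            desc.interleaved.insert(desc.interleaved.end(), v.uv,       v.uv + 2);
        }
        for (const Triangle& t : mesh.triangles)
            desc.indices.insert(desc.indices.end(), t.indices, t.indices + 3);
        return desc;
    }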

Digesting the vertex buffer object leads to a collection of render calls in the render system. These calls are executed every frame and can be sorted for the fewest state changes. I was considering the following use case:

  • As a user of my engine, I would like to easily change the appearance of a mesh by assigning new materials to any (random) combination of triangles after the mesh has already been created.

For this to work, the set of render calls in the render system must be updated: I must find all render calls that belong to this geometry (leading to coupling between mesh and render system) and then possibly add more render calls (because a single render call for 500 triangles with material A can now be split into 200 triangles with material A and 300 with material B).
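
One way to keep this manageable is to regroup the triangles by material whenever assignments change, producing one index range (and thus one render call) per material (a sketch with hypothetical types, building on the Mesh sketch above):

    #include <map>

    // One entry per render call: all triangles sharing a material.
    struct SubMesh {
        std::uint32_t materialId = 0;
        std::vector<std::uint32_t> indices;  // index data for one render call
    };

    // Rebuild the per-material render calls after material assignments change.
    std::vector<SubMesh> splitByMaterial(const Mesh& mesh) {
        std::map<std::uint32_t, SubMesh> byMaterial;
        for (const Triangle& t : mesh.triangles) {
            SubMesh& sub = byMaterial[t.materialId];
            sub.materialId = t.materialId;
            sub.indices.insert(sub.indices.end(), t.indices, t.indices + 3);
        }
        std::vector<SubMesh> result;
        for (auto& entry : byMaterial)
            result.push_back(std::move(entry.second));
        return result;
    }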

Q: But maybe I should just limit the interface to more common, easy ways of changing the appearance of the mesh, and if the user wants to change more specific things, make them responsible for calling the render system to update the GPU representations?

Take a look at Unity 3D's Mesh class. Even though I don't suggest this very often, they have a good user-facing system there, which I used to generate procedural grass on a large terrain. I think the mesh itself is coupled to the underlying VBO/VAO, and as I wrote above, an update is triggered exactly by the system calling this code. You do not need to touch your render system, because it works with the exact same VBO/VAO as your mesh does.

