Geometry management in 3d engines


Hello, I'm developing a 3D engine (API-independent) and trying to figure out a good way of managing geometry (primitives composed of vertices stored in buffers). I want to make things easy for the client programmers and don't want them to have to worry about how and where the geometry data is stored. The idea is that the client programmer loads models, terrain, etc. and the engine takes care of the rest. If the programmer wants a simple textured sphere, he simply creates a sphere object, sets the appropriate effect and parameters, sets the camera and transforms, and voila!

So where do I start? How do I implement this geometry management? Any ideas?

Thanks,
Opwiz

I think this is rather a question of a material & shader implementation.
Have a look at this thread to get started:

http://www.gamedev.net/community/forums/topic.asp?topic_id=169710


My own system is pretty similar to the one described in the thread linked above.

I abstract the whole thing like this:

Textures: just hold the information needed to use a certain texture for a surface.

Material: describes material types, the number of layers a certain material has, and which textures to use for each layer,

+ what effects are required to render a surface with this material
+ what kind of effects are associated with the material, like bullet hits in an FPS
.... I handle this with callback functions that are set up at load time


Then I have effect classes that describe different effect types,
e.g. effect type: graphics, sound, particle, ...

And finally I have a shader class that registers a DLL that is capable of rendering a certain effect, playing sounds, spawning a particle effect, or other things.

The whole system can be implemented in fewer than 5,000 lines of code and is extremely extensible, which is what I am aiming for.
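
In rough C++ terms the layering looks something like this (just a sketch; the class and member names here are made up for illustration, not taken from my actual code):

#include <string>
#include <vector>

// Texture: only the information needed to use a texture for a surface.
struct Texture {
    std::string name;
    unsigned    apiHandle;   // whatever handle the rendering API gives back
};

// Effect: one effect type (graphics, sound, particle, ...); concrete
// effects live in modules registered at load time.
class Effect {
public:
    virtual ~Effect() {}
    virtual void apply() = 0;
};

// Material: which textures each layer uses, plus the effects required to
// render a surface with it and the gameplay effects tied to it.
struct MaterialLayer {
    Texture* texture;
};

struct Material {
    std::vector<MaterialLayer> layers;
    std::vector<Effect*>       renderEffects;    // needed to draw the surface
    std::vector<Effect*>       gameplayEffects;  // e.g. bullet hits in an FPS
};

// Shader: knows how to actually execute an effect; in my system this is
// backed by a DLL registered at load time.
class Shader {
public:
    virtual ~Shader() {}
    virtual bool canHandle(const Effect& effect) const = 0;
    virtual void execute(const Effect& effect) = 0;
};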

I'm pretty sure the OP doesn't want info on a material system; I think he rather needs an abstract way to represent vertex/index data so that it is API-agnostic. What are your needs, Opwiz? Does it need to be dynamic data that can be changed a lot, or are you just doing static data? If it's static, you can start with a Geometry class with three member functions: Fill(Vertex* verts, int count), Upload() and Activate() (maybe Render() as well).

The fill function puts all the vertices in virtual memory, then Upload creates the hardware representation (vertex buffer, a VBO, etc.) and Activate binds it however is needed. How does that sound?

For this to work, though, you need a Direct3DGeometry and an OpenGLGeometry class, so you'll probably have a Geometry interface and a factory pattern to create them.
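
A minimal sketch of what I mean, assuming a fixed Vertex struct and hypothetical Direct3DGeometry/OpenGLGeometry implementations (the API-specific bodies are only hinted at in comments):

#include <vector>

struct Vertex { float x, y, z, nx, ny, nz, u, v; };   // assumed fixed format

// API-agnostic interface; client code only ever sees this.
class Geometry {
public:
    virtual ~Geometry() {}

    // Copy the vertices into system memory; nothing touches the API yet.
    void Fill(Vertex* verts, int count) { m_verts.assign(verts, verts + count); }

    virtual void Upload()   = 0;  // create the hardware buffer from m_verts
    virtual void Activate() = 0;  // bind it for rendering
    virtual void Render()   = 0;

protected:
    std::vector<Vertex> m_verts;
};

// One concrete class per API; the bodies are only hinted at here.
class Direct3DGeometry : public Geometry {
public:
    void Upload()   override { /* create an IDirect3DVertexBuffer9 from m_verts */ }
    void Activate() override { /* SetStreamSource / SetFVF */ }
    void Render()   override { /* DrawPrimitive */ }
};

class OpenGLGeometry : public Geometry {
public:
    void Upload()   override { /* glGenBuffers / glBufferData from m_verts */ }
    void Activate() override { /* glBindBuffer + vertex pointers */ }
    void Render()   override { /* glDrawArrays */ }
};

// Simple factory so the rest of the engine never names a concrete class.
enum class RenderAPI { Direct3D, OpenGL };

Geometry* CreateGeometry(RenderAPI api) {
    if (api == RenderAPI::Direct3D) return new Direct3DGeometry();
    return new OpenGLGeometry();
}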

Hope that helps; if you want more detailed code samples, ask away! :)

Guest Anonymous Poster
I tried to create such a system. I ended up with two classes for index and vertex buffers (using the DirectX API, or VBOs on OpenGL). Then I have a vertex format class, which determines how the data in the buffers is used to draw things (from which buffer to get the positions, normals, and so on). I think this is generic enough.

I think that for static geometry you could use a simple "geometry" class holding all the data. It could allow sharing vertex buffers between different models, thus reducing state-change overhead.
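
As a rough sketch (the names here are hypothetical), the format class can be little more than a table mapping vertex components to buffers and offsets:

#include <vector>

enum Component { POSITION, NORMAL, TEXCOORD, COLOR };

// Where one component of the vertex comes from.
struct ComponentBinding {
    Component component;
    int       bufferIndex;   // which vertex buffer supplies it
    int       offsetBytes;   // offset inside that buffer's vertex
};

// The "vertex format" is just the list of bindings plus the stride; the
// renderer translates it into a D3D vertex declaration or the equivalent
// OpenGL vertex pointer setup.
struct VertexFormat {
    std::vector<ComponentBinding> bindings;
    int strideBytes;
};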

Quote:
Original post by acid2
What are your needs, Opwiz? Does it need to be dynamic data that can be changed a lot, or are you just doing static data?

It is hard for me to foresee what I will need (which makes it tempting to just wrap all the Direct3D vertex buffer, vertex, etc. classes). But as I see it, I can get a long way with just static geometry.

Quote:

If it's static, you can start with a Geometry class with three member functions: Fill(Vertex* verts, int count), Upload() and Activate() (maybe Render() as well).

OK, and I suppose I should assume the geometry is composed of triangles and that the vertices have a certain format? Or is it a good idea to support every possible vertex format there is, and other primitives?

Thanks for your reply.

Quote:
Original post by Basiror
I think this is rather a question of a material & shader implementation.
Have a look at this thread to get started:

http://www.gamedev.net/community/forums/topic.asp?topic_id=169710

That is a nice thread but the material & shader implementation is another issue that I think I've got pretty much figured out. Thanks anyway.

Guest Anonymous Poster
acid2,
"The fill function puts all the vertices in virtual memory"

Does this mean that you allocate system memory, copy the verts into this buffer, and hold this buffer as a cache for things like D3D's released-device handling?
Or what is the reason that you don't just use the hardware vertex buffer?

Effectively, I am uploading it all and storing it in hardware. However, if I was doing that on each "set-vertex"-style call, that'd be lots of locks and unnecessarily slow. It's just a way of deferring it into one big write that doesn't depend on the user doing it a certain way.

Regarding vertex format, I'd just go with something that can hold everything you will need for now. Yes, it's extra memory, but if you're storing pos/normal/tangent/binormal etc., you'll probably use them all the majority of the time anyway.
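
For example, a single "fat" vertex along these lines (the exact fields are just an assumption about what you'll need):

// One "fat" vertex layout that covers everything the engine is likely to
// need; unused fields waste a little memory but keep the pipeline simple.
struct FatVertex {
    float position[3];
    float normal[3];
    float tangent[3];
    float binormal[3];
    float texcoord[2];
};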

Right now I'm working on this part of my engine, so I'll explain the solution I currently ended up with (not final, subject to change).
First, let's summarize what is needed:
A geometry source can be anything: a file stream (chunks), a network stream, procedural generation; it can have a cache layer, LOD, etc. As a result, I implemented a GeometrySource abstract virtual class (the only thing I still need to integrate better is LOD).
Secondly, RenderGeometry is the way the geometry is stored for rendering. It's also an abstract virtual class, as geometry caching could be implemented here.
It may seem a little over-engineered, but it lets the user add caching, use a network stream, etc. in a very intuitive way.
Then the Effect (equivalent to Material in some other engines) has a fillVRAM-like function (cf. Yann L.'s material thread) that handles copying the required components from the GeometrySource object to the RenderGeometry object.

Usually a simple GeometrySource would provide vertex/index counts, the primitive type, vertex/index formats, several OutputStreams of vertices (one for each vertex buffer stream) and an OutputStream of indices (I use streams to avoid copying everything twice in memory, so the Effect reads directly from the stream, which reduces buffering). A simple RenderGeometry allows setting vertex/index counts/formats and gives access to the VertexBuffer/IndexBuffer objects (abstract virtual in the case of a cross-platform engine).
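
In heavily simplified C++ the two interfaces look roughly like this (the names are only indicative, not my real code):

#include <cstddef>

// Anything that can produce geometry: file chunk, network stream, procedural...
class GeometrySource {
public:
    virtual ~GeometrySource() {}
    virtual size_t vertexCount() const = 0;
    virtual size_t indexCount()  const = 0;
    // Stream one vertex stream / the indices out without buffering the
    // whole mesh twice in memory.
    virtual size_t readVertices(int streamIndex, void* dst, size_t maxBytes) = 0;
    virtual size_t readIndices(void* dst, size_t maxBytes) = 0;
};

// How the geometry is stored for rendering; a concrete subclass wraps the
// API's vertex/index buffers, another could add caching on top.
class RenderGeometry {
public:
    virtual ~RenderGeometry() {}
    virtual void  setCounts(size_t vertices, size_t indices) = 0;
    virtual void* lockVertexBuffer(int streamIndex) = 0;
    virtual void* lockIndexBuffer() = 0;
    virtual void  unlockAll() = 0;
};

// The Effect's fillVRAM-like step: pull the components it needs from the
// source and push them into the render-side buffers.
void fillVRAM(GeometrySource& src, RenderGeometry& dst) {
    dst.setCounts(src.vertexCount(), src.indexCount());
    // ... lock each stream, readVertices() into it, same for the indices,
    // then unlockAll() ...
}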

Some issues to resolve:
* Shared buffers. I have to track them, as they will have to be shared for the RenderGeometry objects too.
* Some tracking/heuristics to quickly decide whether data should be split across multiple vertex buffers or not (i.e. pass 1 uses TEXCOORD, pass 2 uses COLOR; should it be split into two streams or not?).

Do not hesitate to give your criticism/opinions on this system, keeping in mind that my objective is to give the user as much freedom as possible.

I haven't read through all the replies here, so excuse me if I repeat any info, but a while ago I implemented a generic vertex and index buffer for my engine. They work as follows. First you determine a format, similar to how D3D does it, where you choose what data is represented in the vertex via bit flags. For instance, that would look something like:

#define MYFORMAT (VF_POSITION | VF_COLOR | VF_NORMAL)

Then, when you create the buffer, you determine its size as well. The vertex buffer is extended with an OpenGL or D3D implementation, so upon creation the buffer will attempt to allocate video memory based on another flag you choose. Based on the components you require, the correct amount of memory is allocated in video and/or system memory (more on that later). The buffer can be allocated in one of three ways: dynamic, static, or local. Local allocation allocates in system memory, static is in video memory, and dynamic allocates in both video and local memory, so the memory can be accessed quickly by the CPU if you plan to read/change it frequently.

So, that takes care of allocating the buffer; now for the access. First off, all buffer access happens between a lock and an unlock call; this is where the memory pointer is fetched from, or copied to the video card if needed. My buffer offers two access methods. The first is a GetSafePointer method, where the user passes in the vertex format and the number of verts desired, and a pointer is returned only if what you're asking for is within the bounds of the buffer. This is the better, faster way of doing things. The other is a fill method, something like fill( VF_COMPONENT, &CVector3 data, unsigned int index ), which you can imagine is slower, but is neat because the app doesn't have to know what's in the buffer to fill it. If you try to fill a component that's not in the buffer, it is ignored.

The final part of the system is what determines which components go into any given buffer. This is determined by the material being applied to the geometry. If, for instance, lighting isn't applied to the model, then normals will not be present in the buffer.
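
Put together, the interface looks roughly like this (a simplified sketch; the real class also carries the allocation flags and the system-memory copy):

#include <cstdint>

// Component flags combined into a vertex format, the same way D3D FVFs work.
enum VertexComponents : uint32_t {
    VF_POSITION = 1 << 0,
    VF_COLOR    = 1 << 1,
    VF_NORMAL   = 1 << 2,
    VF_TEXCOORD = 1 << 3,
};

enum class BufferUsage { Static, Dynamic, Local };

class VertexBufferBase {
public:
    virtual ~VertexBufferBase() {}

    // All access happens between Lock and Unlock; Unlock pushes the data
    // to video memory when the usage flag requires it.
    virtual void* Lock()   = 0;
    virtual void  Unlock() = 0;

    // Fast path: returns a write pointer if 'format' matches the buffer and
    // 'count' vertices fit within its bounds, otherwise null.
    virtual void* GetSafePointer(uint32_t format, unsigned count) = 0;

    // Slow but convenient path: writes one component of one vertex and
    // silently ignores components the buffer does not contain.
    virtual void Fill(uint32_t component, const float* data, unsigned index) = 0;
};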

Hope that helps some.

Quote:
Original post by acid2
I'm pretty sure the OP doesn't want info on a material system; I think he rather needs an abstract way to represent vertex/index data so that it is API-agnostic.



In my experience a decent material/shader implementation is the key.

A system like the one described above and in the thread I mentioned adds a lot of flexibility to the way you treat meshes and geometry in general.

And as you see in the thread, the way geometry data is passed really depends on the shader class you use.

The problem I'm having right now is how to manage all the buffers. I have X lists of buffers (one new list for each new type of vertex to be stored), and each list contains Y buffers of the same type. The client programmer tells the engine: I have these vertices of this type, store them for me. The engine gets the appropriate list of buffers and checks whether the vertices fit in the one with the most space left. If not, the engine creates a new buffer (with a default or user-defined size), fills it up and adds it to the list (returning geometry or a "handle" to the vertices).

Now it gets more complicated when geometry is removed/disposed. I need to keep track of the chunks of free space left in the buffers, which is a nightmare. I'm looking for an easier solution. Any ideas?
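
To be concrete, the bookkeeping I'm dreading is essentially a little first-fit allocator per buffer, something like this hypothetical sketch:

#include <cstddef>
#include <iterator>
#include <list>

// Tracks the free space inside one vertex buffer as a sorted block list.
// Allocation is first-fit; releasing merges with adjacent free blocks.
class BufferSpaceMap {
public:
    explicit BufferSpaceMap(size_t capacity) { m_free.push_back({0, capacity}); }

    // Returns the start offset of a free range of 'count' vertices,
    // or (size_t)-1 if no block is big enough.
    size_t allocate(size_t count) {
        for (auto it = m_free.begin(); it != m_free.end(); ++it) {
            if (it->size >= count) {
                size_t offset = it->offset;
                it->offset += count;
                it->size   -= count;
                if (it->size == 0) m_free.erase(it);
                return offset;
            }
        }
        return (size_t)-1;
    }

    // Gives a range back and merges it with neighbouring free blocks.
    void release(size_t offset, size_t count) {
        auto it = m_free.begin();
        while (it != m_free.end() && it->offset < offset) ++it;
        it = m_free.insert(it, {offset, count});
        mergeWithNext(it);
        if (it != m_free.begin()) mergeWithNext(std::prev(it));
    }

private:
    struct Block { size_t offset, size; };

    void mergeWithNext(std::list<Block>::iterator it) {
        auto next = std::next(it);
        if (next != m_free.end() && it->offset + it->size == next->offset) {
            it->size += next->size;
            m_free.erase(next);
        }
    }

    std::list<Block> m_free;
};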



Do you batch the meshes to reduce the number of calls needed to upload geometry to VRAM (VBOs)?

If so, I would apply some sort of spatial partitioning and merge the meshes that share the same material.
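
i.e. something along these lines before uploading (only a sketch; Mesh and Material stand in for whatever your engine uses):

#include <map>
#include <utility>
#include <vector>

struct Mesh;       // stand-in for the engine's mesh type
struct Material;   // stand-in for the engine's material type

// Group meshes that share a material so each group can be merged into one
// vertex buffer (after spatial partitioning) and drawn with fewer calls.
std::map<Material*, std::vector<Mesh*>>
groupByMaterial(const std::vector<std::pair<Material*, Mesh*>>& scene) {
    std::map<Material*, std::vector<Mesh*>> batches;
    for (const auto& entry : scene)
        batches[entry.first].push_back(entry.second);
    return batches;
}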

Quote:
Original post by Basiror
Do you batch the meshes to reduce the number of calls needed to upload geometry to VRAM (VBOs)?

If so, I would apply some sort of spatial partitioning and merge the meshes that share the same material.


I decided to give up trying to create some advanced kind of geometry manager. I ended up creating a vertex buffer class and wrapping the Direct3D vertex buffer. It turned out pretty nice. I've got an abstract generic VertexBuffer class, a VertexBufferManager class that manages generic VertexBuffers and a VertexChunk class that acts as a handle to some vertices in a buffer.

So you create a vertex buffer manually and fill it with vertices.


// create and fill 36 vertices in a position + colour format
PositionColored[] vertices = new PositionColored[36];
...
// ask the manager for a buffer that can hold them, then write them in;
// Write() returns a VertexChunk handle to the range that was written
vertexBuffer = VertexBuffer.Manager.Create<PositionColored>(36);
vertexChunk = vertexBuffer.Write(vertices);
...
// queue the chunk for rendering with a given effect and parameters
renderer.RenderList.Add(effect, parameters, vertexChunk);


The static property VertexBuffer.Manager is set by the render system (e.g. to a Direct3D vertex buffer manager). I'm planning to add support for dynamic buffers later.
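
In rough C++ terms the pieces fit together like this (a sketch of the shape only, not the actual classes):

#include <cstddef>

// Handle to a range of vertices inside some VertexBuffer.
struct VertexChunk {
    class VertexBuffer* buffer;
    size_t              firstVertex;
    size_t              vertexCount;
};

// API-agnostic buffer; Write() appends vertices and hands back a chunk.
class VertexBuffer {
public:
    virtual ~VertexBuffer() {}
    virtual VertexChunk Write(const void* vertices, size_t count) = 0;
};

// Owns all buffers; the render system installs the concrete manager
// (e.g. a Direct3D one) behind the static access point.
class VertexBufferManager {
public:
    virtual ~VertexBufferManager() {}
    virtual VertexBuffer* Create(size_t vertexCount, size_t vertexSize) = 0;
};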

Quote:
Original post by Opwiz

I decided to give up trying to create some advanced kind of geometry manager. I ended up creating a vertex buffer class and wrapping the Direct3D vertex buffer. It turned out pretty nice. I've got an abstract generic VertexBuffer class, a VertexBufferManager class that manages generic VertexBuffers and a VertexChunk class that acts as a handle to some vertices in a buffer.

So you create a vertex buffer manually and fill it with vertices.


Of course, the system I described earlier is "geometry management"; it's a layer on top of the system you described. I separated the VertexBuffer and IndexBuffer that are implemented directly in the renderer (in a Render library) from the system that manages vertex and index buffers with loaded geometry, caches them, etc. (in an Engine library).
As a result you can use the advanced geometry-management functions of the engine as needed, and you can still use VertexBuffer/IndexBuffer directly.
