Render Queue/Scene Graph

Started by
5 comments, last by GameDev.net 17 years, 9 months ago
I want to make a graphics engine that moves most of the D3D/OGL code away from the game code. Unfortunately I don't really know what path to follow to achieve this. One of my goals is to avoid as many state changes as possible, so I thought of having the game's draw functions, say "DrawObject," not actually draw the object, but instead add it to some sort of render queue or scene graph. The scene graph could be a hierarchy that vertex buffers get put into: each node would represent a different rendering state, and the leaf nodes would be the actual vertex buffers to draw.

head -+- pixel shader 1 --- texture 1 ---------------- vertex buffer
      |
      +- no pixel shader -+- texture 2 --------------- vertex buffer
                          |
                          +- texture 1 -+- lightmap 2 - vertex buffer
                                        |
                                        +- lightmap 1 - vertex buffer
That should roughly describe what I mean. The only problem is that I don't know whether this method would be too complicated, or slower than some other method. I'm also not sure about adding all the objects that need to be drawn to this scene graph and then drawing them all at once; it has two apparent problems:

1 - How would the drawing order be preserved? I would want my particles and alpha-blended elements to be drawn last, and the world etc. drawn first. This method wouldn't do that.

2 - I'm not sure, but it seems like this might be slow, because I'll be doing all the CPU work, then all the GPU work, then all the CPU work again, and so on. It seems like it would be more beneficial to have the CPU and GPU working in parallel.

So that's my problem. I don't really know how to make a decent graphics engine, so if you have any input I'd appreciate it greatly.
Download engines like Ogre, Quake 3, or other open-source engines and see how they do things. Basically, I think you need to read up a bit more on what a scene graph does, as it would solve your problems if I'm reading you right.

"Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety." --Benjamin Franklin

Well, my method involves using a sort of tree structure. When an object's "render" routine is called, it is added to the tree. Each level of the tree branches on some state change; there are levels for shaders, vertex buffers, and textures. Then I just traverse the tree, with each level setting the appropriate state and rendering all the objects.
Sean Henley [C++ Tutor]Rensselaer Polytechnic Institute
With what I have now (which, admittedly, can change on a whim), I have preRender, postRender, and render functions for each of the nodes in my scene graph. Whenever preRender is called on one of them, it registers for any special rendering it needs done (currently only reflections), and then render draws the actual object.

So, the video driver will render all the things in the reflection list first, then everything else in the scene.

I know that Irrlicht does something like this (it's where I got the idea, in fact). They render reflections, shadows, transparent objects, etc. all separately, then draw all the normal solid objects (or something of the sort).
[size="2"][size=2]Mort, Duke of Sto Helit: NON TIMETIS MESSOR -- Don't Fear The Reaper
The almighty Material/Shader implementation Thread also has some information on this.
Hey, thanks for the help, guys. Sr_Guapo and Endar, you both seem to have fairly similar systems, and both of you are talking about something very close to what I had thought of implementing.

But actually, after reading parts of the Material/Shader implementation thread, I'm beginning to see that I can do something similar to what Yann L was talking about on the first page, where everything is basically a shader. That way I can make a multi-textured object simply by applying a second texturing shader to it. Then I could also separate objects in the engine at render time by certain properties, i.e. render all solid objects first, then all the alpha-blended objects, etc.

Would this be too slow, though? I know pixel/vertex shader changes are among the slowest state changes in D3D. If I optimize it to sort objects so that the fewest state changes are made, it'll be faster, but will it be good enough? I'm really new to shaders; I've never really used them.
It all depends on what you want to do, but it shouldn't be too slow. I would seriously recommend going with the D3DX framework and forgetting about OpenGL for the time being, at least until you get a good grip on the concepts.
Today, the parameters you have to send to the GPU will most likely cost more than the shader changes; tomorrow, you'll have to develop a whole new approach to dealing with shaders, once you start generating them automatically.
Use VTune when in doubt :)

-rod lopez

