Renderer 2.0

First off, thanks to those who take the time to read through this. My current design is little more than a wrapper for D3D, plus managers/caches for resources (textures, fonts, meshes, shaders, etc.). I've come a long way from a single triangle on screen, and if I had any time to really work in Blender, I'd probably have a few really nice demos to show off (with planar reflections and shadow volumes).

My current demos' render loops are typically something like this:

begin scene
set up states
set shader
render model (sets textures, grabs the vertex buffer of the mesh needed, then calls DIP)
set another shader
render another model
set yet another shader
render GUI tree
end scene
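
In raw D3D9 calls, one frame of that loop looks roughly like the sketch below; all object names (device, modelVS, modelVB, and so on) are placeholders for my cached resources.

// Rough sketch of one frame in raw D3D9; object names are placeholders.
device->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
              D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
device->BeginScene();
device->SetRenderState(D3DRS_ZENABLE, TRUE);      // set up states
device->SetVertexShader(modelVS);                 // set shader
device->SetPixelShader(modelPS);
device->SetTexture(0, modelTexture);              // render model:
device->SetVertexDeclaration(modelDecl);
device->SetStreamSource(0, modelVB, 0, sizeof(ModelVertex));
device->SetIndices(modelIB);
device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,  // the DIP call
                             0, 0, numVerts, 0, numTris);
// ...set another shader, render another model, render the GUI tree...
device->EndScene();
device->Present(NULL, NULL, NULL, NULL);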


This allows me to do pretty much what I want in each demo, but I have to worry about shaders/states far more than I would like, and things like shadows/reflections need to be handled for each object drawn. One reason for this is that my vertex structures are hard-coded. So...

#1 I'd like a little guidance on how the shaders can dictate the vertex format, as opposed to the vertex format dictating which shaders can be used (and requiring the user to set them up each frame).
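
For concreteness, the D3D9 object that binds the two sides together is the vertex declaration. One possible sketch, where the shader supplies the list of components it wants and the mesh supplies stream/offset/type; the Component struct and BuildDecl are hypothetical, CreateVertexDeclaration is the real API:

#include <d3d9.h>

// Hypothetical glue: the shader says which components it needs; the mesh says
// where each component lives (stream, offset) and what type it is.
struct Component { WORD stream, offset; BYTE type, usage, usageIndex; };

IDirect3DVertexDeclaration9* BuildDecl(IDirect3DDevice9* dev,
                                       const Component* wanted, UINT count)
{
    D3DVERTEXELEMENT9 elems[MAXD3DDECLLENGTH + 1];
    for (UINT i = 0; i < count; ++i) {
        D3DVERTEXELEMENT9 e = { wanted[i].stream, wanted[i].offset,
                                wanted[i].type,   D3DDECLMETHOD_DEFAULT,
                                wanted[i].usage,  wanted[i].usageIndex };
        elems[i] = e;
    }
    D3DVERTEXELEMENT9 end = D3DDECL_END();
    elems[count] = end;

    IDirect3DVertexDeclaration9* decl = NULL;
    dev->CreateVertexDeclaration(elems, &decl);
    return decl;
}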


I'm thinking of composing a queue of render calls, perhaps created from a scene graph, which will allow sorting of calls by shader, material, texture, etc., and also building instanced DIP calls where possible. So a command would contain a mesh, a shader, material params, a world transform, and flags for whether or not the object is a shadow caster, emits light, is alpha-blended, etc.
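
A minimal sketch of what such a command might look like (all type names here are hypothetical); sorting the queue then reduces to one std::sort over a packed key:

#include <algorithm>  // std::sort
#include <cstdint>
#include <vector>

// Placeholder types standing in for the engine's own resources.
struct Mesh; struct Shader; struct Material;
struct Matrix4 { float m[16]; };

enum RenderFlags {
    CASTS_SHADOW  = 1 << 0,
    EMITS_LIGHT   = 1 << 1,
    ALPHA_BLENDED = 1 << 2,
};

struct RenderCommand {
    uint64_t  sortKey;   // packed: blend bit | shader | material | depth
    Mesh*     mesh;
    Shader*   shader;
    Material* material;  // textures + uniform values
    Matrix4   world;     // world transform
    uint32_t  flags;     // RenderFlags bits
};

// Hypothetical key packing: opaque calls sort by state, blended ones by depth.
inline uint64_t MakeKey(bool blended, uint16_t shaderId,
                        uint16_t materialId, uint32_t depth)
{
    return (uint64_t(blended)           << 63)
         | (uint64_t(shaderId & 0x7FFF) << 48)   // 15 bits of shader id
         | (uint64_t(materialId)        << 32)   // 16 bits of material id
         |  uint64_t(depth);                     // depth in the low bits
}

static bool ByKey(const RenderCommand& a, const RenderCommand& b)
{
    return a.sortKey < b.sortKey;
}

void SortQueue(std::vector<RenderCommand>& queue)
{
    std::sort(queue.begin(), queue.end(), ByKey);
}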

#2 Does anyone have any advice for implementing such a system, or see any pitfalls that I might encounter?


I can't think of anything more to ask atm, but will return when I do... in the meantime, thanks again.

[edited by Phantom: removing code tags to stop long text segments breaking forum layout]

[Edited by - phantom on December 27, 2010 1:36:39 PM]
Quote:Original post by Burnt_Fyr
#1 I'd like a little guidance on how the shaders can dictate the vertex format, as opposed to the vertex format dictating which shaders can be used (and requiring the user to set them up each frame).
I'm also interested in an easy way to do this. I could tell you how I tried it, but be warned: it's overcomplicated, prone to breaking, and likely very far from the best approach. It also causes issues with some subsystems.

What I did (initially) was write a stupid disassembler which inferred the vertex format from the vertex shader. I then played a bit with some additional variations of this component, to the point of turning it into something more like a reflection layer, and I even tried to turn it into a virtual machine.
The problem is that the shader does not allow you to figure out the vertex format, at least not in D3D9. It allows you to figure out what the "vertex shader input" expects, but not how those input components are laid out in memory. Questions like "what stream and offset to use for OPOS? For TEXCOORD?" are left unanswered by the VS code and require context-dependent information.
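For what it's worth, D3DX can hand you the "what does the shader expect" half; the memory-layout half is exactly what it cannot answer. A sketch, where bytecode is a placeholder for the compiled shader function:

#include <d3dx9shader.h>

// Recover the input semantics from compiled vertex shader bytecode. This says
// WHAT the shader reads (POSITION, TEXCOORD0, ...) but nothing about which
// stream/offset each component comes from -- that must come from elsewhere.
D3DXSEMANTIC semantics[MAXD3DDECLLENGTH];
UINT count = 0;
D3DXGetShaderInputSemantics((const DWORD*)bytecode, semantics, &count);
for (UINT i = 0; i < count; ++i) {
    // semantics[i].Usage is a D3DDECLUSAGE value (e.g. D3DDECLUSAGE_POSITION);
    // semantics[i].UsageIndex tells TEXCOORD0 apart from TEXCOORD1, and so on.
}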
I had a try at it with "attribute groups". Do they work? It's still too early to say. For sure, providing this information back to the systems generating the geometry required some work, and, for sure, it still feels quite fragile.
Quote:Original post by Burnt_Fyr
I'm thinking of composing a queue of render calls, perhaps created from a scene graph, which will allow sorting of calls by shader, material, texture, etc., and also building instanced DIP calls where possible. So a command would contain a mesh, a shader, material params, a world transform, and flags for whether or not the object is a shadow caster, emits light, is alpha-blended, etc.
I've heard something about that; it seems to be a very flexible method, and I think God of War uses it. I personally believe that instanced DIP calls should be managed explicitly, but that's probably just me (IMHO, either the mesh is instanced massively or it is unlikely to be a problem anyway).
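Managing it explicitly in D3D9 means driving the stream frequencies yourself. Roughly, with placeholder buffer names, and noting this path wants SM3.0-class hardware:

// Stream 0 carries the mesh, replayed numInstances times; stream 1 carries
// one InstanceData element (e.g. a world transform) per instance.
device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
device->SetStreamSource(0, meshVB, 0, sizeof(MeshVertex));
device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
device->SetStreamSource(1, instanceVB, 0, sizeof(InstanceData));
device->SetIndices(meshIB);
device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVerts, 0, numTris);
// Reset the frequencies afterwards, or later draws will misbehave.
device->SetStreamSourceFreq(0, 1);
device->SetStreamSourceFreq(1, 1);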
Quote:Original post by Burnt_Fyr
#2 Does anyone have any advice for implementing such a system, or see any pitfalls that i might encounter?
Yes. Please make extra special sure you have a clear idea of what you need to support and what your goal is. Trying to support "arbitrary" shaders will drive you crazy.

Previously "Krohm"

Krohm, thanks for the lead on GOW, I found this at Christer's blog.

Quote:
Yes. Please make extra special sure you have a clear idea of what you need to support and what your goal is. Trying to support "arbitrary" shaders will drive you crazy.

That's what this thread was about: coming up with a solid plan for supporting my wishlist for this version of the renderer, and finalizing the wishlist into a set of specifications. As for where to handle instancing (i.e. game vs. engine), I plan on testing this on multiple machines to get a baseline for when instancing becomes advantageous, and then allow that threshold to be adjusted per client machine.
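
In code, that tuning point could be as small as a per-machine threshold; everything below is hypothetical (profile, batch, the two draw helpers) except the caps check:

// Instance only when a batch is large enough to pay off on this machine; the
// threshold comes from a per-machine profile measured ahead of time.
D3DCAPS9 caps;
device->GetDeviceCaps(&caps);
bool hwInstancing = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);

if (hwInstancing && batch.count >= profile.instanceThreshold)
    DrawBatchInstanced(batch);               // one DIP for the whole batch
else
    for (UINT i = 0; i < batch.count; ++i)
        DrawSingle(batch.items[i]);          // fall back to one DIP per object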

Quote:Original post by Burnt_Fyr
Krohm, thanks for the lead on GOW, I found this at Christer's blog.
Hey, that's it!
Quote:That's what this thread was about: coming up with a solid plan for supporting my wishlist for this version of the renderer, and finalizing the wishlist into a set of specifications.
My best suggestion is to look at the assets in detail and figure out what you need. Depending on when the assets were generated, you might end up with no more than a dozen (quite similar) kernel sources, with as few as 3-4 vertex formats.
Add a couple of formats for text and GUI stuff and we probably have enough for something good looking.

You might have to scratch your head a bit more on how to "instance" those sources, in other words, how to fetch the uniform values and the textures.
If you use .FX, you're pretty much set, as it already contains a lot of machinery that can work for you. In that case, all it takes is adding the necessary glue to fetch engine-specific values such as a "fade to black" parameter.
WRT that, recall that in DCC or editor tools materials are generally "static" in nature, while they often need to be "dynamic" when the game is running (a typical example being the "water ripple" shader, which needs a time parameter). This introduces the problem of managing the parameter buffers accordingly.
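
With the D3DX effect API, that glue is small. A sketch; the parameter names g_fTime and g_fFadeToBlack are made up for illustration:

// Static material values come from the .fx file's own defaults; the engine
// pushes the "dynamic" ones (time, fades, ...) each frame before drawing.
D3DXHANDLE hTime = effect->GetParameterByName(NULL, "g_fTime");
D3DXHANDLE hFade = effect->GetParameterByName(NULL, "g_fFadeToBlack");

effect->SetFloat(hTime, elapsedSeconds);  // drives e.g. the water ripple
effect->SetFloat(hFade, fadeAmount);      // engine-specific value

UINT passes = 0;
effect->Begin(&passes, 0);
for (UINT p = 0; p < passes; ++p) {
    effect->BeginPass(p);
    // ... issue the draw calls for this material ...
    effect->EndPass();
}
effect->End();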

Getting to know your data will take a while. It took me about two weeks to enumerate all the materials, look at them in detail, and figure out an elegant way to deal with them... the process is a bit backwards if you think about it, but without a target to hit it is easy to miss the point and grow an overcomplicated system.
Consider an iterative approach.

Previously "Krohm"

