

Quote:

1. There are different ways to implement this. I don't know James's way, but he might have the system set up so that GCs are sorted by effect (per frame, by visibility maybe) and then each effect is rendered and sent down to the queue/pipeline/batch controller. He might be doing some sort of advanced batching.

2. Shaders do everything except draw... so it should go like this:

initShader() -> called once on shader change; sets up effect globals, enables texture units, blending, etc.
setupShader(gc/spgc) -> sets up specifics for this GC
fillCache(gc/spgc) -> caches the GC in VRAM
glDrawElements() -> draws the geometry

3. Don't use them.

4. Send everything down; it'll already be cached in the VBOs for the next pass. I'm currently not doing this since I'm not fill-rate limited.
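The lifecycle in point 2 could be sketched roughly like this. This is a hypothetical C++ sketch; the `Shader` and `GeometryChunk` classes and the batching loop are assumptions, not the poster's actual code:

```cpp
#include <cassert>
#include <vector>

// Minimal stand-in for a geometry chunk (GC); 'cached' models VRAM residency.
struct GeometryChunk { int id; bool cached = false; };

class Shader {
public:
    // Called once on shader change: effect globals, texture units, blending, etc.
    void initShader() { initialized = true; }
    // Per-chunk specifics (matrices, material parameters) would go here.
    void setupShader(GeometryChunk&) {}
    // Upload the chunk's vertex data to VRAM (e.g. into a VBO) once.
    void fillCache(GeometryChunk& gc) { gc.cached = true; }
    // Stands in for the glDrawElements() call.
    void draw(const GeometryChunk& gc) { assert(gc.cached); ++drawCalls; }

    bool initialized = false;
    int drawCalls = 0;
};

// Render a batch of chunks already sorted under one shader:
// init once, then setup/cache/draw per chunk.
void renderBatch(Shader& s, std::vector<GeometryChunk>& batch) {
    s.initShader();
    for (auto& gc : batch) {
        s.setupShader(gc);
        if (!gc.cached) s.fillCache(gc);
        s.draw(gc);
    }
}
```

Note that because `fillCache` only runs when the chunk isn't resident yet, a second pass over the same batch would reuse the VRAM copy, which is the point made in answer 4.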

By the way, there are different ways of implementing this from Yann's threads. I have a method which is quite different from what was explained but still keeps the modular shader approach (shaders can be loaded from DLLs). I'm sure everyone who implemented such a system did it a different way.

Quote:
 1. I've read that some people's implementations of the shader-effect system (e.g. jamessharpe's) have it set up so that the effect contains the offset table for the geometry vertex data. This struck me as unusual, because doesn't that mean the effect knows about the geometry chunk? If the effect is in charge of the offsets, then the geometry chunk can only be rendered by specific effects, as other effects may have the wrong tables. I was thinking it would be better if the geometry chunks stored their own offset tables for their interleaved data, and the effects could just query them for the different data stream offsets. Is that not more flexible?

I had this problem too. If you sort the geometry's streams for only one effect, then you can't change the effect at runtime, because another effect can't change the vertex stream order inside a geometry element. My solution is to maintain a big table (say, Table A) for the effect that gives me the correct order of the streams; where a particular stream is needed (as in Yann's and James's solutions), this table is an index into another small table (say, Table B) that gives the position of that vertex stream within the geometry. With this method, if you want to change the effect at runtime, you only need to change Table B.
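One possible reading of this two-table indirection, as a hedged C++ sketch. The stream enum, table names, and which table gets swapped at runtime are all assumptions for illustration:

```cpp
#include <cassert>
#include <vector>

// The kinds of vertex streams a geometry element might carry.
enum Stream { POSITION = 0, NORMAL, TEXCOORD, TANGENT, STREAM_COUNT };

// "Table B": per-geometry, maps a stream kind to its slot in that
// geometry's vertex data (-1 if the stream is absent).
struct GeometryStreams {
    int slotOf[STREAM_COUNT];
};

// "Table A": per-effect, the order in which the effect wants streams bound.
struct EffectOrder {
    std::vector<Stream> order;
};

// Resolve the bind order for this geometry: Table A indexes into Table B,
// so the geometry's stream layout never has to be re-sorted per effect.
std::vector<int> resolveSlots(const EffectOrder& a, const GeometryStreams& b) {
    std::vector<int> slots;
    for (Stream s : a.order) slots.push_back(b.slotOf[s]);
    return slots;
}
```

The indirection is the key design choice: neither table duplicates the other's information, so switching effects only swaps which small table is consulted, not the vertex data itself.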

Quote:
 Original post by nts
1. There are different ways to implement this. I don't know James's way, but he might have the system set up so that GCs are sorted by effect (per frame, by visibility maybe) and then each effect is rendered and sent down to the queue/pipeline/batch controller. He might be doing some sort of advanced batching.

Well, when a geometry chunk is rendered by an effect, it is turned into SPGCs for the shaders. Those SPGCs will then be sorted by shader in each RPASS_* render pass, I think? So you don't sort by effect, but by shader.

But say you did sort by effect: I still don't understand why the effect has the offset tables. Take two meshes, both rendered with the same effect, but one mesh's data also has tangents and binormals while the other's doesn't. Surely it should be possible for the effect to render both meshes if the streams it requires are there... but if the offset table is in the effect, one of those meshes will produce incorrect results, as the offset table will be pointing at the wrong data, right?

I dunno, but I think I'll go the offset-table-in-the-GC route; it seems more flexible than leaving it to the effects.

I'll do something like davidino79 has done: the effect stores information about what data it needs and in what order/format, while the GC stores all the offsets and strides needed to get at that data.

The offset table in the effect class really was just me jumping in and getting something that works, and it has the drawbacks you mention. To solve this I've implemented a vertex description system to describe the format of a vertex. This is simply a class describing each element in the description and its format. So now I store a 'requirements' descriptor in the effect class, i.e. those elements that are required for rendering with this effect, and a vertex descriptor along with the geometry stream. Then I simply run a quick compatibility test when the effect ID changes, to ensure the geometry has the required streams for rendering. This could also potentially optimise the VRAM uploading.
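The compatibility test could look something like the following. This is a sketch under assumed names (`VertexElement`, `isCompatible`), not jamessharpe's actual classes:

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// One element of a vertex description: what the data is and its format.
struct VertexElement {
    std::string semantic;  // e.g. "position", "normal", "tangent"
    int components;        // e.g. 3 floats for a normal
};

// A vertex description is just an ordered list of elements.
using VertexDescription = std::vector<VertexElement>;

// True if every element the effect requires appears in the geometry
// stream's descriptor with a matching format; run this once when the
// effect ID changes rather than per frame.
bool isCompatible(const VertexDescription& required,
                  const VertexDescription& provided) {
    for (const auto& req : required) {
        auto it = std::find_if(provided.begin(), provided.end(),
            [&](const VertexElement& e) {
                return e.semantic == req.semantic &&
                       e.components == req.components;
            });
        if (it == provided.end()) return false;
    }
    return true;
}
```

This directly resolves the two-meshes problem above: an effect requiring tangents simply rejects (or falls back for) a mesh whose descriptor lacks them, instead of reading through a wrong offset table.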

Thanks for replying. Yeah, a vertex description sounds like a safe way of doing things, and comparing the vertex description of the GC against the vertex requirements of the effect would be simple. I guess a similar system could be used for the shader-effect linking, with shaders declaring their requirements and effects exposing the data they have access to.

For 'potential' multipass effects (remember: a future shader might only require one pass instead of two, for example), I implemented a simple scripting sequence.

I composed effects of separate 'attributes', such as the following:

.visual {
    .texture.diffuse(unit0)
    .color.diffuse
}

then implemented the potential multipass blending as

.out.color = mul{ .texture.diffuse(unit0), .color.diffuse }

The engine then decides if multiple passes are required and, if so, uses alpha blending to multiply two separate shaders together. If not, it tells the shader to multiply the two attributes together through a simple 'blending mode' tree.

It provides maximum flexibility, with the engine determining at runtime whether multiple passes are necessary.
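The runtime single-vs-multipass decision could be sketched like this. The attribute struct and the texture-unit counting are assumptions for illustration, not the poster's engine code:

```cpp
#include <cassert>
#include <string>
#include <vector>

// One attribute of an effect, e.g. ".texture.diffuse(unit0)".
struct Attribute {
    std::string name;
    int textureUnitsNeeded;
};

// If all attributes of the blend tree fit in the available texture
// units, collapse to a single pass and let the shader combine them;
// otherwise fall back to one pass per attribute, multiplied together
// with alpha blending.
int passesRequired(const std::vector<Attribute>& attrs, int availableUnits) {
    int needed = 0;
    for (const auto& a : attrs) needed += a.textureUnitsNeeded;
    return needed <= availableUnits ? 1 : static_cast<int>(attrs.size());
}
```

On hardware with enough texture units the same scripted effect runs in one pass, while older single-TMU hardware transparently gets the multipass path.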

Quote:
 Original post by c t o a n
For 'potential' multipass effects (remember: a future shader might only require one pass instead of two, for example) I implemented a simple scripting sequence. I composed effects of separate 'attributes' such as the following:

.visual {
    .texture.diffuse(unit0)
    .color.diffuse
}

then implemented the potential multipass blending as

.out.color = mul{ .texture.diffuse(unit0), .color.diffuse }

the engine then decides if multiple passes are required, and if so, uses alpha blending to multiply two separate shaders together. if not, it informs the shader it should multiply the two attributes together through a simple 'blending mode' tree. it provides maximum flexibility with the engine determining at runtime whether multiple passes are necessary.

Hmmm... sounds quite cool. I just use a "profile" approach, where each shader can have multiple profiles, each with multiple passes. Depending on the hardware available and its compatibility with the effect, a specific profile is chosen at load time. It allows newer hardware to use only one pass, say, where older hardware needs two or three, but it requires coding another profile in the shader. Not too much effort really, as it's all scripted.
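Load-time profile selection could be as simple as a first-fit scan. A hedged sketch; the shader-model capability field is an assumption, since the post doesn't say how hardware compatibility is tested:

```cpp
#include <cassert>
#include <string>
#include <vector>

// One scripted profile of a shader.
struct Profile {
    std::string name;
    int requiredShaderModel;  // minimum hardware level this profile needs
    int passes;
};

// Pick the first profile the hardware can run. Profiles are assumed to
// be listed best-first in the script, so newer hardware gets the
// one-pass path and older hardware falls through to a multipass one.
const Profile* selectProfile(const std::vector<Profile>& profiles,
                             int hwShaderModel) {
    for (const auto& p : profiles)
        if (p.requiredShaderModel <= hwShaderModel) return &p;
    return nullptr;  // effect unsupported on this hardware
}
```

Because the choice happens once at load time, the per-frame render path never has to branch on hardware capabilities.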

Another random question for you guys. Thanks.

Quote:

For the shadow mapping shader, yeah, I would write one version that has the shadow map applied and one that doesn't, simply because not all hardware supports it (pre-GeForce 3), so I'd like a fallback (without passing parameters here and there), and not everything in the scene is going to be shadow mapped.

For fog, I would make it global. If a surface/material needs fog disabled, then it's the shader's job to check whether fog is enabled and disable it. Likewise, if fogging is globally disabled but the shader needs fog, it just enables it. I think it'll mostly stay global though; shaders aren't going to play with fog much.
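That reconciliation of global fog state with per-shader needs boils down to a tiny policy function. A sketch under assumed names (`FogNeed`, `effectiveFog`), since the post only describes the behaviour:

```cpp
#include <cassert>

// What a shader declares about fog: most don't care and inherit the
// global setting; a few require or forbid it.
enum FogNeed { FOG_DONT_CARE, FOG_REQUIRED, FOG_FORBIDDEN };

// Resolve the fog state for one shader's pass against the global flag.
bool effectiveFog(bool globalFog, FogNeed need) {
    switch (need) {
        case FOG_REQUIRED:  return true;   // shader enables fog even if globally off
        case FOG_FORBIDDEN: return false;  // shader disables fog even if globally on
        default:            return globalFog;  // inherit the global setting
    }
}
```

The shader would apply this before drawing and restore the global state afterwards, so the overrides never leak into other shaders' passes.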
