Basiror

shader system implementation

45 posts in this topic

Hi, I have read some threads about how to implement a decent shader system, especially this one. They suggest a method using DLLs to keep the system flexible: they export a base class and implement it in the DLL. I wonder how I could handle several shader DLLs with implicit linking (without LoadLibrary("xxx.dll")). From what I understand this isn't possible that way, since the loader would always choose the DLL it finds first, so you probably don't get the right one. One solution I could think of, actually more a hack than a solution:

ENGINE: export base class
SHADERMANAGER_DLL: import base class, export derived class #n, getprocedure (returns an instance of derived class #n)
SHADER#n_DLL: import derived class #n, implement derived class #n

This way the SHADERMANAGER_DLL would be implicitly linked and would itself implicitly link to the SHADER#n_DLLs; the getprocedure would return an instance of a shader implemented in a DLL. Any idea if that works? A more straightforward solution would be explicit linking, which allows you to choose which DLL to load at runtime. Which way would you suggest? I would like to allow modders to implement custom shaders.
Part of the appeal of the mentioned system comes from being able to 'patch' the game with additional shaders after release, by simply creating a new DLL. This implies you need to check at startup what DLLs there are and dynamically load them (with LoadLibrary on Windows).
Ok, that seems to be the easiest and most straightforward way to handle this problem.

How would you proceed with the implementation? Give each shader class an information structure describing its capabilities, and put the render code into the shader class?

So the mapper specifies which shader implementation to use. The engine loads a map and finds a shader that requests implementation "sm3.0", for example. Now the shader manager searches for the DLL that implements the sm3.0 shader.

Then:
- sort all geometry to render by shader
- get an instance of the shader implementation
- render(geometry) { shaderinstance->setstates(); shaderinstance->draw(geometry); shaderinstance->unsetstates(); }

Proceed until everything is rendered.


or would you suggest a different approach?
It's probably similar to what you described, yes - although I think most people who have implemented this stuff have added an extra level of indirection: the mapper does not specify exactly which shader he wants, but the resulting effect he wants to create.

At engine startup the renderer parses all the shaders and sorts them according to the effect they implement, and its quality. All shaders that aren't supported by the hardware can be unloaded again.

Then at render time, find the effect, find the associated shader (normally the best one, though depending on quality settings it might be a lower-quality implementation of the effect), and call it as you described.

You'll also want to take a good look at this thread.
Maybe packing shaders into DLLs isn't a very good idea. Why would you do that, for example?
It gives freedom up to a point - one can build very custom vertex declarations and prebuild VBs/IBs - but this is not really the responsibility of a shader.
A shader shades geometry, whatever it is.
Packing, or even creating, geometry is not its responsibility; that belongs to some renderer subsystem.
Imagine this: you released a game and want to enhance it with grass. You build a grass shader and release its DLL... but where do you put that grass? Well...
Ok, you have the grass, but it is lame, and you want to make it cool. You write a shader and release it... Is that practical? I'd say no.
This is just an argument against the system described in the threads mentioned - do not waste your precious time implementing such monsters, especially when there is no need to.

Implement:
1. A shader compiler system (based on a preprocessor) that can take one shader source and create a family of shaders from it. For example, try to encode more parameters into a single shader source: light variation, fog variation, quality variation, etc.
2. A system for managing shader parameters and uploading shader constants to the shaders (setting parameters each frame vs. binding pointers to shader constants).
3. An effect system that describes a group of shaders, one for each situation in which an object must be rendered - different light scenarios, fog scenarios, shadowmap, quality, renderpath, etc.
4. A really advanced system would give your artists some way to control/create shaders. This can range from a relatively simple parametric system, where each shader exposes a set of options (fog: linear, height, linear+height; lighting: diffuse+ambient, lerp(diffuse, ambient), diffuse-only, ...) that the artist picks from combo boxes, after which you pass the proper defines to the compiler and produce a shader.
A more advanced system would compose shaders from primitives/blocks, like the UE3 shader editor or Offset Software's shader editor, for example.

My 2c.
Hi,
It's not a wise idea to implement effects as DLLs. An effect is just another type of resource; doing this is like trying to load textures as DLLs.

If you're using DX, try the D3DX effect framework. It can compile effects at runtime, and if you create an effect manager you can load and unload your effects at will and set them up with some simple parameters.

Luck!
Guimo
I think what you're looking for is an effect system, not a shader system. Basically, something that will define:

1) How many rendering passes there will be
2) The initial states (matrices, lights, textures, blending modes...) for each pass
3) The shaders each pass will use

IMO, if you're after easy modding, it's very risky to require modders to compile a DLL that will be linked with the main .exe. They have to be very careful to get it right; different compilers or settings (alignment, padding, etc.) may result in big problems.

I think the best solution by far is to use some sort of scripting. Then the game can be modded by just editing a text file. If you're using D3D, the D3DX effect framework is just fine. Otherwise you can use Lua or Smalltalk or something like that (even boost::spirit?), or implement a scripting system yourself. It doesn't have to be anything fancy; the most complex thing you need to do is set the states. You can probably pull it off with some sscanf() calls.
I think you misunderstood the concept behind this implementation.

The shader DLL doesn't implement the effects you want; it implements the functionality you offer for shaders.

e.g.: a grass shader with (normal grass, boring grass, animated grass). Let's say you work with an older graphics card and implement the shaders mentioned above; they all provide some properties used to render them. Now you switch to a modern graphics card and want to improve the graphics quality: you just release a new DLL with a higher priority that provides the functionality used by the shaders, just in a modern manner, maybe with enhanced features. The shader DLL itself runs a test on your system to check that everything is available; otherwise the system chooses the old DLL.

This allows you to specialize a generalized representation of an effect for certain graphics cards without rewriting and updating a ton of code.

The shader DLL itself just does the state processing; the effect properties are still separated out into an effects/shader file.


And I don't use D3D, and I would like to stay API-independent, so the D3DX effect framework is no option in my eyes.
Quote:
Original post by mikeman
I think what you're looking for is an effect system, not a shader system. Basically, something that will define:

1) How many rendering passes there will be
2) The initial states (matrices, lights, textures, blending modes...) for each pass
3) The shaders each pass will use

IMO, if you're after easy modding, it's very risky to require modders to compile a DLL that will be linked with the main .exe. They have to be very careful to get it right; different compilers or settings (alignment, padding, etc.) may result in big problems.

I think the best solution by far is to use some sort of scripting. Then the game can be modded by just editing a text file. If you're using D3D, the D3DX effect framework is just fine. Otherwise you can use Lua or Smalltalk or something like that (even boost::spirit?), or implement a scripting system yourself. It doesn't have to be anything fancy; the most complex thing you need to do is set the states. You can probably pull it off with some sscanf() calls.


I disagree with your definition of an effect. Personally, to me an effect has absolutely nothing to do with how the effect is implemented (render passes and so on). All the effect does is provide resources to implement it - so for instance, it says "you have these streams and uniforms available to create this effect", and some form of shader (I'm not talking about GLSL and co.) then implements the effect however it likes.

I personally went with the scripting approach, whereby I have an effect file which says what effects there are (with a little description of what each effect should do) and what resources are available to implement them. I then have a series of shader files containing a number of profiles (to support different types of hardware), each with a compliance rating for how well it supports the said effect. A profile can implement the effect in as many passes, or using whatever tech, as it needs. At load time each effect is resolved to a profile in a shader based on what the hardware supports, such that the profile with the highest supported compliance is used. Multiple shaders can support the same effect - but the one with the highest compliance will always be used, so patching with a new shader is easy.

There are disadvantages to using a script-based system - one big one being that you can't really do arbitrary processing.
Although the DLL shader approach works really well in practice, and is very robust and extensible, I was always thinking of ways to get rid of it someday. It's complex, platform dependent (you need to distribute your shaders compiled specifically for each platform), and doesn't really allow for rapid prototyping and visual shader development as easily as I would like.

Unfortunately, no alternative (especially primitive script- or shader-program-based ones, like the DX effect framework) offers the extreme flexibility and scalability of the DLL-based effect-shader system. So currently we're still using it extensively. Our current system uses over 60 shader DLLs, with more than 300 shaders. And still, it is lightning fast.

But the future lies in a different approach: JIT-compiled shader meta languages. Right now, the resources needed to render a specific effect through a shader are almost always explicitly defined. This has to change: the shader compiler should extract, optimize, and request all needed resources on its own. Things like the number of passes shouldn't even appear in the code anymore: a render pass, a shader program or an attribute is just a resource, much like a CPU register or opcode. An appropriate compiler must automatically recognize and optimize the use of such resources, and request them from the render manager on demand.

Approaches like the Sh meta language go in the right direction, but are still not flexible enough.
Quote:
Original post by Basiror
e.g.: a grass shader with (normal grass, boring grass, animated grass). Let's say you work with an older graphics card and implement the shaders mentioned above; they all provide some properties used to render them. Now you switch to a modern graphics card and want to improve the graphics quality: you just release a new DLL with a higher priority that provides the functionality used by the shaders, just in a modern manner, maybe with enhanced features. The shader DLL itself runs a test on your system to check that everything is available; otherwise the system chooses the old DLL.

This allows you to specialize a generalized representation of an effect for certain graphics cards without rewriting and updating a ton of code.

The shader DLL itself just does the state processing; the effect properties are still separated out into an effects/shader file.


People all over the world update their graphics by releasing patches, which include the executable and the art needed.
Building a DLL system just for ease of updating the graphics is a little pointless, because a typical executable is nothing to download today, and executable patching provides a lot more freedom to implement visual features for modern cards (if we are talking about that) - no shader can ADD new features, nor change the quantities of old ones (say, adding light shafts, fog layers, spider webs over the walls, etc.).
If you go that way, executable patching is the way; DLL patching is half the way, not really there.

A system with priorities looks dangerous. Tweaking parameters by hand and overriding priorities looks dangerous: we create a shader, set a priority, and hope it will be resolved.
It is better to explicitly put the new shader into the effects we want, with the priority we want - that is more stable.
@Yann L:
How do you solve the problem of the large number of shader permutations?
With PS 3.0 hardware you can put 4 or more lights (with shadow maps) into one rendering pass, but each light/fog type should also have its own shader fragment: point lights, spotlights, lights with a cubemap, volumetric fog. Combined with some material shaders, the number of combinations grows exponentially.
Precompiling is impossible, but creating shaders on the fly is also slow. Cg's interfaces are nice, but they don't really help here, because they cause a recompilation. Generating asm shaders is doable, but you lose specific compiler optimizations (and it's not even possible with GLSL).


Quote:
Original post by LarsMiddendorf


One working system that I know of simply compiles all combinations in a preprocess, searches for unique resulting shaders, and removes all duplicates.

Well, as mentioned above, I don't want to encapsulate the shader itself in the DLL. I would rather implement a set of features in the DLL which makes use of the latest extensions available, and in a shader file you define the effects you'd like to achieve with a description language.

At load time the system checks which shaders to load, checks the DLLs for supported features and availability on the current hardware (e.g. multitexturing with 2/4/8/16 layers - just an example, don't rant at me), and links the shaders with the DLLs.

The DLLs just provide some extended functionality you offer the artists, which could not otherwise be implemented with a scripting language.

Of course I could recompile an executable and upload it, but then no modder can customize the rendering pipeline to fit his needs, and I certainly don't want to release my engine source - only the gameplay components, which are encapsulated in a separate DLL as you know it from Half-Life 1.

Light shafts, spider webs or whatever aren't features that are implemented by a shader; light shafts belong to your sky model implementation, in my opinion.

Effects aren't shader related either.

For me a shader defines:
- material properties, effect name on bullet hit
- texture names ...
- blending operations
- parallax mapping, dot3 bump mapping and such

Effects:
- name of effect
- particle effects
- sound effects
- other effect types

To implement the functionality of the shader as I defined it in the context of my engine, the DLL provides an interface to the capabilities your hardware offers.

You can imagine it as a renderer plugin, with the little addition that everyone can customize it and specialize it for certain hardware.


As for supplying information to render, you could implement another system that allows the creation of additional render information, such as a color array for each vertex in a mesh. All you would have to do is implement an API in your mesh class, or whatever you use to represent your geometry, that allows a preprocess to be run which generates the additional data so your shader can handle it.

You could implement a preprocess inside your shader that runs this preprocess on the mesh.

This is just an idea to extend the whole thing without touching the executable's code. Of course you can't implement every gimmick you wish that way, but it allows for a lot of customization without much effort.


[Edited by - Basiror on September 17, 2005 9:00:25 AM]
Quote:
Original post by Basiror
Of course I could recompile an executable and upload it, but then no modder can customize the rendering pipeline to fit his needs, and I certainly don't want to release my engine source - only the gameplay components, which are encapsulated in a separate DLL as you know it from Half-Life 1.


The point of patching by releasing the entire executable was to show that releasing a DLL will not enhance top-level graphics in the way that releasing the whole executable will.

About modding - do you really want your modders to be able to alter the shader database?
It can be done without touching executable code at all, by keeping shaders as source (or at least having the ability to load them that way too), and possibly caching the compiled results.
Or (even better), release the tools that let your artists create shaders! :)
You are still mixing up the shaders and the DLLs.

The shaders are just a description of the surface, plus maybe vertex and pixel shader programs.

The DLL, however, implements a higher, more abstracted layer that provides additional functionality - just as if you'd link the SDL library into your C++ program and use SDL's built-in functions to set up a window.

That's a huge difference in my eyes.
Quote:
Original post by Basiror
You are still mixing up the shaders and the DLLs.

The shaders are just a description of the surface, plus maybe vertex and pixel shader programs.

The DLL, however, implements a higher, more abstracted layer that provides additional functionality - just as if you'd link the SDL library into your C++ program and use SDL's built-in functions to set up a window.

That's a huge difference in my eyes.


Yes, we use different terminology. Give an example of one DLL and shader usage, as you see them being used.
Quote:
Original post by Zemedelec
People all over the world update their graphics by releasing patches, which include the executable and the art needed.
Building a DLL system just for ease of updating the graphics is a little pointless, because a typical executable is nothing to download today, and executable patching provides a lot more freedom to implement visual features for modern cards (if we are talking about that) - no shader can ADD new features, nor change the quantities of old ones (say, adding light shafts, fog layers, spider webs over the walls, etc.).

Of course the DLL approach can do that - that's the whole point of it. And obviously it is not always possible to patch the executable for simple reasons such as application deployment (games are not the only sector where shaders are used), extensibility without access to the application or game source code (what would Max or Maya be without plugins), etc. The whole idea behind the DLL approach is flexibility. Patching the executable completely takes away that flexibility.

Quote:
Original post by Zemedelec
If you go that way, executable patching is the way; DLL patching is half the way, not really there.

Executable patching is often not an option, and to be frank, it is a pretty primitive way of extending functionality of your software. See Plugins.

Quote:
Original post by Zemedelec
A system with priorities looks dangerous. Tweaking parameters by hand and overriding priorities looks dangerous: we create a shader, set a priority, and hope it will be resolved.
It is better to explicitly put the new shader into the effects we want, with the priority we want - that is more stable.

I think you didn't quite understand the system. There are absolutely no stability concerns; the system is stable in itself. In fact, it is much more stable and reliable at selecting an appropriate shader than a manual approach could ever be, especially on large-scale systems with many shaders.

Quote:

How do you solve the problem of the large number of shader permutations?

They're generated and compiled on the fly right now. One must be careful about terminology here: many people use "shader" as an equivalent of "shader program". That is incorrect. A shader is a structure, visual description and algorithm to simulate a certain visual appearance. Part of a shader can be a shader program (e.g. a GLSL program), but this is not mandatory.

Taking this into account, shaders will never suffer from permutation problems. Shader programs will - and those can be auto-generated by their corresponding parent shader. On-the-fly compiling isn't ultra fast, but this was never a real issue in my experience: it takes at most a couple of seconds, when the effect system resolves the shaders.
Quote:
Original post by Zemedelec
Yes, we use different terminology. Give an example of one DLL and shader usage, as you see them being used.



shader <shadername>
{
stream(vertex,normal,color,blendfactors);
.....
}




Now let's say you simply render your scene with pure vertex arrays and basic OpenGL lighting, so the shader above fulfills your needs.

However, the author wants to use dot3 bump mapping, so he has to tell the renderer what he needs:


shader <shadername>
{
stream(vertex,normal,color,blendfactors);
enable dot3_bumpmap.
normalmap "some.tga"
.....
}



The DLL features dot3_bumpmap and normalmap and knows what to do with those keywords; in this case the DLL and the shader are linked implicitly.
Another approach would be to specify the DLL's name within the shader, to guarantee use of a certain DLL.

So for your mesh you either run a preprocess, or you run the initialization the first time you render the mesh, although that might lead to lag spikes.

The TBN basis needs to be calculated, for example.

This is just an example of how I would use it; in addition, you could store separate streams created by the DLL's preprocess.
The whole concept of how geometry interacts with the additional functionality still needs to be outlined in a straightforward and clear way, but that's the basic principle I thought of.


Quote:
Original post by Yann L
Of course the DLL approach can do that - that's the whole point of it. And obviously it is not always possible to patch the executable for simple reasons such as application deployment (games are not the only sector where shaders are used), extensibility without access to the application or game source code (what would Max or Maya be without plugins), etc. The whole idea behind the DLL approach is flexibility. Patching the executable completely takes away that flexibility.


I was talking about the shader-into-DLL approach, not the general DLL concept.
No doubt, updating part of a system rather than the whole system is more flexible, if implemented.

As for DLL updating as such, a rant:
Software that does that and has a plugin structure has serious requirements for doing so - it is often developed by large teams that create/update parts independently. And for other concerns, like stability.
None of that applies to the typical game scenario; games, imho, are not such big software.

Quote:
Original post by Yann L
Executable patching is often not an option, and to be frank, it is a pretty primitive way of extending functionality of your software. See Plugins.

If we target system complexity, that could be a good place to start... :)
My personal decision would be not to trade the design/implementation time for a pluggable system that can update everything, for the sake of uploading 1.5 MB more. I would go for the stable old (KISS) solution... but that's just me, yes.

Quote:
Original post by Yann L
I think you didn't quite understand the system. There are absolutely no stability concerns, the system is stable in itself. In fact, it is much more stable and reliable in selecting an appropriate shader than a manual approach could ever be. Especially on large scale system, with many shaders.

I agree with one thing - this system will ALWAYS produce an output for the requested shader, be it a solid color in the end.
The thing I don't like about it is this problem: I want to manually control the fallback of a given shader on lower-end systems differently and explicitly for effect A and for effect B - keep good specular on both, but remove AO for one, and the opposite for the other.
What I understood of the system you described in the oh-so-long thread was that it will break the description of the shader into a set of smaller shaders, compiled and running on that hardware, with the highest priority.
I can't see how this is more predictable and easier to develop for a typical not-shader-intensive game. For me, it's like having quite a lot of renderpaths and being unable to guarantee the visual quality in any one of them. Maybe it is suitable for other software, but for games - I can't agree.

And for the sake of "graphic theory" nature of the forum, I totally agree with the post about shader abstraction beyond the passes and profiles.
Quote:
Original post by Zemedelec
I was talking about the shader-into-DLL approach,

Me too. The plugin shader concept is fully able to do what you described.

Quote:
Original post by Zemedelec
As for DLL updating as such, a rant:
Software that does that and has a plugin structure has serious requirements for doing so - it is often developed by large teams that create/update parts independently.

And games aren't ?

Quote:

And for other concerns, like stability. None of that applies to the typical game scenario; games, imho, are not such big software.

Excuse me, but this made me laugh! Look, the software industry, and the game industry even more so, is a very competitive market. Budgets become larger every year, and the code complexity of games increases, but the timescale to finish a game gets shorter and shorter. The market forces this.

In order to survive, you need to find innovative solutions to increase both quality and productivity. Of course, you could code a game the "old school way", with hardcoded and manually optimized render paths for each effect and each target hardware. It would probably even take less time than creating a flexible plugin-based approach, if starting the latter from scratch. So you sell your game, and everything is nice and fine - until a year or two later, when you need to release the next game. Unfortunately, hardware has changed a lot, so you need to completely rewrite your hardcoded engine: new effects, new hardware features, shifted bottlenecks, new shader structure. If, however, you invested the time in the plugin system, updating your engine to the newest standards is a breeze, without even sacrificing backwards compatibility for older hardware.

Reusability is the key word. Today's and tomorrow's development must target reusable frameworks that are easily extendable and scalable over time, maybe even by completely different development teams. Think third-party licensing, for example. With hardcoded solutions, you won't go anywhere in the future, especially not from a financial point of view. Until you have adjusted your hardcoded shader paths to the new hardware requirements, your competitor has updated a few DLLs (or static libs) and is already selling his brand new eye-candy-laden game.

Quote:

If we target system complexity, that could be a good place to start... :)

Don't underestimate the complexity of a modern game.

Quote:

My personal decision would be not to trade the design/implementation time for a pluggable system that can update everything, for the sake of uploading 1.5 MB more. I would go for the stable old (KISS) solution... but that's just me, yes.

It's not about file size, it's about competitiveness and scalability.

Quote:

I can't see how this is more predictable and easier to develop for a typical not-shader-intensive game.

In what time are you living? Typical not-shader-intensive games? All new 3D engines targeted at DX9/10 cards, Xbox 360, PS3, etc. almost drown in shaders! The time of not-shader-intensive games is long over.

Quote:

For me, it's like having quite a lot of renderpaths and being unable to guarantee the visual quality in any one of them. Maybe it is suitable for other software, but for games - I can't agree.

I think I have the better argument here: I have actually had such a system running on a commercial basis for a couple of years now :) And it works very well. No, it might not be for a game - but our software has very similar requirements to a very high-end game on the graphics side. In fact, you could probably turn the application into a game quite easily, given the artwork, a new interface and some AI.

The current system might not be what we would like to see in the medium-term future (as I mentioned above, we're looking more into meta shaders), but we will certainly not go back to the stone age of hardcoded-renderpath graphics development.

My suggestion: just try it out before bashing it. You might be surprised at the extreme flexibility it can offer you.
Quote:
Original post by Yann L
Me too. The plugin shader concept is fully able to do what you described.


So, DLL-based shaders are able to interact with the world to the point of adding new features, like grass, light shafts, etc.?
Or did I miss something?

Quote:
Original post by Yann L
And games aren't ?


Most games - aren't. *Some* of the licenseable *engines* are. But their business model simply requires it, hands down.

Quote:
Original post by Yann L
In order to survive, you need to find innovative solutions to increase both quality and productivity. Of course, you could code a game the "old school way", with hardcoded and manually optimized render paths for each effect and each target hardware. It would probably even take less time than creating a flexible plugin-based approach, if starting the latter from scratch. So you sell your game, and everything is nice and fine - until a year or two later, when you need to release the next game. Unfortunately, hardware has changed a lot, so you need to completely rewrite your hardcoded engine: new effects, new hardware features, shifted bottlenecks, new shader structure. If, however, you invested the time in the plugin system, updating your engine to the newest standards is a breeze, without even sacrificing backwards compatibility for older hardware.


Emotions aside - where did I say that "my" approach hardcodes anything anywhere?
I suggested (above on this page) that it is enough for a shader system to be data-driven, without becoming plugin-based and code-driven. It can adapt to different hardware and different scene requirements quite nicely. It can declare and use unique resources, like its own textures, quite well.
Currently my implementation cannot declare new complex vertex declarations where things are packed in unusual ways, but I can't see how your system will do that either: creating very special geometry, like grass/clouds/particles, where vertices have a very special format beyond (texcoordN, colorY, ...).
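
For contrast, a data-driven system of the sort described here usually expresses shaders as text assets that the engine parses at load time. A purely illustrative sketch follows (the syntax is invented, not taken from any actual engine):

```
material "rusty_metal"
{
    technique sm20              // fallback path for SM2.0 hardware
    {
        pass
        {
            vertex_shader  "metal_vs.hlsl"
            pixel_shader   "metal_ps_sm20.hlsl"
            texture diffuse "rusty_diffuse.dds"
            texture normal  "rusty_normal.dds"
            blend none
        }
    }
}
```

Such a file can switch techniques per hardware tier and bind its own textures, but it cannot introduce new vertex packing or new rendering logic, which is exactly the limitation acknowledged above for grass/clouds/particles.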

Quote:
Original post by Yann L
Reusability is the key word. Today's and tomorrow's development must target reusable frameworks that are easily extendable and scalable over time, maybe even by completely different development teams. Think third-party licensing, for example. With your hardcoded solutions, you won't go anywhere in the future, especially not from a financial point of view. Until you have adjusted your hardcoded shader paths to the new hardware requirements, your competitor has updated a few DLLs (or static libs) and is already selling his brand new eye-candy-laden game.


Again, I fail to see where I presented my solution as hardcoded... :)
And, by the way, have you seen some of the licensable engines that have leaked? Those of them that are based on games have quite a lot of source that is neither plugin-based nor very clean.

Quote:
Original post by Yann L
Don't underestimate the complexity of a modern game.


I don't. The complexity of games is high, but it is distributed very widely between many, very different components.
As for code size, it rarely comes even close to the complexity of modeling packages, for example.


Quote:
Original post by Yann L
In what time are you living? Typical not-shader-intensive games? All new 3D engines targeted at Dx9/10 cards, XBox 360, PS3, etc. almost drown in shaders! The time of not-shader-intensive games is long over.


I'm talking about something like 30-40 shaders, covering new and old hardware (without the new consoles, which I have yet to see).
Such a quantity of shaders is quite enough for a very large number of games, with a reasonably fixed lighting scheme. And I know games that used fewer and still look amazing. Take World of Warcraft, for example.

Quote:
Original post by Yann L
I think I have the better argument here: I actually have such a system running on a commercial base for a couple of years now :) And it works very well. No, it might not be for a game - but our software has very similar requirements compared to a very high end game from the graphical side. In fact, you could probably turn the application into a game quite easily, if you have the artwork, change the interface and add AI.


I never said I don't believe in the creation/existence of such a system, nor even that it is flawed in some way. I personally tried to design and implement a similar system, maybe 2 years ago. Now I'm not quite sure that it's worth the effort and the iterations to fine-tune it.

Quote:
Original post by Yann L
My suggestion: just try it out before bashing it. You might be surprised at the extreme flexibility it can offer you.


I'm thinking about enhancing our system now, but in a slightly different way.
My arguments against such a system?
- People who hypothetically license our engine can add new shaders/effects quite easily; plugging in something new is straightforward. Touching the source is needed only when new vertex declarations must be introduced.
- I need to make it easy to develop, and thus competitive, by building a strong and efficient art pipeline. That means making artists happy and letting them touch and modify shaders at will, as intuitively as possible. An artist can work wonders even with blend states; a programmer can rarely make something beautiful, even with SM3.

So the main research area was opening the shaders to the art pipeline, offloading the pixels to the people they belong to: the artists... :)
The UE3 shader editor is a good example of the new trend.

P.S.: One thing I forgot to comment on: you said "shifted bottlenecks, new features".
Hardware generations don't change that quickly, and adapting to them is best done in the core engine itself. I can't imagine how a tiny little thing like a shader (the shading scheme of a surface, in the end) can adapt itself to something like predicated rendering, for example. You would need to (a) redesign some subsystems of the rendering engine, or (b) design the system already knowing what the future will bring. I.e., new generations always tend to force rewriting/redesigning, as reality shows with many, many examples.
Quote:
Original post by Basiror
now let's say you simply render your scene with pure vertex arrays and basic OpenGL lighting, so the shader above fulfills your needs
...
however, the author wants to use dot3 bumpmapping, so he has to tell the renderer what he needs


Try one more example: grass that is represented ingame as a 2D array containing density/type, where you (possibly) need to pack that into small, packed vertices holding things like the vertex offset from the center of the grass quad, or sin/cos values packed into the .w component of the position for particles.
How will these be described?
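
To make the example concrete, here is one hypothetical layout for such a packed grass vertex. The field choices and sizes are my own illustration, not from any engine discussed in this thread:

```cpp
#include <cstdint>
#include <cmath>

// Hypothetical packed grass vertex: quad-center position in the first
// three floats, the per-corner rotation angle quantized into 16 bits,
// plus bytes holding density/type sampled from the 2D grass map.
#pragma pack(push, 1)
struct GrassVertex {
    float    cx, cy, cz;   // center of the grass quad
    uint16_t angle;        // corner angle, 0..65535 -> 0..2*pi
    uint8_t  density;      // 0..255, from the grass density map
    uint8_t  type;         // grass species / texture index
};
#pragma pack(pop)

inline uint16_t packAngle(float radians) {
    const float twoPi = 6.28318530718f;
    float t = radians / twoPi;
    t -= std::floor(t);                    // wrap into [0,1)
    return static_cast<uint16_t>(t * 65535.0f + 0.5f);
}

inline float unpackAngle(uint16_t a) {
    return (a / 65535.0f) * 6.28318530718f;
}
```

A vertex shader would then reconstruct the quad corner from the packed angle, which is the kind of per-shader knowledge that is hard to express in a purely declarative description.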

Quote:
Original post by Basiror
the TBN needs to be calculated ....


The TBN isn't something you want to calculate at load time. It causes vertex splits, making the mesh (slightly) less optimal; after TBN computation you would want to re-optimize the mesh, which makes TBN computation a preprocess step.

Even more important: developing for consoles, and building streaming engines, means load-in-place resources, loaded directly into memory without any precious time for *any* preprocessing.

So, if the design forces you to recompute such an obvious preprocess step as the TBN at runtime, it is not a good design.
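
As an illustration of why this belongs in an offline tool, the core of the work is a per-triangle solve like the following sketch (the standard tangent-space derivation; the small math types are placeholders):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Per-triangle tangent/bitangent from positions and UVs. A real tool
// would accumulate these per vertex, orthonormalize against the normal,
// and split vertices where UV mirroring flips the handedness (the
// "vertex splits" mentioned above) -- all before runtime.
void computeTB(const Vec3 p[3], const Vec2 uv[3], Vec3& T, Vec3& B) {
    Vec3 e1 = sub(p[1], p[0]);
    Vec3 e2 = sub(p[2], p[0]);
    float du1 = uv[1].u - uv[0].u, dv1 = uv[1].v - uv[0].v;
    float du2 = uv[2].u - uv[0].u, dv2 = uv[2].v - uv[0].v;
    float det = du1 * dv2 - du2 * dv1;
    float r = (std::fabs(det) > 1e-8f) ? 1.0f / det : 0.0f;
    T = { (e1.x*dv2 - e2.x*dv1)*r, (e1.y*dv2 - e2.y*dv1)*r, (e1.z*dv2 - e2.z*dv1)*r };
    B = { (e2.x*du1 - e1.x*du2)*r, (e2.y*du1 - e1.y*du2)*r, (e2.z*du1 - e1.z*du2)*r };
}
```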

P.S.: And it is good to separate the shader from the direct description of its instances, so that things like texture names, which differ between the clients of that shader, reside somewhere else...

P.S.2: And think about wanting to render one mesh with many shaders (at one time, or maybe changing during gameplay): what will you load, how will you process that mesh, and how many instances will you create?
Quote:
Original post by Zemedelec
So, DLL-based shaders are able to interact to the point of adding new features to the world, like grass, light-shafts, etc....?
Or I missed something?

No offense, but I would really suggest you try to understand the system we're talking about before discussing its supposed shortcomings.

To answer your question: of course the system can add these effects - that's the whole idea of a plugin system! All effects in our current engine - light shafts, grass, procedural vegetation, parametric terrain, water, fire, clouds, atmosphere, halos, billboards, fractals, and so on - are exclusively rendered through plugin shaders. I even added several raytracing modules as shaders (for a test, because they were horribly slow ;), even though the underlying rendering model is completely different.

You seem to think that the plugin architecture merely mimics a kind of .FX file in code. Well, that would be rather stupid, wouldn't it? Instead, it contains pluggable micro render cores.

As I said before: a shader is more than just a piece of GLSL or HLSL code. It's a system that describes the visual appearance of an object or effect. A shader can generate geometry and modify it. It can read from, create and destroy light sources. It can apply animation, LOD systems, or evaluate procedural and fractal geometry.
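
A minimal sketch of the DLL boundary such a system implies, along the lines of the base-class/factory idea from the opening post. All names here are hypothetical, and a real micro render core would expose a much richer interface (geometry generation, light access, LOD hooks):

```cpp
#include <cstring>

// Hypothetical engine-side interface, compiled into the engine.
// Plugin DLLs derive from it and export a C factory function so the
// engine can instantiate shaders without knowing the concrete class.
class IShaderPlugin {
public:
    virtual ~IShaderPlugin() {}
    virtual const char* name() const = 0;               // e.g. "grass_sm30"
    virtual bool supportsHardware(int shaderModel) const = 0;
    virtual void render(/* engine-supplied context */) = 0;
};

// ---- plugin side (would live in the shader DLL) ----
class GrassShader : public IShaderPlugin {
public:
    const char* name() const override { return "grass_sm30"; }
    bool supportsHardware(int sm) const override { return sm >= 30; }
    void render(/* engine-supplied context */) override {
        // generate and draw the grass geometry here
    }
};

// Exported with C linkage so the engine can resolve it by name via
// GetProcAddress (Windows) or dlsym (POSIX) after loading the DLL.
extern "C" IShaderPlugin* CreateShaderPlugin() {
    return new GrassShader();
}
```

The engine never sees `GrassShader` itself; it only calls the exported factory and talks to the returned object through `IShaderPlugin`.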

I think we're really talking about two completely different systems here.

Quote:
Original post by Zemedelec
I suggested (above on this page) that it is enough for a shader system to be data-driven, without becoming plugin-based and code-driven. It can adapt to different hardware and different scene requirements quite nicely. It can declare and use unique resources, like its own textures, quite well.
Currently my implementation cannot declare new complex vertex declarations where things are packed in unusual ways, but I can't see how your system will do that either: creating very special geometry, like grass/clouds/particles, where vertices have a very special format beyond (texcoordN, colorY, ...).

It goes far beyond the vertex format. That is just a minor detail, and of course a plugin-based approach can generate and convert between any vertex formats you can imagine. We even use it to decompress large amounts of vertex data through zlib on the fly, within a shader! Try doing that with a data-driven approach...

Maybe I don't really understand what you're doing either, so please correct me if I'm wrong, but your system sounds a lot like a Quake3-style engine to me. Sure, that works. But is it ready for the future? Nope.

I agree that a full plugin system is a poor choice for a beginner, as the complexity to implement the framework is overwhelming. But for an advanced amateur (and of course for the professional developer), this will definitely pay off. It becomes more and more difficult for small businesses or indie game developers to keep up with technical developments in the hardware sector. A plugin based system can make this much, much easier.

Quote:

Such a quantity of shaders is quite enough for a very large number of games, with a reasonably fixed lighting scheme. And I know games that used fewer and still look amazing. Take World of Warcraft, for example.

We seem to have a different definition of "amazing" ;)

Quote:

Hardware generations don't change that quickly, and adapting to them is best done in the core engine itself

No, it isn't. That's pretty much the worst approach there is.

Quote:

- I can't imagine how a tiny little thing like a shader (the shading scheme of a surface, in the end) can adapt itself to something like predicated rendering, for example.

As I said, please read up on the system again before making incorrect assumptions. We are not talking about a simple surface description here!

Quote:

You would need to (a) redesign some subsystems of the rendering engine,

That's exactly what the micro render cores do. Divide and conquer: you add features as they come in. Oh look, I read about this new displacement mapping shader in a research paper a few days ago. I would just write it as a plugin, compile it to a DLL, and copy it into my engine's plugin directory. And voilà, that's it. Even if that shader completely modified the standard render pipeline - because in my approach, there is no standard pipeline!

By avoiding touching the core, you also avoid breaking other parts of your code as you add new features. You don't need knowledge of the engine internals either; everything runs over standardized interfaces. So new effects (even those that would require a substantial modification of the render pipeline in your system) can be added without hassle, by several different people, or be contributed by third parties.
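
The startup side of this workflow is just a directory scan followed by explicit loading, as suggested earlier in the thread. A portable sketch using C++17 std::filesystem follows; the directory name and factory symbol are assumptions, and the actual loading would use LoadLibrary/GetProcAddress on Windows or dlopen/dlsym elsewhere:

```cpp
#include <filesystem>
#include <string>
#include <vector>

// Collect every shader DLL found in the plugin directory so each can
// then be loaded explicitly at runtime. The extension would be ".dll"
// on Windows, ".so" on Linux.
std::vector<std::string> findShaderPlugins(const std::string& dir,
                                           const std::string& ext = ".dll") {
    std::vector<std::string> found;
    namespace fs = std::filesystem;
    if (!fs::exists(dir)) return found;
    for (const auto& entry : fs::directory_iterator(dir)) {
        if (entry.is_regular_file() && entry.path().extension() == ext)
            found.push_back(entry.path().string());
        // For each hit: LoadLibrary/dlopen the file, then resolve the
        // exported factory, e.g. GetProcAddress(h, "CreateShaderPlugin").
    }
    return found;
}
```

Because nothing outside this scan needs to know which DLLs exist, dropping a new shader DLL into the directory is all a third party has to do.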

So, plugins are a perfect middle way between old-school inflexible pipelines and the complete abstraction of the rendering system into meta shaders. Once we have well-working meta shaders (we will probably need hardware-supported JIT compilers for that), we can just trash the plugin approach. And I'll be happy about it, because the system does in fact have several drawbacks. Just not the ones you were thinking of :)
Lol, I hate this: after about a year of development I'm close to completing the next incarnation of my shader system, and I am being/have been convinced I've taken the wrong course of action. Again. Yay! [wink]
