Modern DirectX and the Effects framework
Moderators - Reputation: 46571
Posted 18 May 2014 - 03:41 AM
The Effects framework is just one way to organize and use shader programs inside your engine.
It gives you Effects, which have Techniques, which have Passes, which have shader programs and optional render state. It also gives you some reflection/meta-data features, and a system to set shader variables (uniforms, textures) by name.
That's a fairly good structure to borrow from if you build your own instead.
My engine's system is similar, except I don't have Effects - each file represents a single technique. Also, passes for me represent different parts of the scene rendering phases, such as opaque, alpha-blended, depth-only/shadow, g-buffer attributes, etc... and I didn't want to supply shader variables in the way that the Effects framework is designed to - i.e. my ideal abstraction for interacting with shaders was different from what Effects would've given me.
Members - Reputation: 886
Posted 18 May 2014 - 07:03 AM
To be honest, I'm a bit confused - I have a couple of DX11 books, I'm reading online tutorials, and it all seems good and fresh. But due to recent changes in Windows (i.e. the removal of D3DX, out-of-the-box shader compilation, etc.), all the examples simply don't work and need refactoring. And it seems the only examples that run without any changes in VS 2013 are a couple of Windows Store templates on the MSDN code samples site. Can you advise up-to-date resources? Maybe groups or chats?
Members - Reputation: 3542
Posted 18 May 2014 - 04:54 PM
About D3DX: yes, D3DX was (finally!) deprecated; however, Chuck Walbourn made a set of projects that can be very helpful as a replacement. You can read more about it here: http://blogs.msdn.com/b/chuckw/archive/2013/08/21/living-without-d3dx.aspx
Here are the four projects (... and maybe they will become five this summer, with a new library replacing the old D3DXMesh).
Edited by Alessio1989, 18 May 2014 - 04:54 PM.
Members - Reputation: 1670
Posted 19 May 2014 - 01:34 AM
So, it's basically a wrapper around API state changes, constants, and buffer settings, right? What's the problem with, say, rendering all opaque objects, then manually changing the necessary data/states (via DX calls) and rendering particles, then rendering transparent objects? I come from the Adobe Flash 3D world, where we don't have frameworks like this - we write shaders in a sort of assembler and just have API calls (although very limited ones). And I see many benefits in such manual coding.
Yes, it's just a "smart wrapper" and yes, you can completely ignore it and make your own system (or do everything manually all the time if you want).
I know the Effect Framework only from DX9, and I have no idea whether it changed significantly in DX10/11, but in DX9 it was quite a good system for people learning the programmable pipeline, because it made everything very straightforward and simple. But as you start to understand better how everything works, you usually realise that the Effect Framework may not be optimal if you try to optimize device calls and the whole rendering process - because you don't really know what's going on under the hood with its automatic state changes, texture setting, samplers, CPU pre-shaders, etc.