I'm thinking of having a set of 'effects' like bloom, blur, motion blur and DOF, and I'd like to be able to apply them at any time during the game.
What I think can be done is:
<<preprocessing rendering>>
- render the frame to textureA

<<postprocessing>>
- iterate through the effects in a std::vector; for every effect:
  - bind textureA
  - pass textureA in as the sampler
  - run the effect shader on a quad with textureA bound
  - store the result in textureA
  - unbind textureA

<<final output>>
- render a fullscreen quad with textureA bound
In the end this approach returns a texture that is the result of all the effects in the std::vector applied in sequence.
Not bad, but it means I have to add effects carefully, since the position of each effect in the std::vector matters a lot.
What do you think?
Is there a better, tested solution?
Edited by xynapse, 07 May 2012 - 05:10 AM.