Multiple shaders

Started by mvtapia. 2 comments, last by mvtapia 17 years, 9 months ago
Hello everyone. I have a question about shaders. I want to be able to use multiple shaders in a game. I'm kind of new to shaders, so please feel free to correct me. I'm currently using the ID3DXEffect interface for this. I want to use multiple effects in my application, like lighting, blur, monochrome, etc., and I'm not really sure how to do it. When I render with an effect I have to draw a subset of a mesh, and I don't know if I have to do that for every effect; it seems like a lot of overhead.

The other thing is that I want each effect to be in a separate file, so if I want to make a new game I just drop a new .fx file in the folder and have a script that loads the effect into the engine. That last part I've already done: I loaded two effects and they are active in the application. However, I haven't tested rendering two different effects because I don't know the proper way to do it. When I render the second effect, like a light, will the first effect disappear or will it persist?

To make everything clearer, this is how it goes: set the texture, do the monochrome effect, render the submesh; then set the texture for the new effect, do the blur, and render the submesh again. Will both effects get combined, or can I not do that? And what if the second effect is a light? Any help would be great, thanks.
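In code, the sequence I have in mind is roughly this (just a rough sketch; the effect, mesh, and texture names are made up):

#include <d3dx9.h>

// Rough sketch of the sequence above; monoEffect, blurEffect, mesh and
// texture are placeholder names, not anything from a real engine.
void DrawWithTwoEffects(ID3DXEffect* monoEffect, ID3DXEffect* blurEffect,
                        ID3DXMesh* mesh, IDirect3DTexture9* texture)
{
    UINT passes = 0;

    // set the texture, do the monochrome effect, render the submesh
    monoEffect->SetTexture("tex_Color", texture);
    monoEffect->Begin(&passes, 0);
    monoEffect->BeginPass(0);
    mesh->DrawSubset(0);
    monoEffect->EndPass();
    monoEffect->End();

    // set the texture for the new effect, do the blur, render the submesh again
    blurEffect->SetTexture("tex_Color", texture);
    blurEffect->Begin(&passes, 0);
    blurEffect->BeginPass(0);
    mesh->DrawSubset(0);
    blurEffect->EndPass();
    blurEffect->End();
}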
Marco Tapia
Nope, from what I know it doesn't combine the effects. The second effect will override the first. When you render with the second effect, the vertices fed into the shader are recalculated for the blur effect, so all the previous values from the monochrome effect will be gone.

One way to try to solve this is fragment linking, but I don't really think the fragment linker will work here, since both the monochrome and blur effects use the normals.
Maybe you could research fragment linkers some more.
I have a .fx file now that has different functions for doing Phong or Blinn lighting with the different texture states (wrap, clamp, mirror). I am also adding other options like the ability to add a normal map to a texture.

When I created different techniques for each rendering effect, I found that doing multiple effect->SetTechnique() and effect->Begin() calls each frame really slowed things down, even when rendering just one mesh. So I got rid of the separate techniques and made the different rendering options unique passes within one technique. It is a LOT faster. Now I do one SetTechnique and Begin call per mesh per frame, and then call BeginPass with the pass that has the effect I want for each subset of the mesh. I have separate passes for drawing wireframe and for drawing a point buffer for a mesh.
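For example, the per-mesh loop ends up looking roughly like this (only a sketch; ChoosePassForSubset and the technique name are placeholders for whatever your engine actually uses):

#include <d3dx9.h>

// Placeholder: returns the index of the pass that applies the effect this
// subset needs (plain texture, normal map, wireframe, point buffer, ...).
UINT ChoosePassForSubset(DWORD subset);

void RenderMesh(ID3DXEffect* effect, ID3DXMesh* mesh, DWORD numSubsets)
{
    effect->SetTechnique("RenderScene");   // one SetTechnique per mesh per frame

    UINT numPasses = 0;
    effect->Begin(&numPasses, 0);          // one Begin per mesh per frame

    for (DWORD subset = 0; subset < numSubsets; ++subset)
    {
        effect->BeginPass(ChoosePassForSubset(subset)); // pick the pass per subset
        mesh->DrawSubset(subset);
        effect->EndPass();
    }

    effect->End();
}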

I looked at the fragments sample in the SDK, but don't want different vertex and pixel shaders for each mesh. Instead I have different .fx files that can be shared among meshes of a type.

Also, I have global BOOLs that indicate whether the vertex buffer has texture coords and tangents/binormals. The vertex shader input structure has all possible inputs defined, and I use the globals to determine whether the vertex buffer actually supplies the optional ones.
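On the application side that just means setting the flags before drawing each mesh, something like this (the parameter names here are made up, not from my actual .fx file):

#include <d3dx9.h>

// Sketch: tell the effect which optional vertex inputs this mesh actually has.
// "HasTexCoords" and "HasTangents" stand in for whatever the .fx file declares.
void SetVertexFlags(ID3DXEffect* effect, bool hasTexCoords, bool hasTangents)
{
    effect->SetBool("HasTexCoords", hasTexCoords ? TRUE : FALSE);
    effect->SetBool("HasTangents",  hasTangents  ? TRUE : FALSE);
}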

The effects framework is pretty flexible as long as you can work within the hardware restrictions of the VS and PS versions you want to support.

I found, though, that I cannot run with the DirectX debug mode set to maximum validation, because it requires that your vertex buffer perfectly match your shader inputs.
--------------------------
Most of what I know came from Frank D. Luna's DirectX books
Thanks for the reply, and sorry I took so long to get back.
OK, so I can't do multiple effects without using fragments, so I think it's best if I do something like DXnut said.
Now I have a question about this monochrome routine. I have a set of textures that I plan to use for different things in my engine; I don't know what for yet.
Now, I was wondering if I'm doing the right thing.
I'm trying to write an .fx file in FX Composer, but since I'm learning I'm getting one error; I know why it's there, but I don't know how to fix it. Here is the .fx:

//------------------------------------
// Game Engine Effects.
//------------------------------------

//------------------------------------
// TECHNIQUE VARIBLES.
//------------------------------------
bool Monochrome;

//------------------------------------
// Lighting variables.
//------------------------------------
bool   lig_enable;
int    lig_Num;
float3 lig_Color;
float3 lig_Pos;

//------------------------------------
// Texture Variables.
//------------------------------------
texture tex_Color;
texture tex_Normal;
texture tex_Position;
texture tex_Velocity;

sampler2D g_samSrcColor = sampler_state
{
    Texture   = <tex_Color>;
    AddressU  = Wrap;
    AddressV  = Wrap;
    MinFilter = Point;
    MagFilter = Linear;
    MipFilter = Linear;
};

sampler2D g_samSrcNormal = sampler_state
{
    Texture   = <tex_Normal>;
    AddressU  = Wrap;
    AddressV  = Wrap;
    MinFilter = Point;
    MagFilter = Linear;
    MipFilter = Linear;
};

sampler2D g_samSrcPosition = sampler_state
{
    Texture   = <tex_Position>;
    AddressU  = Wrap;
    AddressV  = Wrap;
    MinFilter = Point;
    MagFilter = Linear;
    MipFilter = Linear;
};

sampler2D g_samSrcVelocity = sampler_state
{
    Texture   = <tex_Velocity>;
    AddressU  = Wrap;
    AddressV  = Wrap;
    MinFilter = Point;
    MagFilter = Linear;
    MipFilter = Linear;
};

//------------------------------------
// MonoChrome Routines.
//------------------------------------
float4 LuminanceConv = { 0.2125f, 0.7154f, 0.0721f, 1.0f };

float4 COLOR_MC( float2 Tex : TEXCOORD0 ) : MONO_COLOR
{
    float grayScale = dot( (float3)tex2D( g_samSrcColor, Tex ), LuminanceConv );
    return float4( grayScale, grayScale, grayScale, 1.0 );
}

//------------------------------------
// Pixel Shader Routine.
//------------------------------------
float4 PostProcessPS()
{
    float4 tempTex;

    if ( Monochrome )
        tempTex = COLOR_MC();
}

//------------------------------------
// Technique: TECH
// Desc: Performes the technique for the
// engine effects on a mesh.
//------------------------------------
technique TECH
{
    pass p0
    {
        VertexShader = null;
        PixelShader  = compile ps_2_0 PostProcessPS();
    }
}


Well, as you can see, it's pretty clear that the error comes when I call the COLOR_MC() function, because that function does not take 0 arguments. Now, if I call the function from the technique like so:

PixelShader = compile ps_2_0 COLOR_MC();

It works fine. Why is that? I don't feed it any arguments.
The other thing is, you'll notice

float4 tempTex;

Now, this is just something I kind of made up. Can I store the texture color in there and return it when the shader finishes?
One more thing: after one part of the effect is processed, I need to send the already-processed texture to a new routine to be processed again with a different effect. How do I do that?
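What I imagine is rendering the first effect into a texture and then feeding that texture to the second effect, something like the sketch below, but I don't know if that's the right way (the function and variable names are made up; tex_Color is the texture parameter from the .fx above):

#include <d3dx9.h>

// Sketch of the chaining I have in mind. "intermediate" would be a texture
// created with D3DUSAGE_RENDERTARGET; every name here is a placeholder.
void ChainEffects(IDirect3DDevice9* device, ID3DXEffect* firstEffect,
                  ID3DXEffect* secondEffect, ID3DXMesh* mesh,
                  IDirect3DTexture9* intermediate, IDirect3DSurface9* backBuffer)
{
    IDirect3DSurface9* intermediateSurface = NULL;
    intermediate->GetSurfaceLevel(0, &intermediateSurface);

    // Step 1: draw with the first effect into the intermediate texture.
    device->SetRenderTarget(0, intermediateSurface);
    device->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);

    UINT passes = 0;
    firstEffect->Begin(&passes, 0);
    firstEffect->BeginPass(0);
    mesh->DrawSubset(0);
    firstEffect->EndPass();
    firstEffect->End();

    // Step 2: hand the processed texture to the second effect and draw to the
    // back buffer (for a true post-process this draw would usually be a
    // full-screen quad rather than the mesh again).
    device->SetRenderTarget(0, backBuffer);
    secondEffect->SetTexture("tex_Color", intermediate);
    secondEffect->Begin(&passes, 0);
    secondEffect->BeginPass(0);
    mesh->DrawSubset(0);
    secondEffect->EndPass();
    secondEffect->End();

    intermediateSurface->Release();
}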
Marco Tapia

This topic is closed to new replies.
