Easy? Passing color through effects/shaders


11 replies to this topic

#1 cozzie   Members   -  Reputation: 1446


Posted 25 February 2013 - 04:09 PM

Hi all,

 

I'm implementing a new shader system which basically does the following:

- render all mesh instances with one 'basic'/uber effect (light sources, transformation, single textured material and blending)

- render specific mesh instances again if they have an additional effect assigned to them

 

The basics are in place and working, but I haven't yet managed to pass the end result of one effect/shader through to the other.

 

=> I have a vertex declaration with position, normal and texcoord (no color)

=> the 1st vertex + pixel shader delivers a specific color

=> the 2nd vertex + pixel shader doesn't 'know' the result of the 1st effect/shader

 

I've tried adding a color component to my vertex declaration, but that introduces trouble for my source models/meshes, because their vertices don't have a 'color' component.


What I also tried is to give the 2nd effect only a vertex shader, no pixel shader.

But that runs into errors saying it's looking for shader input that belongs to the 1st effect (maybe something in my code is looking at the wrong effect?)

 

Can someone give me some pointers on how to achieve this?

All help is really welcome.


Edited by cozzie, 25 February 2013 - 04:11 PM.



#2 phil_t   Crossbones+   -  Reputation: 3202


Posted 25 February 2013 - 06:27 PM

I'm still not sure what you're trying to do. What does this mean?: "I can't manage (yet) to pass through the end result of one effect/ shader to the other."

 

 

Are you trying to "pass" the color of a rendered pixel? Typically this would be accomplished by having your first effect render to a render target. Then the second effect samples from the render target (texture) with the appropriate texture coordinates. Is that what you're trying to do?
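
Roughly, in D3D9 terms the render-target approach looks something like this (just a minimal sketch, not drop-in code; 'device', 'width', 'height', 'secondEffect' and the parameter name are placeholders for your own objects):

// Create a texture that can be used as a render target.
IDirect3DTexture9* rtTexture = NULL;
IDirect3DSurface9* rtSurface = NULL;
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rtTexture, NULL);
rtTexture->GetSurfaceLevel(0, &rtSurface);

// Pass 1: render with the first effect into the texture.
IDirect3DSurface9* backBuffer = NULL;
device->GetRenderTarget(0, &backBuffer);
device->SetRenderTarget(0, rtSurface);
// ... draw the meshes with the first effect ...

// Pass 2: restore the back buffer and feed the texture to the second effect,
// whose pixel shader samples it with the appropriate texture coordinates.
device->SetRenderTarget(0, backBuffer);
secondEffect->SetTexture("gSceneTexture", rtTexture);   // parameter name is made up
// ... draw with the second effect ...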



#3 C0lumbo   Crossbones+   -  Reputation: 2119


Posted 26 February 2013 - 02:15 AM

As phil_t says, generally in a pixel shader there's no way to access the colour contained in the render target you're rasterising to in DirectX 9 (although I'm aware of at least one console that lets you access this, which is a pretty cool feature). In DirectX 9 the only way you can do manipulations with the render target's colour (aka the 'destination colour') is through the blend stage, which is a fixed-function part of the pipeline.

 

Now, you could do what you want with a render-to-texture approach, but that's quite fiddly. Probably what you should do is either figure out a way to blend the results of your two shader pairs through the blend functions (possible if it's only a simple operation), or boil the two pairs of shaders down into a single pair.
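
For example, if all the second pass needs to do is modulate (tint) what the first pass already wrote, the blend stage alone can do that; something like this (a sketch, the exact states depend on the operation you want):

device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
device->SetRenderState(D3DRS_BLENDOP,   D3DBLENDOP_ADD);
device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_DESTCOLOR); // source * destination
device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ZERO);      // result = src * dest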



#4 cozzie   Members   -  Reputation: 1446


Posted 26 February 2013 - 02:23 PM

Thanks for the reactions.
Phil, your assumption is correct. The answer you gave might be the solution: render to a render target first, then run the 2nd effect/shader with the render target as "input".
Although combining into one shader/effect instead of two sounds a lot easier, definitely compared to rendering to a texture and blending the results. For now I'll go "the easy way", which doesn't hinder what I want to achieve at the moment.

This does make me curious, though, how game studios/developers manage this, hearing that games use 'numerous' shaders. I can understand it, since I now also have about 20 with the same goal, varying only in the number of lights, supported shader model, etc.

So when you make another shader/effect, do you then copy all the lighting code from one effect to the other to keep lighting correct?
(Probably too simple to put it this way, but that's on purpose, because I think 'simple' is positive instead of negative :))

#5 nfactorial   Members   -  Reputation: 717


Posted 26 February 2013 - 02:52 PM

I think you need to back up a little and describe what it is you're trying to achieve, as it's unlikely that you need to do what you are asking.

 

Sure, you can render to a target, then pass that target back in to a new shader. But there's usually a specific reason you need to do that (lots of different reasons, but all specific).

 

For example, if you simply wish to have a shader calculate the lighting for one light, you do not need to access the frame buffer as an input to the second light's shader; you can simply output the second lit pass with an additive blend. This is called 'multi-pass' rendering.

 

For example, you can do (pseudo code):

SetBlendMode( Additive );
foreach ( light in lightList )
{
   SetShaderArgs( lightInfo );
   RenderGeometry( geometry );
}
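
In D3D9 terms that might look roughly like this (still a sketch; 'device', 'effect', the 'gLight' parameter name, the lights container and RenderGeometry() stand in for whatever you already have):

device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
device->SetRenderState(D3DRS_BLENDOP,   D3DBLENDOP_ADD);
device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_ONE);   // add each lighting pass...
device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);   // ...onto what is already in the target

for (size_t i = 0; i < lights.size(); ++i)
{
   effect->SetValue("gLight", &lights[i], sizeof(lights[i]));  // per-light shader constants
   RenderGeometry(geometry);                                   // same geometry, one light per pass
}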

 

There are other methods to do this, but this is the simplest way to start out. As you say, you will usually have multiple shaders that "vary with number of lights, supported shader model etc.". It is rare for these kinds of shaders to require information from a previous shader's output (outside of the vertex shader -> pixel shader pipeline), as they are usually self-contained rendering instructions.

 

Times where you may need previously rendered output as new input are normally for more advanced effects such as refraction/reflection and other screen-based effects (there are many reasons, but they are all a bit more advanced than simple lighting).

 

n!


Edited by nfactorial, 26 February 2013 - 02:54 PM.


#6 cozzie   Members   -  Reputation: 1446


Posted 27 February 2013 - 01:54 PM

Thanks.
I'll give a bit more detail on what I'm trying to achieve:

- I've made one effect/shader with X directional and Y point lights, where X and Y depend on the given scene and the supported shader model (2 or 3). I have around 20 variants of this effect, and I use the applicable one to render all meshes with the given light sources.
- I want to create the possibility of another effect which I apply to specific materials or mesh instances, and I wondered if there's a way to do that without having all the lighting shader code in those specific effects (without losing the lighting).

Up till now I haven't used multi-pass rendering, because I thought that meant doing more passes within one effect (I use D3DX effects).

I'll get into multi-pass rendering theory to find out more about it.
Hope this helps in explaining what I'm trying to do.

#7 nfactorial   Members   -  Reputation: 717


Posted 28 February 2013 - 01:47 AM

There are a couple of approaches; however, from your description it sounds like you should take a look at deferred rendering, which separates the lighting calculation from the geometry rendering. It's a bit different from the normal forward rendering you seem to be doing, but there is a lot of material on the net about it.

 

You can also perform multi-pass, as I mentioned, in a forward renderer.

 

n!



#8 cozzie   Members   -  Reputation: 1446


Posted 28 February 2013 - 02:19 AM

Thanks.

Do you mean with multi-pass rendering:

1 - multiple passes which come from several effects (effect files), or

2 - should I create the final effect file within my code with multiple passes?

(based on the needs)

 

Option 1 might sound easier, but the multi-pass rendering documentation I've looked at so far is mostly about method 2 (multiple passes within one effect).
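
For method 2, as far as I understand it, the code side would look roughly like this (a sketch with placeholder names; DrawMesh() and the technique name are made up):

UINT numPasses = 0;
effect->SetTechnique("MultiPassLighting");   // technique defined in the .fx file
effect->Begin(&numPasses, 0);                // returns how many passes the technique has
for (UINT pass = 0; pass < numPasses; ++pass)
{
   effect->BeginPass(pass);                  // binds the shaders/states of this pass
   DrawMesh();                               // draw the same geometry each pass
   effect->EndPass();
}
effect->End();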



#9 C0lumbo   Crossbones+   -  Reputation: 2119


Posted 28 February 2013 - 04:26 AM

This does make me curious, though, how game studios/developers manage this, hearing that games use 'numerous' shaders. I can understand it, since I now also have about 20 with the same goal, varying only in the number of lights, supported shader model, etc.

 

It's a very big challenge, and there are a few common approaches:

 

1. Shader fragment solution: All the little bits of logic that make up a complex shader are cut up into little slices and concatenated together at runtime into a shader according to the needs of the lighting setup and the material.

2. Uber shader solution: One great big shader is written, with tonnes of #ifdefs littered everywhere, and it's compiled into a shader at runtime according to the needs of the lighting setup and the material. This can also be done with branches instead of #ifdefs; especially if the branches are based on shader constants and you're targeting PC hardware, this is simpler and pretty efficient. (A small sketch of the compile-time-defines variant follows this list.)

3. Deferred lighting: One of the big reasons deferred lighting is so popular is that, as well as being really efficient, it solves the combinatorial shader explosion problem by making your shader count equal to (lighting permutations + material permutations) instead of (lighting permutations * material permutations).
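
As a rough illustration of approach 2 with compile-time defines (the macro names and file name here are made up; D3DXMACRO and D3DXCreateEffectFromFile are the real D3DX pieces):

D3DXMACRO defines[] =
{
    { "NUM_POINT_LIGHTS", "2" },
    { "USE_FOG",          "1" },
    { NULL, NULL }                           // terminator
};

ID3DXEffect* effect = NULL;
ID3DXBuffer* errors = NULL;
D3DXCreateEffectFromFile(device, "uber.fx", defines, NULL, 0, NULL,
                         &effect, &errors);  // one permutation per set of defines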

 

In my personal project I'm currently using uber shaders, but I'm trying to limit the combinatorial explosion problem by using lots of different ones, each relatively focused on a single task, e.g. one 'ubershader' for 2D, one for skinned characters, one for terrain, one for particles. Because I'm specialising them this way, I feel free to make lots of assumptions which reduce the permutations each one has, e.g. skinned characters never have vertex colours and always have normals. I find doing it this way makes it easier to edit and optimise a given shader without worrying about a giant house of cards collapsing, like with a classic uber shader or shader fragment solution.



#10 cozzie   Members   -  Reputation: 1446


Posted 28 February 2013 - 02:08 PM

Hi columbo,

Thanks for the explanation of the different approaches.

I can understand why you're going with your approach. One thing I'm wondering: how do you handle lighting across the different effects for skinned characters, terrain, particles, etc.? Do you use deferred rendering like:

 

- skinned character shader = lighting permutations + skinned character stuff?

- material shader = lighting permutations + material stuff?

etc.

 

If so, I understand why deferred rendering would definitely also bring me something (I assume you 'feed' all the effects the same lighting parameters from your code/engine).



#11 C0lumbo   Crossbones+   -  Reputation: 2119


Posted 01 March 2013 - 03:22 AM

I'm still using forward rendering, largely because I'm targeting mobile GPUs. I'm keeping my permutations down largely by making assumptions in each of my shaders. E.g. my skinning shader currently has versions for a max of 20 or 30 bones, 1, 2, 3 or 4 weights, 0, 1, 2, 3 or 4 lights, and fog on or off, making 80 permutations. The different permutations are created by compiling at load time with different #defines, and I have a class which manages selecting the appropriate shader program and making sure it has all the uniforms set up.
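
To give a flavour of what that selection class does, the permutation lookup is basically packing the options into an index (illustrative names only, not my actual code):

struct SkinShaderKey
{
    int  maxBones;     // 20 or 30
    int  numWeights;   // 1..4
    int  numLights;    // 0..4
    bool fog;
};

int PermutationIndex(const SkinShaderKey& k)
{
    int boneSlot   = (k.maxBones > 20) ? 1 : 0;   // 2 options
    int weightSlot = k.numWeights - 1;            // 4 options
    int lightSlot  = k.numLights;                 // 5 options
    int fogSlot    = k.fog ? 1 : 0;               // 2 options
    return ((boneSlot * 4 + weightSlot) * 5 + lightSlot) * 2 + fogSlot;   // 0..79
}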

 

I assume that all the lights are point lights except the first light which I assume is either directional or point, I assume that my skins will have normals, but not vertex colours, I assume lighting is done per vertex, and that I always want specular lighting. It's the assumptions and limitations that make my game require 100s of shader permutations instead of thousands or millions. Which is fine for my little engine tailored around a couple of projects, but not a great solution for a larger endeavour.

 

In theory, I should avoid repeating lighting code in my skin shader and my terrain shader by having a set of utility functions which they can both call. In practice I haven't got around to that yet, and there's some copy-pasting currently going on.



#12 cozzie   Members   -  Reputation: 1446


Posted 01 March 2013 - 12:09 PM

Sounds like a pragmatic approach that works.

Thanks for the advice/hints. I'll just continue as I am now, and when I hit 'barriers' I'll find the best approach for that specific situation/challenge.





