kiwijus

Volumetric Light Scattering Post Process



I've been attempting to implement the Volumetric Light Scattering technique as described on the GPU Gems website.

I render the scene with the light as a white sphere and the other objects in black.
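
(For reference, a minimal sketch of that occlusion pre-pass, assuming plain colour-only Shader Model 4 pixel shaders; PS_Light and PS_Occluder are just placeholder names. The light sphere is written as pure white and every occluder as pure black into the render target that the shader below samples as Light.)

float4 PS_Light( float4 pos : SV_POSITION ) : SV_TARGET
{
    // Light source geometry: pure white.
    return float4( 1.0f, 1.0f, 1.0f, 1.0f );
}

float4 PS_Occluder( float4 pos : SV_POSITION ) : SV_TARGET
{
    // Occluding scene geometry: pure black.
    return float4( 0.0f, 0.0f, 0.0f, 1.0f );
}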

I then use the following shader:


int NUM_SAMPLES = 60;
int Density = 1;
int Weight = 1;
int Decay = 1;
float4 o = float4( 1.0f, 1.0f, 1.0f, 1.0f );

float4 lightSourcePos = float4( 0.0f, 0.0f, 30.0f, 0.0f );
float4 lightPos = mul( lightSourcePos, World );
lightPos = mul( lightPos, View );
lightPos = mul( lightPos, Projection );
float2 screenLightPos = lightPos.xy / lightPos.w;
screenLightPos = normalize( screenLightPos );
screenLightPos = screenLightPos * 0.5f - 0.5f;
screenLightPos.y = 1.0f - screenLightPos.y;

half2 deltaTexCoord = ( input.t - screenLightPos.xy );
deltaTexCoord *= 1.0f / NUM_SAMPLES * Density;

half3 color = Light.Sample( linearSampler, input.t ).xyz;

half illuminationDecay = 1.0f;

for ( int i = 0; i < NUM_SAMPLES; i++ )
{
    input.t -= deltaTexCoord;
    half3 sample = Light.Sample( linearSampler, input.t ).xyz;
    sample *= illuminationDecay * Weight;
    color += sample;
    illuminationDecay *= Decay;
}

o = float4( color, 1.0f );



The first screenshot shows the sphere with the occluded object and the second screenshot shows the regular colour buffer rendered with the above shader. From the looks of it I'm guessing there's something wrong with either the texture coordinates or the light coordinates, but I can't quite figure it out.

I've searched the forums to see if any existing posts cover this, but none of them have really helped.

Thanks,
Dave

Don't normalize your screenLightPos. Normalization forces the vector to a length of 1 from (0,0), which is not what you want here; in fact you don't want any normalization at all. You do still need some extra handling for when the sun is behind you. Also, the sun shafts will appear in front of certain objects even when you know they should be behind them, so incorporating the depth buffer would be a good solution. And try to calculate the texture coordinates of the sun in the vertex shader.
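
For reference, a minimal sketch of that projection without the normalize, assuming the same World/View/Projection constants and light position as the posted shader; this could live in the vertex shader and be passed down:

float4 lightSourcePos = float4( 0.0f, 0.0f, 30.0f, 1.0f );   // w = 1 so the translation in World applies

float4 clipPos = mul( mul( mul( lightSourcePos, World ), View ), Projection );

// If clipPos.w <= 0 the light is behind the camera, so the shaft pass
// should be skipped or faded out rather than marching towards a bogus position.
float2 ndc = clipPos.xy / clipPos.w;

// Remap from [-1,1] NDC to [0,1] texture space: note the +0.5 (not -0.5)
// and the y flip for Direct3D texture coordinates.
float2 screenLightPos = ndc * float2( 0.5f, -0.5f ) + 0.5f;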

And that's about it ;)

Thanks for the reply!

I'm not sure why I normalized it really! Don't know what I was thinking hah!

So I'm guessing you use the depth buffer to test if the light source is behind the objects or not?
Do the calculations I'm currently using for the texture coordinates look correct? They need to be in the [0,1] range, right?

Thanks,
Dave

Okay, I seem to have fixed it: I was doing my translation and scaling transformations in the wrong order, and my light ended up inside my object!
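
In case it helps anyone else: with the row-vector convention the posted shader uses (mul(vector, matrix)), transforms apply left to right, so a world matrix is normally built scale first, then translation. A tiny sketch with placeholder values:

float4x4 S = float4x4( 2.0f, 0.0f, 0.0f, 0.0f,
                       0.0f, 2.0f, 0.0f, 0.0f,
                       0.0f, 0.0f, 2.0f, 0.0f,
                       0.0f, 0.0f, 0.0f, 1.0f );

float4x4 T = float4x4( 1.0f, 0.0f, 0.0f, 0.0f,
                       0.0f, 1.0f, 0.0f, 0.0f,
                       0.0f, 0.0f, 1.0f, 0.0f,
                       5.0f, 0.0f, 3.0f, 1.0f );   // translation in the last row

float4x4 World = mul( S, T );               // scale, then translate
float4 worldPos = mul( localPos, World );   // same as mul( mul( localPos, S ), T )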

I do still have a slight problem though! When rendering with the normal G-buffer I get a nice godray effect, but when rendering with the black occluded object it just looks like slices! I can't figure out why :(

Thanks,
Dave
