Lemmi

Posted 17 December 2012 - 01:12 PM

*Update* I fixed it. It was not related to anything in my shader; my attenuation is perfectly fine. I had my depth stencil set up all wrong! I found the right solution on page 15 of https://developer.nvidia.com/sites/default/files/akamai/gamedev/docs/6800_Leagues_Deferred_Shading.pdf, in case anyone is having similar problems. Sorry for necroing. Thanks for all the help, you people!
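
In case it saves someone else time: as I understand the technique in the paper, the fix amounts to drawing the light volume's back faces with depth writes off and an inverted depth test, so only pixels where scene geometry actually intersects the volume get lit. A minimal sketch of that state setup, assuming D3D11 (device and context are the usual ID3D11Device and ID3D11DeviceContext; the variable names are made up):

//no depth writes during the light pass, and an inverted depth test for back faces
D3D11_DEPTH_STENCIL_DESC dsDesc = {};
dsDesc.DepthEnable    = TRUE;
dsDesc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ZERO;
dsDesc.DepthFunc      = D3D11_COMPARISON_GREATER_EQUAL;

ID3D11DepthStencilState* lightVolumeDepthState = nullptr;
device->CreateDepthStencilState(&dsDesc, &lightVolumeDepthState);

//cull front faces so only the sphere's back faces are rasterized
D3D11_RASTERIZER_DESC rsDesc = {};
rsDesc.FillMode = D3D11_FILL_SOLID;
rsDesc.CullMode = D3D11_CULL_FRONT;

ID3D11RasterizerState* cullFrontState = nullptr;
device->CreateRasterizerState(&rsDesc, &cullFrontState);

//bind both before drawing each light volume
context->OMSetDepthStencilState(lightVolumeDepthState, 0);
context->RSSetState(cullFrontState);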


Hi, sorry for the vague title. I'm following Catalin Zima's deferred renderer pretty closely, so my setup is very similar. My problem is that my point light's lighting doesn't fall off with attenuation like it should. It's a little hard to explain exactly what's wrong, so I frapsed it: http://youtu.be/1AY2xpmImgc

Upper left is the color map, right of that is the normal map, and furthest to the right is the depth map. The light map is the bottom left one, so that's the one to look at.

Basically, the lights illuminate things that are outside of the light radius. I strongly suspect there's something wrong with the projected texture coordinates. I've double-checked that all the values I send into the shaders actually get assigned, and I've stepped through everything in PIX and it seems fine. When I draw the sphere model that represents the point light, I multiply its translation matrix by a (LightRadius, LightRadius, LightRadius) scaling matrix.
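
To be concrete, the matrix construction is essentially this (a minimal sketch using DirectXMath; lightRadius and lightPos are made-up names for the light's radius and world position):

#include <DirectXMath.h>
using namespace DirectX;

float lightRadius = 5.0f;                       //example values
XMFLOAT3 lightPos = XMFLOAT3(0.0f, 2.0f, 0.0f);

//scale the unit sphere up to the light's radius first, then translate it
//to the light's position; with row-vector math the order is scale * translation
XMMATRIX world = XMMatrixScaling(lightRadius, lightRadius, lightRadius) *
                 XMMatrixTranslation(lightPos.x, lightPos.y, lightPos.z);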

I use an additive blending mode for my lighting phase, and I change the rasterizer state depending on whether the camera is inside the light volume or not. I use a separate render target for my depth; I haven't bothered trying to use my depth stencil buffer as a texture, as I've seen some people do.
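
The blend state for the lighting phase is plain additive blending, so each light accumulates into the light map. A minimal sketch of that state, again assuming D3D11:

D3D11_BLEND_DESC blendDesc = {};
blendDesc.RenderTarget[0].BlendEnable           = TRUE;
blendDesc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
blendDesc.RenderTarget[0].DestBlend             = D3D11_BLEND_ONE;
blendDesc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
blendDesc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
blendDesc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ONE;
blendDesc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
blendDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* additiveBlendState = nullptr;
device->CreateBlendState(&blendDesc, &additiveBlendState);

float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
context->OMSetBlendState(additiveBlendState, blendFactor, 0xFFFFFFFF);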

Here's how the shaders look:

Vertex shader:

cbuffer MatrixVertexBuffer
{
    float4x4 World;
    float4x4 View;
    float4x4 Projection;
};

struct VertexShaderInput
{
    float3 Position : POSITION0;
};

struct VertexShaderOutput
{
    float4 Position : SV_Position;
    float4 LightPosition : TEXCOORD0;
};

VertexShaderOutput LightVertexShader(VertexShaderInput input)
{
    VertexShaderOutput output;
    output.Position = mul(float4(input.Position, 1.0f), World);
    output.Position = mul(output.Position, View);
    output.Position = mul(output.Position, Projection);
    //pass the clip-space position along so the pixel shader can derive screen coordinates
    output.LightPosition = output.Position;
    return output;
}

Pixel shader:

cbuffer LightBufferType
{
    float3 LightColor;
    float3 LightPosition;
    float LightRadius;
    float LightPower;
    float4 CameraPosition;
    float4 Padding;
};

cbuffer PixelMatrixBufferType
{
    float4x4 InvViewProjection;
};

//==Structs==
struct VertexShaderOutput
{
    float4 Position : SV_Position;
    float4 LightPosition : TEXCOORD0;
};

//==Variables==
Texture2D textures[3]; //color, normal, depth
SamplerState pointSampler;

//==Functions==
float2 postProjToScreen(float4 position)
{
    //perspective divide, then remap from [-1,1] NDC to [0,1] texture space, flipping y
    float2 screenPos = position.xy / position.w;
    return 0.5f * (float2(screenPos.x, -screenPos.y) + 1.0f);
}

half4 LightPixelShader(VertexShaderOutput input) : SV_TARGET0
{
    float2 texCoord = postProjToScreen(input.LightPosition);
    float4 baseColor = textures[0].Sample(pointSampler, texCoord);
    if (baseColor.r + baseColor.g + baseColor.b <= 0.0f) //cull early if the pixel is completely black; there's nothing to light here
    {
        return half4(0.0f, 0.0f, 0.0f, 0.0f);
    }
    //get normal data from the normal map
    float4 normalData = textures[1].Sample(pointSampler, texCoord);
    //transform normal back into [-1,1] range
    float3 normal = 2.0f * normalData.xyz - 1.0f;
    //read depth
    float depth = textures[2].Sample(pointSampler, texCoord).r;
    //rebuild the clip-space position: remap texCoord from [0,1] back to [-1,1] NDC, flipping y
    float4 position;
    position.x = texCoord.x * 2.0f - 1.0f;
    position.y = -(texCoord.y * 2.0f - 1.0f);
    position.z = depth;
    position.w = 1.0f;
    //transform to world space
    position = mul(position, InvViewProjection);
    position /= position.w;
    //surface-to-light vector (LightPosition is the cbuffer value, not the interpolated input)
    float3 lightVector = LightPosition - position.xyz;
    //compute attenuation based on distance - linear attenuation
    float distanceToLight = max(length(lightVector), 0.01f); //clamp to avoid dividing by zero below
    float attenuation = saturate(1.0f - distanceToLight / (LightRadius / 2.0f));
    //normalize light vector
    lightVector /= distanceToLight;
    //compute diffuse light
    float NdL = max(0.0f, dot(normal, lightVector));
    float3 diffuseLight = NdL * LightColor.rgb;
    //reflection vector
    float3 reflectionVector = normalize(reflect(-lightVector, normal));
    //surface-to-camera vector
    float3 directionToCamera = normalize(CameraPosition.xyz - position.xyz);
    //compute specular light
    float specularLight = pow(saturate(dot(reflectionVector, directionToCamera)), 128.0f);
    //take attenuation and light intensity into account
    return attenuation * LightPower * half4(diffuseLight.rgb, specularLight);
}

Sorry if it's messy, I'm pretty loose about standards and commenting while experimenting.

Thank you for your time, it's really appreciated.
