MJP

Posted 14 July 2013 - 09:19 PM

Yeah, that's the gist of it: you store ambient diffuse lighting encoded in some basis at sample locations, and interpolate between the samples based on the position of the object sampling the probes as well as the structure of the probes themselves (grid, loose points, etc.). Some common bases used for encoding diffuse lighting:

  • Single value - store a single diffuse color, use it for all normal directions. Makes objects look very flat in ambient lighting, since the ambient diffuse doesn't change with normal direction. There's no way to actually compute diffuse lighting without a surface normal, so typically it will be the average of the computed diffuse in multiple directions.
  • Hemispherical - basically you compute diffuse lighting for a normal pointed straight up, and one for a normal pointed straight down. Then you interpolate between the two values using the Y component of the surface normal used for rendering.
  • Ambient cube - this is what Valve used in HL2. Similar to hemispherical, except diffuse lighting is computed and stored in 6 directions (usually aligned to the world-space axes). The contribution from each of the 6 values is determined by taking the dot product of the surface normal with the corresponding axis direction (see the sketch after this list).
  • Spherical harmonics - very commonly used in modern games. Basically you can store a low-frequency version of any spherical signal by projecting onto the SH basis functions up to a certain order, with the order determining how much detail will be retained as well as the number of coefficients that you need to store. SH has all kinds of useful properties, for instance they're essentially a frequency-domain representation of your signal which means you can perform convolutions with a simple multiplication. Typically this is used to convolve the lighting environment with a cosine lobe, which essentially gives you Lambertian diffuse. However you can also convolve with different kernels, which allows you to use SH with non-Lambertian BRDF's as well. You can even do specular BRDF's, however the low-frequency nature of SH typically limits you to very high roughnesses (low specular power, for Phong/Blinn-Phong BRDF's).
  • Spherical Radial Basis Functions - with these you basically approximate the full lighting environment surrounding a probe point with a set number of lobes (usually Gaussian) oriented at arbitrary directions about a sphere. These can be cool because they can let you potentially capture high-frequency lighting. However they're also difficult because you have to use a non-linear solver to "fit" a set of lobes to a lighting environment. You can also have issues with interpolation, since each probe can potentially have arbitrary lobe directions.
  • Cube Maps - this isn't common, but it's possible to integrate the irradiance for a set of cubemap texels where each texel represents a surface normal of a certain direction. This makes your shader code for evaluating lighting very simple: you just look up into the cubemap based on the surface normal. However it's generally overkill, since something like SH or Ambient Cube can store diffuse lighting with relatively little error while having a very compact representation. Plus you don't have to mess around with binding an array of cube map textures, or sampling from them.
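
Here's a rough sketch of evaluating an ambient cube, assuming the six values are stored in +X/-X/+Y/-Y/+Z/-Z order (that layout is just an assumption for the example). Valve's implementation weights each face by the squared normal component, which is what this does:

// 'ambientCube' holds irradiance for the +X, -X, +Y, -Y, +Z, -Z directions,
// and 'n' is the world-space surface normal
float3 EvalAmbientCube(in float3 ambientCube[6], in float3 n)
{
    float3 nSquared = n * n;    // per-axis blend weights
    float3 result = 0.0f;
    result += nSquared.x * (n.x >= 0.0f ? ambientCube[0] : ambientCube[1]);
    result += nSquared.y * (n.y >= 0.0f ? ambientCube[2] : ambientCube[3]);
    result += nSquared.z * (n.z >= 0.0f ? ambientCube[4] : ambientCube[5]);
    return result;
}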

For all of these (with the exception of SRBF's) you can generate the probes by either ray-tracing and directly projecting onto the basis, or by rasterizing to a cubemap first and then projecting. This can potentially be very quick; in fact you could do it in real time for a limited number of probe locations. SRBF's are trickier, because of the non-linear solve which is typically an iterative process.
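
To make the "project onto the basis" step concrete, here's a minimal sketch for the SH case, assuming each sample is a radiance value with a direction and an integration weight (the texel's solid angle when rasterizing to a cubemap, or 4*pi/N for N uniform ray samples). Summing this over all samples produces the kind of 'lightingSH' array used in the evaluation code further down; the function name is just illustrative:

// Accumulate one radiance sample into 3rd-order SH coefficients.
// 'dir' is the sample direction, 'weight' is the sample's integration weight.
void AccumulateSH9(in float3 radiance, in float3 dir, in float weight,
                   inout float3 lightingSH[9])
{
    // Band 0
    lightingSH[0] += radiance * 0.282095f * weight;

    // Band 1
    lightingSH[1] += radiance * 0.488603f * dir.y * weight;
    lightingSH[2] += radiance * 0.488603f * dir.z * weight;
    lightingSH[3] += radiance * 0.488603f * dir.x * weight;

    // Band 2
    lightingSH[4] += radiance * 1.092548f * dir.x * dir.y * weight;
    lightingSH[5] += radiance * 1.092548f * dir.y * dir.z * weight;
    lightingSH[6] += radiance * 0.315392f * (3.0f * dir.z * dir.z - 1.0f) * weight;
    lightingSH[7] += radiance * 1.092548f * dir.x * dir.z * weight;
    lightingSH[8] += radiance * 0.546274f * (dir.x * dir.x - dir.y * dir.y) * weight;
}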

EDIT: I looked at those links posted, and there are two things I'd like to point out. In that GPU Gems article they evaluate the diffuse from the SH lighting environment by pre-computing the set of "lookup" SH coefficients into a cube map lookup texture, but this is totally unnecessary since you can just directly compute these coefficients in the shader. Something like this should work:

// 'lightingSH' is the lighting environment projected onto SH (3rd order in this case),
// and 'n' is the surface normal
float3 ProjectOntoSH9(in float3 lightingSH[9], in float3 n)
{
    float3 result = 0.0f;
    
    // Cosine kernel
    const float A0 = 1.0f;
    const float A1 = 2.0f / 3.0f;
    const float A2 = 0.25f;

    // Band 0
    result += lightingSH[0] * 0.282095f * A0;

    // Band 1
    result += lightingSH[1] * 0.488603f * n.y * A1;
    result += lightingSH[2] * 0.488603f * n.z * A1;
    result += lightingSH[3] * 0.488603f * n.x * A1;

    // Band 2
    result += lightingSH[4] * 1.092548f * n.x * n.y * A2;
    result += lightingSH[5] * 1.092548f * n.y * n.z * A2;
    result += lightingSH[6] * 0.315392f * (3.0f * n.z * n.z - 1.0f) * A2;
    result += lightingSH[7] * 1.092548f * n.x * n.z * A2;
    result += lightingSH[8] * 0.546274f * (n.x * n.x - n.y * n.y) * A2;

    return result;
}

This brings me to my second point, which is that the WebGL article mentions the lookup texture as a disadvantage of SH, which really isn't valid since you don't need it at all. This makes SH a much more attractive option for storing irradiance, especially if your goal is runtime generation of irradiance maps, since with SH you don't need an expensive convolution step. Instead your projection onto SH is basically a repeated downsampling process, which can be done very quickly. This is especially true if you use compute shaders, since you can use a parallel reduction to perform the integration in shared memory with fewer steps.
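
As a very rough illustration of that reduction idea (not anyone's actual implementation), something along these lines could work: each thread projects one sample onto SH, then the thread group sums its coefficients through shared memory. The resource names and the 64-sample-per-probe setup are assumptions for the sketch:

// 'EnvironmentMap' is the cubemap rendered at the probe location, 'OutputSH' receives
// the 9 float3 SH coefficients. One thread group of 64 samples, purely for illustration.
TextureCube<float4> EnvironmentMap : register(t0);
SamplerState LinearSampler : register(s0);
RWStructuredBuffer<float3> OutputSH : register(u0);

#define NUM_SAMPLES 64
groupshared float3 SharedSH[NUM_SAMPLES][9];

// Spherical Fibonacci mapping: distributes NUM_SAMPLES directions roughly uniformly
float3 SampleDirection(uint i)
{
    float phi = i * 2.399963f;                          // golden angle in radians
    float z = 1.0f - (2.0f * i + 1.0f) / NUM_SAMPLES;   // uniform in z
    float r = sqrt(saturate(1.0f - z * z));
    return float3(r * cos(phi), r * sin(phi), z);
}

[numthreads(NUM_SAMPLES, 1, 1)]
void ProjectOntoSHCS(uint gi : SV_GroupIndex)
{
    float3 dir = SampleDirection(gi);
    float3 radiance = EnvironmentMap.SampleLevel(LinearSampler, dir, 0.0f).rgb;
    float weight = (4.0f * 3.14159265f) / NUM_SAMPLES;  // uniform-sample weight

    // Project this sample onto the 9 SH basis functions (same constants as above)
    float sh[9];
    sh[0] = 0.282095f;
    sh[1] = 0.488603f * dir.y;
    sh[2] = 0.488603f * dir.z;
    sh[3] = 0.488603f * dir.x;
    sh[4] = 1.092548f * dir.x * dir.y;
    sh[5] = 1.092548f * dir.y * dir.z;
    sh[6] = 0.315392f * (3.0f * dir.z * dir.z - 1.0f);
    sh[7] = 1.092548f * dir.x * dir.z;
    sh[8] = 0.546274f * (dir.x * dir.x - dir.y * dir.y);

    for(uint i = 0; i < 9; ++i)
        SharedSH[gi][i] = radiance * sh[i] * weight;

    GroupMemoryBarrierWithGroupSync();

    // Parallel reduction: halve the number of active threads each iteration
    for(uint s = NUM_SAMPLES / 2; s > 0; s >>= 1)
    {
        if(gi < s)
        {
            for(uint j = 0; j < 9; ++j)
                SharedSH[gi][j] += SharedSH[gi + s][j];
        }
        GroupMemoryBarrierWithGroupSync();
    }

    if(gi == 0)
    {
        for(uint k = 0; k < 9; ++k)
            OutputSH[k] = SharedSH[0][k];
    }
}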

 

For an introduction to using SH for this purpose, I would definitely recommend reading Ravi's 2001 paper on the subject. Robin Green's paper is also a good place to start.


