

Member Since 02 Mar 2010
Offline Last Active Yesterday, 06:58 AM

Posts I've Made

In Topic: What is the relationship between area lights and reflections (SSR/local/global)

21 April 2016 - 02:18 AM


Ah, right, I got confused by that "blurry reflection look" and forgot that this video is only about lights.  What a bozo I am!


I've never written SSR/cube reflections before -- it seems like you would have to turn off the lights before your SSR/cube reflection pass so you don't "double up" the reflections of the light, right?  Otherwise you would have one reflection from the analytic/punctual light model and another from your SSR/cube reflection pass.  Or is that not that big of a deal?


For cubemaps you turn off any direct contribution, correct.


Do you really? I always thought you capture it n times to simulate light bounces?

In Topic: What is the relationship between area lights and reflections (SSR/local/global)

19 April 2016 - 09:21 AM

Since area lights and punctual light sources (spot, directional, point) both try to approximate a realtime reflection of a single light source, it's not wrong to think of them as one and the same: what they are trying to achieve is the same thing. However, since "everything reflects" in some way or another, you would have to evaluate every single pixel in your scene as a sort of light source, which is essentially what you are doing in image-based lighting: capture the environment and evaluate the reflected light from/into all directions.

The catch is that this can only be done efficiently in an offline preprocess, so for realtime reflections that leaves us with either planar reflections or screen space techniques.

In Topic: BRDFs in shaders

17 February 2016 - 03:39 PM

E is the irradiance measured on a surface location (in your shading that would be the pixel you shade on your geometry).

E_L is irradiance measured not at a surface location but on a unit plane perpendicular to the light. The L subscript already tells you that it is the irradiance corresponding to the light source.


edit: What you wrote is correct, although it's kind of confusing to think about shading a surface location as measuring on a plane. This plane is imaginary but perpendicular to the normal vector at that location, and that is where the N dot L term comes from: it attenuates the amount of light depending on the angle at which the light hits this plane.


Here are a few quotes from Real-Time Rendering:


- "The emission of a directional light source can be quantified by measuring power through a unit area surface perpendicular to L. This quantity, called irradiance, is equivalent to the sum of energies of the photons passing through the surface in one second".

Note: He's talking about E_L here, the irradiance perpendicular to the light direction L.


- "Although measuring irradiance at a plane perpendicular to L tells us how bright the light is in general, to compute its illumination on a surface, we need to measure irradiance at a plane parallel to that surface..."

Note: He goes on to talk about how the N dot L factor is derived...


On page 103 you can see that the irradiance E is equal to the irradiance E_L times the cosine of the angle between the surface normal N and the light direction L.

E = E_L * cos_theta_i


Looking at the equation in your original post, it now makes sense, because it translates the BRDF into:

f(l,v) = outgoing_radiance / irradiance

i.e. the ratio between outgoing light into a small set of directions (in this case toward our sensor/eye, the vector V) and incoming light at this surface location (or rather onto a plane perpendicular to the surface normal N)


So finally, to translate this into actual HLSL code, a very simple light equation could look like this:

float3 E_L = light_intensity * light_color;
float cos_theta_i = saturate(dot(N, -L)); // negate L because we go from the surface to the light
float3 E = E_L * cos_theta_i;

// We actually output outgoing radiance here, but since this is a very simplified/approximated BRDF
// we can set the two equal, assuming diffuse light is reflected the same in all directions
return E;

which is the Lambertian shading / BRDF :)
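To sanity-check the math above numerically, here is a small sketch in Python rather than HLSL, so it can run on the CPU. The albedo / pi factor is the standard energy-conserving normalization of the Lambert BRDF (hand-waved away in the HLSL snippet above); the function name and tuple vectors are just made up for this example.

```python
import math

def lambert_shade(n, l, e_l, albedo):
    """Outgoing radiance from a Lambertian surface.

    n, l   : unit vectors (surface normal, surface-to-light direction)
    e_l    : irradiance measured perpendicular to the light (E_L)
    albedo : diffuse reflectance in [0, 1]
    """
    cos_theta = max(0.0, sum(a * b for a, b in zip(n, l)))  # saturate(dot(N, L))
    e = e_l * cos_theta          # E = E_L * cos(theta_i)
    brdf = albedo / math.pi      # energy-conserving Lambert BRDF
    return brdf * e              # L_o = f(l, v) * E

# Light hitting the surface head-on: cos(theta) = 1, so L_o = 1/pi
print(lambert_shade((0, 0, 1), (0, 0, 1), 1.0, 1.0))
# Light at 60 degrees: cos(theta) = 0.5, so L_o = 0.5/pi
print(lambert_shade((0, 0, 1), (math.sin(math.radians(60)), 0.0, 0.5), 1.0, 1.0))
```

Note the saturate: a light below the surface plane contributes nothing, exactly like the N dot L clamp in the shader.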

In Topic: BRDFs in shaders

12 February 2016 - 08:28 AM


I'm looking for better (easier) explanations about this topic, rather than the book yoshi_t mentioned. Any idea?


Okay, let's see...


Irradiance is the quantity of light energy measured at a surface location, incoming from all directions (usually denoted by the letter E in the literature).

Now you may be confused and say, "But hey, if irradiance (E) is measured at a single location, what's E_L then, the irradiance measured on a plane perpendicular to L?"

The irradiance perpendicular to L (E_L) is the amount of energy passing through a unit-sized plane (don't get confused by this; the unit plane is just something to make the measuring easier). You can think of it as the amount of energy the light source itself emits: think of a light bulb emitting light with some intensity in a direction. That is your E_L.

Radiance, on the other hand, is basically the same as irradiance (and remember, radiance can be incoming or outgoing energy!) but not from all directions, only from a limited or focused set of them (think of a camera lens focusing light into a small set of directions; that set is the solid angle).

In the equation above, L_o is the outgoing radiance: light reflecting from your surface location into a certain set of directions.


I hope that is somewhat easier to understand... if it's still a little too hard to grasp, here's the short version:


1. Irradiance = light energy at a single location from all incoming directions

    Radiance = light energy at a single location from a small set of directions (a solid angle)

    Solid angle = a small set of directions in 3D space (think of a slice of cake)


2. Irradiance measured on a plane perpendicular to the light direction = light flowing through a unit-sized plane (for measurement's sake), which basically tells you how much energy the light is emitting/transmitting


3. If you've done anything that involves light or texture color, you've almost certainly made use of these equations (even if you didn't know it).

Radiometry is just a way to mathematically or physically define those things.



The problem with radiometry is often that the "basics" are confusing, since they are already simplifications or approximations of more advanced equations.

Maybe try to keep going and see if it starts to make more sense further on.

For example, it made more sense to me later on, when they explain how irradiance is obtained by summing up incoming radiance over all directions.
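That last point can even be checked numerically: for a constant incoming radiance L over the hemisphere, the cosine-weighted integral works out to exactly pi * L. A rough Monte Carlo sketch in Python (the function names are made up; uniform hemisphere sampling is assumed):

```python
import math, random

def irradiance_mc(radiance_fn, samples=200_000, seed=1):
    """Monte Carlo estimate of E = integral over the hemisphere of
    L(omega) * cos(theta) d(omega), using uniform hemisphere sampling."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        # Uniform direction on the upper hemisphere:
        # z uniform in [0, 1) gives uniform area on the sphere (Archimedes)
        cos_theta = rng.random()
        phi = 2.0 * math.pi * rng.random()
        total += radiance_fn(cos_theta, phi) * cos_theta
    # The pdf of uniform hemisphere sampling is 1 / (2*pi)
    return (total / samples) * 2.0 * math.pi

# Constant radiance of 1 from every direction -> E should come out near pi
print(irradiance_mc(lambda cos_t, phi: 1.0))
```

Swap the lambda for any directional radiance distribution and the same loop still estimates the irradiance, which is exactly the "summing incoming radiance over all directions" idea.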

In Topic: How Do I Render A Dynamic Main Menu/Menus Through Orthographic Projections?

07 January 2016 - 04:54 AM

You can set up an orthographic camera so that everything appears at the same depth, then use textured 2D planes (alpha blended or alpha tested) for your GUI elements. For interaction, e.g. clicking on a GUI button, you can use raycasting (check for an intersection between the ray from your mouse position and the button's plane).
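Since an orthographic camera projects along a single axis, the "raycast" against a GUI plane at fixed depth collapses to a 2D point-in-rectangle test. A minimal sketch in Python (the button layout, names, and coordinate convention are all hypothetical):

```python
def ortho_pick(mouse_x, mouse_y, buttons):
    """Under an orthographic camera the mouse ray travels straight along
    the view axis, so picking a GUI plane reduces to a 2D
    point-in-rectangle test. buttons: list of (name, x, y, w, h)."""
    for name, x, y, w, h in buttons:
        if x <= mouse_x <= x + w and y <= mouse_y <= y + h:
            return name
    return None

# Hypothetical menu layout in screen-space pixels (top-left origin assumed)
menu = [("play", 100, 100, 200, 50),
        ("quit", 100, 170, 200, 50)]

print(ortho_pick(150, 120, menu))  # hits the "play" button
print(ortho_pick(10, 10, menu))    # hits nothing -> None
```

For a perspective camera you would need a real ray/plane intersection instead, which is where the raycasting mentioned above comes in.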