# Do spherical harmonics always need a large number of samples?


## Recommended Posts

I'm doing some quick research on Spherical Harmonics to see if I can use them in real time for some visual effects. However, the little friend seems like a festival of equations and math theory, and I'd need at least some grasp of it so I don't sink too much time into making it work. I'm wondering if it's possible to get reasonable spherical harmonics (only a rough sense of light colors, at very low frequency) with a very low sample count (e.g. 27 samples), since some of the material I've found relies on Monte Carlo integration. I'd estimate I'd need to calculate about 300 of these per frame (at worst).
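For a rough sense of the cost, a low-order Monte Carlo projection is only a few multiply-adds per sample. A minimal sketch, assuming 2nd-order (9-coefficient) real SH; the `radiance` callback is a placeholder for whatever function is being sampled:

```python
import math
import random

def sh_basis_order2(x, y, z):
    """The 9 real spherical-harmonic basis functions up to degree 2,
    evaluated at unit direction (x, y, z). Constants are the standard
    real SH normalization factors."""
    return [
        0.282095,                    # Y_0^0
        0.488603 * y,                # Y_1^-1
        0.488603 * z,                # Y_1^0
        0.488603 * x,                # Y_1^1
        1.092548 * x * y,            # Y_2^-2
        1.092548 * y * z,            # Y_2^-1
        0.315392 * (3 * z * z - 1),  # Y_2^0
        1.092548 * x * z,            # Y_2^1
        0.546274 * (x * x - y * y),  # Y_2^2
    ]

def project_sh(radiance, n_samples=27, seed=1):
    """Monte Carlo projection: coeff_i = (4*pi / N) * sum radiance(d) * Y_i(d)
    over N uniformly distributed directions d. With only 27 samples the
    higher-band coefficients carry noticeable noise."""
    rng = random.Random(seed)
    coeffs = [0.0] * 9
    for _ in range(n_samples):
        # Uniformly distributed direction on the unit sphere.
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = r * math.cos(phi), r * math.sin(phi)
        value = radiance(x, y, z)
        for i, b in enumerate(sh_basis_order2(x, y, z)):
            coeffs[i] += value * b
    scale = 4.0 * math.pi / n_samples
    return [c * scale for c in coeffs]

# Sanity check: a constant unit radiance projects exactly to
# coeffs[0] = 0.282095 * 4*pi, regardless of sample count.
coeffs = project_sh(lambda x, y, z: 1.0, n_samples=27)
```

At roughly 27 samples times 9 basis evaluations per probe, 300 probes per frame is on the order of 70k multiply-adds, which is modest even on a CPU; the real question is whether the Monte Carlo noise at that sample count is acceptable.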

Since I'm a "gamedev in my free time", I'd like to hear from someone more experienced whether this amount of processing is feasible. If not, it'd save me some time and I could look for alternatives.

Footnote: I'm weak at GPU programming, so these would mostly be calculated on the CPU.

##### Share on other sites

What exactly are you sampling in this case? Is it a cubemap? A scene full of arbitrary geometry? Light sources? Some of these things are much more expensive to sample than others.

##### Share on other sites

It's only a rough idea, actually.

I'm thinking of setting up a 3D grid of cubic areas (cubes are easy to handle), with each cube holding a specific light value, much like Minecraft's light propagation algorithm. Each cube might also have a color modifier. A spherical harmonic would be set up in each cube, mapping the light intensity and color modifier of the 26 adjacent cubes into a simple map of the surrounding light. I'd then apply this grid during deferred rendering to illuminate the visible geometry.
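To make the 26-neighbour idea concrete, here is a sketch that projects the neighbours onto just the constant and linear SH bands (4 coefficients). The `light_at` callback is a placeholder for however the grid stores per-cube light:

```python
import math
from itertools import product

def neighbor_sh(light_at):
    """Project the light of the 26 neighbouring cells onto linear SH
    (degrees 0 and 1, four coefficients). `light_at(dx, dy, dz)` is a
    placeholder returning the scalar light intensity of that neighbour."""
    coeffs = [0.0, 0.0, 0.0, 0.0]   # Y_0^0, Y_1^-1, Y_1^0, Y_1^1
    for dx, dy, dz in product((-1, 0, 1), repeat=3):
        if dx == dy == dz == 0:
            continue                 # skip the centre cell itself
        inv_len = 1.0 / math.sqrt(dx * dx + dy * dy + dz * dz)
        x, y, z = dx * inv_len, dy * inv_len, dz * inv_len
        L = light_at(dx, dy, dz)
        coeffs[0] += 0.282095 * L
        coeffs[1] += 0.488603 * y * L
        coeffs[2] += 0.488603 * z * L
        coeffs[3] += 0.488603 * x * L
    # Treat the 26 directions as uniform samples over the sphere.
    scale = 4.0 * math.pi / 26.0
    return [c * scale for c in coeffs]

# Uniform light in all 26 neighbours: the directional terms cancel
# by symmetry and only the constant band survives.
c = neighbor_sh(lambda dx, dy, dz: 1.0)
```

Note the 26 directions are not perfectly uniform on the sphere (corners and face centres get equal weight), so this is a biased estimate; for a rough low-frequency ambient term that may well be acceptable.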

Now that I think about it, there seems to be no way to interpolate the SH in deferred rendering...

##### Share on other sites

I researched this and abandoned it after discovering there's too much light leaking and too short a range for realistic rendering, but for something Minecraft-like, such as Minecraft itself, it can work perfectly fine. Especially if you're not doing something that varies much over time. I think Minecraft just does a single-term skylight approximation, yes? If you did something like that, or something ambient-occlusion-like, it could work pretty well.

Edited by Frenetic Pony

##### Share on other sites

The number of samples you need depends on the degree of spherical harmonic you want. But your light is already stored in a regular grid of colors, so you could actually store it in a (sparse) 3D volume texture and sample from it. The texture units would do the interpolation, and you'd avoid a quality loss due to sampling and conversion (e.g. a low-degree SH might result in some 'over-darkening' opposite to bright spots).
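For illustration, the interpolation the texture units would give you for free looks like this on the CPU. A sketch, with `grid` as a placeholder for the light volume (the caller is assumed to clamp coordinates so `i + 1` stays in range):

```python
def trilinear(grid, x, y, z):
    """CPU equivalent of hardware trilinear filtering: interpolate a value
    stored in a 3D grid at fractional position (x, y, z), where
    grid[i][j][k] holds the light value of cell (i, j, k)."""
    i, j, k = int(x), int(y), int(z)
    fx, fy, fz = x - i, y - j, z - k

    def lerp(a, b, t):
        return a + (b - a) * t

    # Interpolate along x on the four edges of the surrounding cell...
    c00 = lerp(grid[i][j][k],         grid[i + 1][j][k],         fx)
    c10 = lerp(grid[i][j + 1][k],     grid[i + 1][j + 1][k],     fx)
    c01 = lerp(grid[i][j][k + 1],     grid[i + 1][j][k + 1],     fx)
    c11 = lerp(grid[i][j + 1][k + 1], grid[i + 1][j + 1][k + 1], fx)
    # ...then along y, then along z.
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz)
```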

##### Share on other sites

Ahh, that would be (essentially, or very close to) light propagation volumes.

This seems... tricky. It looks like a lot of stuff to learn before trying it out. I know that Minecraft uses a floodfill-like algorithm that works quite fast for torches and other light sources, though I don't know how they interpolate the results.

> But your light is already stored in a regular grid of colors, so you could actually store it in a (sparse) 3D volume texture and sample from it. The texture units would do the interpolation, and you'd avoid a quality loss due to sampling and conversion.

This would mean I'd lose all information about light directions, wouldn't it? What attracted me to SH is that they can be used as "light sources" for diffuse shaders (or so I believe). It would allow me to try effects like a big white flash from a nearby world position, while still looking like an extremely simplified radiosity around simple world geometry.

##### Share on other sites

> Ahh, that would be (essentially, or very close to) light propagation volumes.
>
> This seems... tricky. It looks like a lot of stuff to learn before trying it out. I know that Minecraft uses a floodfill-like algorithm that works quite fast for torches and other light sources, though I don't know how they interpolate the results.
>
> This would mean I'd lose all information about light directions, wouldn't it? What attracted me to SH is that they can be used as "light sources" for diffuse shaders (or so I believe). It would allow me to try effects like a big white flash from a nearby world position, while still looking like an extremely simplified radiosity around simple world geometry.

Like I said, it depends on what you want to do with it. It doesn't sound like you want bounce light, so you can skip the entire reflective shadow map part. And gathering direct light is fairly simple: calculating a direct analytic light's contribution to a set of samples is as fast as any normal analytic light; you just store it in the SH grid after you're done. It's so fast that UE4 uses it for lighting transparencies, recalculated each frame, and it doesn't sound like you even need to go that far.

A fake bounce light from a torch is equally easy: just add an ambient term with the same falloff as your direct term.
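A sketch of that suggestion; all names and the `ambient_strength` constant are illustrative, not taken from any engine:

```python
import math

def torch_light(light_pos, light_color, point, normal, ambient_strength=0.25):
    """Direct Lambert term plus a fake 'bounce' ambient term that shares
    the same inverse-square distance falloff as the direct term."""
    dx = [l - p for l, p in zip(light_pos, point)]
    dist2 = sum(d * d for d in dx)
    dist = math.sqrt(dist2)
    falloff = 1.0 / max(dist2, 1e-6)
    L = [d / dist for d in dx]                       # direction to the light
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, L)))
    direct = [c * ndotl * falloff for c in light_color]
    # The fake bounce ignores N.L, so surfaces facing away still pick up
    # a dim glow with the same falloff.
    ambient = [c * ambient_strength * falloff for c in light_color]
    return [d + a for d, a in zip(direct, ambient)]
```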

Edited by Frenetic Pony

##### Share on other sites

> But your light is already stored in a regular grid of colors, so you could actually store it in a (sparse) 3D volume texture and sample from it. The texture units would do the interpolation, and you'd avoid a quality loss due to sampling and conversion.

> This would mean I'd lose all information about light directions, wouldn't it?

Not if you save the CPU data in a texture.

In the simplest form, you'd read these 6 colors and calculate a weighted light intensity; check out "Ambient Cube" in http://www.valvesoftware.com/publications/2004/GDC2004_Half-Life2_Shading.pdf
One step further, you'd place your colors as 3 axes (pixel pairs) in the texture, +-x, +-y, +-z, and let the GPU interpolate each axis with 3 fetches.
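The ambient-cube evaluation from that paper boils down to weighting six colours by the squared components of the surface normal. A minimal sketch (the `cube` dict layout is illustrative; the normal is assumed to be unit length, so the three weights sum to one):

```python
def eval_ambient_cube(cube, normal):
    """Evaluate an 'Ambient Cube': six colours (+x, -x, +y, -y, +z, -z)
    weighted by the squared components of the unit surface normal.
    The sign of each component selects which face of that axis to use."""
    nx, ny, nz = normal
    n2 = (nx * nx, ny * ny, nz * nz)
    faces = (
        cube['+x'] if nx >= 0.0 else cube['-x'],
        cube['+y'] if ny >= 0.0 else cube['-y'],
        cube['+z'] if nz >= 0.0 else cube['-z'],
    )
    # Blend the three selected face colours, channel by channel.
    return tuple(
        sum(w * face[ch] for w, face in zip(n2, faces))
        for ch in range(3)
    )

# A normal pointing straight up returns the +z colour unchanged.
c = eval_ambient_cube(
    {'+x': (0, 0, 0), '-x': (0, 0, 0), '+y': (0, 0, 0),
     '-y': (0, 0, 0), '+z': (1.0, 0.5, 0.25), '-z': (0, 0, 0)},
    (0.0, 0.0, 1.0))
```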

##### Share on other sites

> Like I said, it depends on what you want to do with it. It doesn't sound like you want bounce light, so you can skip the entire reflective shadow map part. And gathering direct light is fairly simple: calculating a direct analytic light's contribution to a set of samples is as fast as any normal analytic light; you just store it in the SH grid after you're done. It's so fast that UE4 uses it for lighting transparencies, recalculated each frame.

I think I'll need to study a bit more to understand this, but it seems like a good starting point. It might be better if I try coding a simple SH and see how it feels.

> In the simplest form, you'd read these 6 colors and calculate a weighted light intensity; check out "Ambient Cube" in http://www.valvesoftware.com/publications/2004/GDC2004_Half-Life2_Shading.pdf One step further, you'd place your colors as 3 axes (pixel pairs) in the texture, +-x, +-y, +-z, and let the GPU interpolate each axis with 3 fetches.

That's an interesting idea! It's also something I should try in a simple version to see how it feels.

Thanks for the comments! I'll need some time to test a bit and let the ideas sink in, but I'll be sure to try them!
