So you probably thought my ideas had been swept under the carpet since I posted my last blog entry almost four months ago. Well, not at all! I'm actually making good progress on my latest renderer, which you can already check out on my GitHub page. There aren't many features and it's not optimized yet, but I've been trying to get the underlying architecture right before succumbing to feature creep, just so I can last as long as possible before the whole thing inevitably collapses under its own weight. There are already some juicy renders to check out, though, and it's certainly a colossal improvement (engineering-wise) over my previous renderer.

But this isn't what this entry is about. I recently came across a nice paper, which you can download here for free, explaining how to render physically based lens flares in real time on conventional hardware. One section of the paper caught my eye, and I'll share the details with you in this entry. The concept is simple. You have probably noticed that when you look at bright objects, they tend to display intricate patterns of alternating light and darkness, called lens flares:

There are two features in this image: the bright star-like object, which I will refer to as the "starburst", and the round thingies, which I call "ghosts" (following the paper's terminology). The ghosts didn't interest me, so we will just ignore them. But look at the starburst: does its shape evoke anything? No? Here's a little hint:

That's right, it's a Fourier power spectrum. But why? The answer lies in the concept of *far-field diffraction* (also called *Fraunhofer diffraction*): when a beam of parallel waves is partially blocked by an aperture, the waves diffract, and every point across the aperture becomes a wave emitter. This is a special case of the more general diffraction phenomenon:

And since light is a wave, this applies to our starburst. So what happens is that all those diffracted waves now interfere, that is, they add constructively and destructively. And it turns out that this is just what the Fourier transform describes, and therefore our observer (behind the aperture) doesn't see the parallel waves, but instead sees the Fourier transform of the aperture! And this is pretty cool, because we know an algorithm to efficiently compute Fourier transforms: the FFT.
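Just to make the idea concrete, here is a minimal sketch of it in NumPy: build a binary aperture mask and take the magnitude squared of its 2D FFT. The hexagonal six-blade iris below is a made-up example aperture of mine, not one from the paper.

```python
import numpy as np

# Coordinate grid in [-1, 1] x [-1, 1].
N = 512
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
theta = np.arctan2(y, x)
r = np.hypot(x, y)

# Radius of a regular hexagon (six iris blades) as a function of angle.
blades = 6
poly_r = np.cos(np.pi / blades) / np.cos((theta % (2 * np.pi / blades)) - np.pi / blades)
aperture = (r < 0.6 * poly_r).astype(float)  # 1 = open, 0 = blocked

# Far-field (Fraunhofer) pattern: power spectrum of the centered 2D FFT.
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
starburst = np.abs(field) ** 2
starburst /= starburst.max()  # normalize to [0, 1] for display
```

Displaying `starburst` already gives the characteristic star shape, with spikes perpendicular to the blade edges.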

Now you know that each wavelength of light corresponds to a different color, and so we don't just have one Fourier transform, we have infinitely many, one for each wavelength. Fortunately, it turns out that two such transforms are just scaled copies of each other: the Fourier transform for, say, 400nm (blue) is equivalent to the Fourier transform for 550nm (green), just at a different size. So we only need to calculate the transform once, and then superpose scaled copies of this transform over the visible spectrum. The result is called the "chromatic Fourier transform", and guess what? It looks just like our starburst, and all we need to do is slap it on a point light or area light, and... that's it.
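That superposition can be sketched as follows. The function name, the three-sample spectrum, and the crude wavelength-to-RGB weights are my own placeholders (a real implementation would integrate many wavelengths against proper CIE color-matching functions):

```python
import numpy as np

def chromatic_starburst(mono, wavelengths, rgb_weights, ref_wavelength=550.0):
    """Superpose scaled copies of one monochromatic power spectrum.

    `mono` is the (n x n) pattern computed at `ref_wavelength` (in nm); the
    pattern for wavelength lam is the same image magnified by lam / ref,
    since diffraction angles grow with wavelength.
    """
    n = mono.shape[0]
    out = np.zeros((n, n, 3))
    center = (n - 1) / 2.0
    yy, xx = np.mgrid[0:n, 0:n]
    for lam, w in zip(wavelengths, rgb_weights):
        s = ref_wavelength / lam  # sample compressed coords -> magnified output
        sy = np.clip(np.rint((yy - center) * s + center).astype(int), 0, n - 1)
        sx = np.clip(np.rint((xx - center) * s + center).astype(int), 0, n - 1)
        out += mono[sy, sx][..., None] * np.asarray(w, dtype=float)
    return out / out.max()

# Crude usage example with three wavelengths mapped straight to B, G, R:
# rgb = chromatic_starburst(mono, [450.0, 550.0, 650.0],
#                           [(0, 0, 1), (0, 1, 0), (1, 0, 0)])
```

Nearest-neighbor sampling keeps the sketch short; bilinear interpolation would look smoother.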

There are a few extra things to consider, but I've written up a little prototype. Here are a couple of computer-generated raw starbursts. (The starbursts you would see in practice are less colorful, because they necessarily exclude all wavelengths not emitted by the light source, and so often appear just yellow; what you see below is the "baseline" starburst, covering all visible wavelengths:)

There is some visible ringing in the center, which is probably due to my hasty implementation and poor choice of parameters, but this is only a proof of concept. Here is a monochromatic one:

As you can see, there is no ringing anymore, so it's definitely coming from the way I'm handling colors in my prototype. But it's getting there!

The following is the result of using a circular aperture with a bit of dirt (simulated by some random dots and lines) on a section of the disk:
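Generating such a dirty aperture is trivial; here is a sketch of one way to do it (the speck count, size, and placement are arbitrary choices of mine, and I only punch out dots, not lines):

```python
import numpy as np

rng = np.random.default_rng(0)

# Clean circular aperture: an open disk on a [-1, 1] x [-1, 1] grid.
N = 512
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
aperture = (np.hypot(x, y) < 0.4).astype(float)

# Simulated dirt: small opaque specks scattered over one section of the disk.
for _ in range(300):
    cy, cx = rng.integers(N // 2, 7 * N // 10, size=2)
    aperture[cy - 1:cy + 2, cx - 1:cx + 2] = 0.0  # 3x3 opaque speck

# The starburst is, again, just the power spectrum of the (dirty) aperture.
starburst = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
```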

You might have expected an Airy diffraction pattern. It is actually there, but because the starburst is so bright you can't discern the fringes; a normalized (not tone-mapped) version of this image clearly shows the pattern (in the center):
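To illustrate why the fringes vanish, here is a sketch comparing a linear display against a log-compressed one (the log is my choice of normalization here, not necessarily what my prototype does):

```python
import numpy as np

# Circular aperture: its far-field pattern is the Airy disk.
N = 512
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
aperture = (np.hypot(x, y) < 0.25).astype(float)
power = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))) ** 2

# Linear display: the central peak dominates and the rings are invisible.
linear = power / power.max()

# Log-compressed display: the dynamic range shrinks and the fringes show.
log_view = np.log1p(power)
log_view /= log_view.max()
```

The central peak holds almost all of the energy, so anything linear in intensity crushes the rings to black.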

Of course, the renders above assume the entire lens is fully illuminated, which is not true in general: only specific parts of the lens will be illuminated, which means the starburst will appear quite different depending on what you are looking at (and how you are looking at it).

Just to give you an idea of how efficient this process is, each of the starbursts above was rendered in about 4 seconds, on one CPU core, using a poorly optimized FFT implementation, at a resolution of 1024x1024 pixels. Of course, the starbursts need not be that large in, say, a game; this was for demonstration purposes. On any decent GPU, the FFT can be completed in a matter of milliseconds, so the aperture could be updated every frame (with, for instance, particles of dust, rain, or even blood stains) for improved realism.

If you have ever wanted cool lens flare effects in your game, I'd recommend checking out the paper. It even comes with code examples to generate your own apertures based on optical lens configurations, and while it doesn't go into too much detail, it's definitely worth a look.

Who thinks an animated version of this would make a killer screensaver? I certainly do.

EDIT: after tweaking my prototype, I think I've come a bit closer to "realistic" starbursts, though I'm really uncertain. You can't just tweak code randomly until it works; I need to sit down and slowly work through the theory to get it right. But the results do seem more consistent with reality:
