Max_Payne

Distributed Photon Mapping: I was Right


Well, my renderer is far from done... but I have done some experiments that seem to prove I was right. I made a thread about distributed photon mapping recently, asking whether averaging multiple passes rendered at lower photon densities could give higher image quality with reduced noise, converging towards the correct solution. Well, I programmed it, and here is what I obtained. This is direct visualisation of the photon map; I will eventually improve the system by using the photon map only for indirect illumination.
After 1 pass:
After 17 passes:
ApochPiQ said that by averaging, the image would end up losing its features and diverging. However, as you can see, that isn't the case. In fact, some image features become clearer (there is a noticeable shadow over the sphere after 17 passes).
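For anyone curious how the averaging step itself can be implemented, here is a minimal sketch, assuming each pass writes an independent framebuffer of linear RGB radiance values; the class and names are purely illustrative, not the renderer's actual code:

```cpp
// Minimal sketch of the multi-pass averaging idea (illustrative only).
// Each pass renders the scene against its own low-density photon map;
// the running mean of the framebuffers is the displayed estimate.
#include <cstddef>
#include <vector>

struct Color { float r = 0.f, g = 0.f, b = 0.f; };

class PassAverager {
public:
    explicit PassAverager(std::size_t pixelCount) : sum_(pixelCount) {}

    // Accumulate one finished pass (one framebuffer, same resolution as sum_).
    void addPass(const std::vector<Color>& pass) {
        for (std::size_t i = 0; i < sum_.size(); ++i) {
            sum_[i].r += pass[i].r;
            sum_[i].g += pass[i].g;
            sum_[i].b += pass[i].b;
        }
        ++passes_;
    }

    // Current estimate: the arithmetic mean of all passes accumulated so far.
    std::vector<Color> average() const {
        std::vector<Color> out(sum_.size());
        const float inv = passes_ ? 1.0f / static_cast<float>(passes_) : 0.f;
        for (std::size_t i = 0; i < sum_.size(); ++i) {
            out[i].r = sum_[i].r * inv;
            out[i].g = sum_[i].g * inv;
            out[i].b = sum_[i].b * inv;
        }
        return out;
    }

private:
    std::vector<Color> sum_;
    std::size_t passes_ = 0;
};
```

Only the running sum and a pass count need to be kept, so the partial result can be shown, or another node's finished framebuffer folded in, after any number of passes.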

It looks quite nice but you didn't say how long each pass took, and what it would look like doing a single pass with 17X as many photons...

Quote:
Original post by d000hg
It looks quite nice but you didn't say how long each pass took


Too long. About 30 minutes per pass. The quality is still unsatisfying, since it's a direct visualisation of the photon map.

Quote:
and what it would look like doing a single pass with 17X as many photons...


Each pass is rendered using a little over 3 million photons. It takes 40 MB of RAM per million photons, times two, since I need twice that to build the kd-tree (photon heap), so 80 MB per million photons. The thing is, I have 2 GB of RAM, and that's not enough for 17 times as many photons (I would need over 4 GB). That's the whole point of this multi-pass approach.
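(For reference, those figures work out to: 3 million photons × 80 MB per million ≈ 240 MB for one pass, versus 17 × 3 million × 80 MB per million ≈ 4,080 MB, just over 4 GB, for a single high-density pass.)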

Wow, looks really good! Out of curiosity, is it plausible to do the distributed rendering over the internet?

(note: I have about zero experience with photon mapping/raytracers)

You'll remember that I said it will look like it works in certain cases. This very simple case is one of them. Try it with a high-frequency color bleeding pattern like the checkered-wall variant of the Cornell box, or with a high-detail caustic. Also, run a side-by-side comparison with a high-photon render. They'll be different.

Quote:
Original post by Cypher19
Wow, looks really good! Out of curiosity, is it plausible to do the distributed rendering over the internet?

(note: I have about zero experience with photon mapping/raytracers)


The whole point is to do it over the internet ;) Each pass can be performed on a different node, which lets the load of the rendering process be distributed.

Quote:
You'll remember that I said it will look like it works in certain cases. This very simple case is one of them. Try it with a high-frequency color bleeding pattern like the checkered-wall variant of the Cornell box, or with a high-detail caustic. Also, run a side-by-side comparison with a high-photon render. They'll be different.


Different? Possibly. The point is, it's converging. If it converges here, then I would say that applying the same averaging technique to any scene will work, as long as each rendering node uses the same rendering process.

Right now my concern is to eliminate those nasty borders at wall/wall and wall/ceiling intersections... Then I will work on separating the direct illumination, and use the photon map for indirect illumination only.
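For context, the usual Jensen-style split (which may or may not be exactly what is planned here) takes only the soft indirect term from the global photon map:

$$L = L_{\text{direct}} + L_{\text{specular}} + L_{\text{caustic}} + L_{\text{indirect}},$$

with direct lighting and specular reflections handled by ray tracing, caustics read from a dedicated caustic map, and only $L_{\text{indirect}}$ estimated from the global map (often through final gathering). That split is why a direct visualisation of the photon map looks far noisier than the final render would.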

The fact that it converges for a Cornell box with one sphere in it has absolutely nothing to do with whether it will converge for other situations.

ApochPiQ is right. This example has a very low-frequency irradiance function. Because it is low frequency, averaging several samplings will give a good approximation to the true function.

But try the examples that ApochPiQ suggested and you will find that your method obliterates all the high-frequency detail.
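A rough way to formalise both points, assuming the passes are independent, identically distributed estimates $\hat{L}_i$ of the radiance at a pixel:

$$\mathbb{E}\!\left[\frac{1}{N}\sum_{i=1}^{N}\hat{L}_i\right] = \mathbb{E}[\hat{L}_1], \qquad \operatorname{Var}\!\left[\frac{1}{N}\sum_{i=1}^{N}\hat{L}_i\right] = \frac{\operatorname{Var}[\hat{L}_1]}{N}.$$

The average does converge, but only to the expected value of a single low-density pass: the noise (variance) falls as $1/N$, while any bias in an individual pass (the blur introduced by the radiance estimate's search radius at low photon density) is unchanged no matter how many passes are averaged, which is exactly where high-frequency detail can be lost.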

Quote:
Original post by Max_Payne
Right now my concern is to eliminate those nasty borders at wall/wall and wall/ceiling intersections... Then I will work on separating the direct illumination, and use the photon map for indirect illumination only.


Those "nasty" borders? They are actually correct. What you are seeing is diffuse interreflection as color from one wall bleeds onto the adjacent wall/ceiling at the edge. You WANT that behavior, if you need convincing, just look at any corner in a real room, and you'll see the real life effect.

Quote:
The whole point is to do it over the internet ;) Each pass can be performed on a different node, which lets the load of the rendering process be distributed.


Sorry, I should have clarified. I know the whole point behind distributed rendering is to do it over an intranet or the internet, but I was asking whether the amount of data that needs to be sent allows for a benefit from doing it over the internet (i.e. you upload a client exe, we all let it run, and then you remotely tell the exe to compute the algorithm for a certain clump of photons).

VERY nice. Your image is really impressive, but even better is the way you're experimenting to try to find better ways to do things.

Like some have already said, your experiments haven't really proved that all cases are convergent, but rather demonstrate that cases exist in which convergence does occur.

But I'm strongly inclined to believe that you're right. Why not try the technique on some of the difficult cases, as ApochPiQ suggests? If it works, then your technique introduces a host of really interesting possibilities.
