Arauna2 path tracer announcement



#1 phantomus   Members   -  Reputation: 593


Posted 12 November 2013 - 09:41 AM

Hi all,

 

I posted a demo of the new Arauna2 renderer on ompf2.com:

http://ompf2.com/viewtopic.php?f=5&t=1887

 

A YouTube video of the demo can be found here:

 

A higher-quality version of the video can be downloaded as well (warning: large, 850 MB):

https://mega.co.nz/#!pgJQUJ4Z!XaJIW0B_BMha39FIk8TsOk2CV2RTw5wRPEpDbrVeNBk

 

And finally, the demo itself, which requires a decent CUDA device and 64-bit Windows:

https://mega.co.nz/#!UphmTTxb!JAfhjCfDa3-zDo3Zy5gD9EArYxlkJKa10XPqZY8sZRs

 

Arauna2 is an interactive path tracer, which produces images using a random process. As a result, images contain noise, which will fade away as the pixel values approach the 'expected value'. Arauna2 has been optimized to produce decent quality images very quickly, and converges to high quality images in seconds. It is intended for architectural walkthroughs, but something involving games would be even more awesome. As you can see in the demo, we are not there yet, sadly.
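
To make the convergence behaviour concrete, below is a minimal C++ sketch of progressive Monte Carlo accumulation, the general technique an interactive path tracer relies on. It is illustrative only and not Arauna2 code; the names (tracePath, renderLoop, presentFrame) are hypothetical. Each pass adds one random sample per pixel, and the displayed value is the running average, so the noise fades as the pixel values approach their expected value.

    #include <random>
    #include <vector>

    struct Color { float r = 0, g = 0, b = 0; };

    // Hypothetical stand-in for tracing one random path through a pixel;
    // a real tracer would return the radiance carried by that path.
    Color tracePath(int x, int y, std::mt19937& rng)
    {
        std::uniform_real_distribution<float> noise(0.0f, 1.0f);
        return { noise(rng), noise(rng), noise(rng) };
    }

    // Progressive refinement: keep a running sum of samples per pixel and show
    // the average; variance of the average drops as more samples accumulate.
    void renderLoop(int width, int height, int maxSamples)
    {
        std::mt19937 rng(1234);
        std::vector<Color> accum(width * height);
        for (int n = 1; n <= maxSamples; ++n)
        {
            for (int y = 0; y < height; ++y)
                for (int x = 0; x < width; ++x)
                {
                    Color s = tracePath(x, y, rng);
                    Color& a = accum[y * width + x];
                    a.r += s.r; a.g += s.g; a.b += s.b;
                    // the value shown for this pixel is the running average:
                    // { a.r / n, a.g / n, a.b / n }
                }
            // presentFrame(accum, n);  // hypothetical: display the averaged image
        }
    }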

 

I believe Arauna2 is currently the fastest path tracer, but that of course depends strongly on requirements. The main limitation is the fixed shading path, which supports the full Phong model, specular reflection, dielectrics, textures, normal mapping and emissive surfaces, as well as every sensible combination of those. Illumination is provided by light-emitting surfaces and meshes, point lights, spot lights (including IES profile support) and directional lights.
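
As a rough picture of what a fixed shading path means in practice, the sketch below shows one hard-coded material record and light taxonomy covering the features listed above, rather than arbitrary programmable shaders. These declarations are assumptions for illustration only, not Arauna2's actual data layout.

    // Illustrative only: a single fixed material model that every surface uses,
    // combining the Phong/specular/dielectric/emissive features mentioned above.
    struct Material
    {
        float diffuse[3];       // Phong diffuse colour (possibly modulated by a texture)
        float specular[3];      // specular reflectance
        float shininess;        // Phong exponent
        float refractionIndex;  // e.g. 1.5 for glass; used for dielectrics
        float emissive[3];      // non-zero turns the surface into a light emitter
        int   diffuseTexture;   // -1 = no texture
        int   normalMap;        // -1 = no normal map
    };

    // Illustrative light taxonomy matching the source types listed above.
    enum class LightType
    {
        EmissiveSurface,   // light-emitting triangles and meshes
        Point,
        Spot,              // optionally driven by an IES profile
        Directional
    };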

 

- Jacco.




#2 fir   Members   -  Reputation: 251


Posted 12 November 2013 - 11:03 AM

In recent weeks I have found myself interested in ray tracing, and about a week ago I happened to learn about the Arauna realtime path tracer (though I cannot run it on my terribly old machine).

I am a fan of it - great work. Besides, I hope that home computers will be 2, 5, or 10 times faster in the coming years,

though the 10x factor will sadly, I think, take many years  :CC



#3 gboxentertainment   Members   -  Reputation: 753


Posted 13 November 2013 - 04:00 AM

Have you implemented, or do you plan to implement, any type of noise filtering? E.g. random parameter filtering looks interesting: http://www.youtube.com/watch?v=Ee51bkOlbMw

However, Sam Lapere, who is working on the Brigade engine, said (I'm pretty sure) that RPF doesn't really provide good results, but I would like to see some proof of that.

I think you worked on the Brigade engine as well, didn't you?


Edited by gboxentertainment, 13 November 2013 - 04:04 AM.


#4 phantomus   Members   -  Reputation: 593


Posted 13 November 2013 - 04:35 AM

Have you implemented, or do you plan to implement, any type of noise filtering?

 

Filtering is being researched at the moment (not by me, though); it is far from trivial in this context. Normally you would do something in screen space, but a path tracer typically doesn't have a single depth per pixel (which is needed to find geometry edges), so that is of limited use. The best solutions so far filter in 'path space' (considering the full set of paths arriving at the camera), but this is compute- and memory-intensive.
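
For context, the sketch below shows the kind of screen-space, edge-aware filter that relies on exactly one depth value per pixel to preserve geometry edges. It is a generic cross-bilateral blur written for illustration; it is not the path-space filtering being researched, and all names and parameters are hypothetical.

    #include <cmath>
    #include <vector>

    // Illustrative screen-space, edge-aware (cross-bilateral) blur: neighbours are
    // down-weighted when their depth differs from the centre pixel, which is how
    // geometry edges are preserved. It assumes exactly one depth per pixel, which
    // is what a path tracer (depth of field, many bounces) does not cleanly provide.
    float filterPixel(const std::vector<float>& radiance, // noisy value per pixel
                      const std::vector<float>& depth,    // one depth per pixel
                      int w, int h, int x, int y,
                      int radius = 2, float sigmaDepth = 0.05f)
    {
        float sum = 0.0f, weightSum = 0.0f;
        for (int dy = -radius; dy <= radius; ++dy)
            for (int dx = -radius; dx <= radius; ++dx)
            {
                int nx = x + dx, ny = y + dy;
                if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                float dz = depth[ny * w + nx] - depth[y * w + x];
                float wgt = std::exp(-(dz * dz) / (2.0f * sigmaDepth * sigmaDepth));
                sum += wgt * radiance[ny * w + nx];
                weightSum += wgt;
            }
        return weightSum > 0.0f ? sum / weightSum : radiance[y * w + x];
    }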

 

I would expect filtering to bring a significant improvement to path tracing at some point in time, but right now, it is not yet sufficiently researched.



#5 Bacterius   Crossbones+   -  Reputation: 7024


Posted 13 November 2013 - 05:38 AM

Really nice demo. Wish I could try it out (AMD here). I'm curious, are you planning on porting the code to OpenCL anytime soon or is it too deeply integrated with the CUDA platform? Also, how do the lens effects work, for instance, at the beginning of the video? Are they path-traced as well, with the optical system part of the geometry, or are they a post-processing step?


"The best comment is a deleted comment."


#6 phantomus   Members   -  Reputation: 593


Posted 13 November 2013 - 06:06 AM

Really nice demo. Wish I could try it out (AMD here). I'm curious, are you planning on porting the code to OpenCL anytime soon or is it too deeply integrated with the CUDA platform? Also, how do the lens effects work, for instance, at the beginning of the video? Are they path-traced as well, with the optical system part of the geometry, or are they a post-processing step?

 

OpenCL support is highly desirable, and should be available at some point in time. That, or a generic alternative. A CPU tracer is already available (not in the demo); it produces the same output, but a lot slower (it was not built with CPU performance in mind, but with a focus on maintainability).

 

The post processing (hit F3 in the demo; it's off by default) is just your typical image post processing. The lens flare is pretty accurate; it's based on recent work by Prof. Eisemann of Delft University of Technology, and gives a close approximation of physically correct lens flares.
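
As a rough idea of what "typical image post processing" involves, here is a tiny sketch of an exposure, tone-map and gamma pass applied to the converged HDR value of a pixel. It is illustrative only; it does not reproduce Arauna2's actual post chain or the lens flare technique.

    #include <cmath>

    // Illustrative post-processing of one HDR channel value: exposure scale,
    // Reinhard-style tone mapping to [0,1), then gamma correction for display.
    // Not Arauna2's actual post chain.
    float postProcess(float hdr, float exposure = 1.0f, float gamma = 2.2f)
    {
        float v = hdr * exposure;
        v = v / (1.0f + v);
        return std::pow(v, 1.0f / gamma);
    }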



#7 achild   Crossbones+   -  Reputation: 1489


Posted 13 November 2013 - 09:30 AM

[...] It is intended for architectural walkthroughs, but something involving games would be even more awesome. As you can see in the demo, we are not there yet, sadly. [...]

 

I don't know... some of the greatest games ever were shaped by limitations. One could use the path tracer noise intentionally as an effect. For example: a guy remotely controlling some machine from hundreds of millions of miles away, where the noise you see comes from that link. Or a robot with a somewhat damaged camera connection. The effect looks really cool if you think of it that way...



#8 phantomus   Members   -  Reputation: 593


Posted 14 November 2013 - 04:54 AM

For those with fast GPUs, here's an executable that renders at full 1280x800 res, instead of the upsampled 640x400:

https://mega.co.nz/#!ZwBjWChQ!cgoSi1I-7gLKkBooUHFgmQWClwMocl-qqQ8YeiR2vx8

Note that this requires the original package for the data. Also note that this still uses only one GPU; multi-GPU rendering has been disabled in the demo.

Let me know how this runs on your machine. :)



#9 pabloreda   Members   -  Reputation: 476


Posted 14 November 2013 - 04:20 PM

Hi Phantomus,
Great work!!
Have you seen this?
I wonder if it could improve the image.




