Arauna2 path tracer announcement

Started by
7 comments, last by pabloreda 10 years, 5 months ago

Hi all,

I posted a demo of the new Arauna2 renderer on ompf2.com:

http://ompf2.com/viewtopic.php?f=5&t=1887

A YouTube video of the demo can be found here:

[youtube]http:[/youtube]

A higher quality version of the video can be downloaded as well (warning: large; 850 MB):

https://mega.co.nz/#!pgJQUJ4Z!XaJIW0B_BMha39FIk8TsOk2CV2RTw5wRPEpDbrVeNBk

And finally, the demo itself, which requires a decent CUDA device and 64-bit Windows:

https://mega.co.nz/#!UphmTTxb!JAfhjCfDa3-zDo3Zy5gD9EArYxlkJKa10XPqZY8sZRs

Arauna2 is an interactive path tracer, which produces images using a random process. As a result, images contain noise, which will fade away as the pixel values approach the 'expected value'. Arauna2 has been optimized to produce decent quality images very quickly, and converges to high quality images in seconds. It is intended for architectural walkthroughs, but something involving games would be even more awesome. As you can see in the demo, we are not there yet, sadly.
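The convergence Jacco describes is just Monte Carlo estimation: each pixel averages many random path samples, and the noise fades as the average approaches the expected value. A minimal sketch (not Arauna2 code; the toy `sample` function and the 0.5 expected value are made up for illustration):

```python
import random

def estimate_pixel(sample_radiance, n_samples):
    """Monte Carlo estimate: average n_samples random radiance samples."""
    return sum(sample_radiance() for _ in range(n_samples)) / n_samples

# Toy 'scene': each path sample returns noisy radiance with expected value 0.5.
random.seed(1)
sample = lambda: random.random()  # uniform on [0, 1), E[X] = 0.5

coarse = estimate_pixel(sample, 16)     # fast but noisy
refined = estimate_pixel(sample, 4096)  # much closer to the expected 0.5
```

The error shrinks proportionally to 1/sqrt(N), which is why the first seconds of convergence remove most of the visible noise.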

I believe Arauna2 is currently the fastest path tracer, but that of course depends strongly on requirements. The main limitation is the fixed shading path, which supports the full Phong model, specular reflection, dielectrics, textures, normal mapping and emissive surfaces, as well as every sensible combination of those. Illumination is provided by light-emitting surfaces and meshes, point lights, spotlights (including IES profile support) and directional lights.
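For reference, the classic Phong model mentioned above combines a diffuse term and a specular term per light. This is a textbook sketch, not Arauna2's actual shading code; the coefficients `kd`, `ks` and `shininess` are illustrative:

```python
def phong(n, l, v, kd, ks, shininess):
    """Classic Phong reflection for one light.
    n = surface normal, l = direction to light, v = direction to viewer,
    all unit-length tuples. Returns a scalar intensity."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    ndotl = max(dot(n, l), 0.0)
    # Reflect l about n: r = 2(n·l)n - l
    r = tuple(2.0 * ndotl * nc - lc for nc, lc in zip(n, l))
    rdotv = max(dot(r, v), 0.0)
    return kd * ndotl + ks * rdotv ** shininess

# Light and viewer both straight above a horizontal surface:
n = l = v = (0.0, 0.0, 1.0)
intensity = phong(n, l, v, kd=0.7, ks=0.3, shininess=32)  # 0.7 + 0.3 = 1.0
```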

- Jacco.


In recent weeks I have found myself interested in ray tracing, and about a week ago I happened to learn about the Arauna real-time path tracer (though I cannot run it on my terribly old machine).

I am a fan of it - great work! (Besides, I hope that home computers will be 2x, 5x, 10x faster in successive years,

though the 10x factor, sadly, I think can take many years :CC)

Have you implemented, or do you plan to implement, any type of noise filtering? E.g. random parameter filtering looks interesting:

However, Sam Lapere, who's working on the Brigade engine, said (I'm pretty sure) that RPF doesn't really provide good results, but I would like to see some proof of that.

I think you worked on the Brigade engine as well didn't you?

Have you or will you plan to implement any type of noise filtering?

Filtering is being researched at the moment (not by me though); it is far from trivial in this context. Normally you would do something in screen space, but a path tracer typically doesn't have a single depth per pixel (which is needed to find geometry edges), so that is of limited use. The best solutions so far filter in 'path space' (considering the full set of paths arriving at the camera), but this is compute- and memory-intensive.

I would expect filtering to bring a significant improvement to path tracing at some point in time, but right now, it is not yet sufficiently researched.
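To illustrate the screen-space approach (and why it relies on a single depth per pixel), here is a minimal depth-aware smoothing sketch, not any shipping denoiser: neighbors are weighted down when their depth differs, so blur stops at geometry edges. The `sigma_d` parameter and toy image are made up for illustration:

```python
import numpy as np

def depth_aware_blur(color, depth, radius=2, sigma_d=0.1):
    """Screen-space smoothing that rejects neighbors across depth edges.
    Needs one depth value per pixel - exactly what a path tracer with
    many divergent paths per pixel does not cleanly provide."""
    h, w = color.shape
    out = np.zeros_like(color)
    for y in range(h):
        for x in range(w):
            acc = wsum = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        # Gaussian falloff on depth difference: ~0 across an edge.
                        wgt = np.exp(-((depth[ny, nx] - depth[y, x]) ** 2)
                                     / (2 * sigma_d ** 2))
                        acc += wgt * color[ny, nx]
                        wsum += wgt
            out[y, x] = acc / wsum
    return out
```

With a flat depth buffer this degenerates to an ordinary blur; with a depth discontinuity, the edge stays sharp while noise on either side is averaged away.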

Really nice demo. Wish I could try it out (AMD here). I'm curious, are you planning on porting the code to OpenCL anytime soon or is it too deeply integrated with the CUDA platform? Also, how do the lens effects work, for instance, at the beginning of the video? Are they path-traced as well, with the optical system part of the geometry, or are they a post-processing step?


Really nice demo. Wish I could try it out (AMD here). I'm curious, are you planning on porting the code to OpenCL anytime soon or is it too deeply integrated with the CUDA platform? Also, how do the lens effects work, for instance, at the beginning of the video? Are they path-traced as well, with the optical system part of the geometry, or are they a post-processing step?

OpenCL support is highly desirable, and should be available at some point in time. That, or a generic alternative. A CPU tracer is already available (not in the demo); it produces the same output, but a lot slower (it was not built with CPU performance in mind, but with a focus on maintainability).

The post processing (hit F3 in the demo, it's off by default) is just your typical image post processing. The lens flare is based on recent work by Prof. Eisemann of Delft University of Technology, and gives a close approximation of physically correct lens flares.
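"Typical image post processing" of this kind usually starts with a bright-pass and a blur that is added back onto the frame. This is a generic bloom sketch, not Arauna2's pipeline or the Eisemann lens flare technique; `threshold` and `strength` are illustrative, and the 3-tap box blur (with wrap-around at the borders) stands in for a proper Gaussian:

```python
import numpy as np

def bloom(image, threshold=0.8, strength=0.5):
    """Minimal bloom: isolate bright pixels, blur them, add them back."""
    bright = np.where(image > threshold, image, 0.0)
    # Separable 3-tap box blur along each axis (toy stand-in for a Gaussian;
    # np.roll wraps at the borders, which a real filter would handle).
    blurred = bright.copy()
    for axis in (0, 1):
        blurred = (np.roll(blurred, 1, axis) + blurred
                   + np.roll(blurred, -1, axis)) / 3.0
    return np.clip(image + strength * blurred, 0.0, 1.0)
```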

[...] It is intended for architectural walkthroughs, but something involving games would be even more awesome. As you can see in the demo, we are not there yet, sadly. [...]

I don't know... some of the greatest games ever were shaped by their limitations. One could use the path tracer's noise intentionally. For example: a guy remotely controlling some machine from hundreds of millions of miles away, and the noise you see is a result of that. Or a robot with a somewhat damaged camera connection. The effect looks really cool if you think of it that way...

For those with fast GPUs, here's an executable that renders at full 1280x800 res, instead of the upsampled 640x400:

https://mega.co.nz/#!ZwBjWChQ!cgoSi1I-7gLKkBooUHFgmQWClwMocl-qqQ8YeiR2vx8

Note that this requires the original package for the data. Also note that this still uses only one GPU; multi-GPU rendering has been disabled in the demo.

Let me know how this runs on your machine. :)

Hi Phantomus,
great work!!
Did you see this?
I wonder if it could improve the image.

This topic is closed to new replies.
