Ray tracing at 60 FPS on a tablet


Hi, I was considering this start-up, http://adshir.com/, for investment, and I would like a bit of feedback on what the developer community thinks about the technology.

So far what they have is a demo that runs in real time on a tablet at over 60 FPS; it runs locally on the integrated GPU of an i7. They have a 20,000-triangle dinosaur that looks impressive, better than anything I have seen on a mobile device, with reflections and shadows that look very close to how they would look in the real world. They achieved this thanks to a new algorithm for a rendering technique called path tracing/ray tracing, which is very demanding and so far is used mostly for static images.

From what I have checked, there is no real option for real-time ray tracing (60 FPS on consumer devices). Imagination Technologies was supposed to release a chip that supports real-time ray tracing, but I did not find that they have a product on the market, or even that the technology is finished, as the last demo I found was on a PC. The other one is OTOY with their Brigade engine, which is still not released and, if I understand correctly, is more a cloud solution than a hardware solution.

Would there be sizable interest in the developer community in having such a product as a plug-in for existing game engines? How important is ray tracing to the future of high-end real-time graphics?


If you have the money you can invest, but be warned: what they are showing there isn't that impressive, and some of it is redundant.

I did something similar with some tanks, and all the real-time graphics techniques used by games can be used here; the fact that they are using ray tracing is overkill.

 

The main problem with this AR technology, and the same problem they have, is keeping track of where things are. You will notice that they only show it from one angle, to allow the tablet to track their marked light sources, and they keep the table edge in view to help align the surface.

In short: tablet hardware doesn't allow for 1:1 tracking, and a computer has a hard time working out 3D from a 2D image. There are ways around this, but tablets aren't that powerful yet.

 

Your investment will help them, but don't expect a profit - do it to help them.

Thanks for your feedback. The technology is the ray tracing graphics engine; all the other features are handled by Unity's AR platform.

So ray tracing would just be overkill for gaming today, even if it provides accurate global lighting?

Note that the dinosaur shows just simple lighting, no global illumination. Thus 60 FPS is not impressive but expected.

AFAIK, Brigade is planned to be released for Unity this year (I don't know if for free, but I guess so). OTOY works on multiple path tracing renderers, for both offline and real-time rendering; Brigade is targeting real time.

With recent advances in denoising, path tracing is very close to real time on high-end GPUs; see e.g. this work with GI:

Many people are working on this, actually; many will just implement it on their own when the time is ready. It's hard to tell whether AAA studios will use middleware or do it themselves (I guess the latter).

I'd be inclined to say (in the absence of a more impressive demo) that they are just selling smoke and mirrors. A single reflection and 3 diffuse lights are trivial without ray tracing. Unless they can show off some sort of real-time global illumination, they aren't demonstrating anything new here.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I took a close look at the webpage.

They compare their real-time tracer with an offline renderer and claim a huge speedup. (The offline renderer is made to handle extremely complex scenes AND global illumination; they show neither of those.)

They claim a new solution to the problem of building acceleration structures. But I don't need to rebuild the acceleration structure for that dino - I only need to refit it, which is a fast operation. (It's a common misbelief that building acceleration structures is the main problem of ray tracing; the real problem is random memory access and caching.)
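
To show why refitting is cheap, here is a minimal sketch (my own illustration, not anything from Adshir's page) of refitting a BVH over a deforming mesh: the tree topology stays fixed, and only the bounding boxes are recomputed bottom-up from the updated triangles. The node layout (children stored after their parent) is an assumption of this sketch.

```cpp
// Sketch: refit an existing BVH after the mesh deformed.
#include <algorithm>
#include <vector>
#include <cfloat>

struct AABB {
    float min[3] = {  FLT_MAX,  FLT_MAX,  FLT_MAX };
    float max[3] = { -FLT_MAX, -FLT_MAX, -FLT_MAX };
    void grow(const AABB& b) {
        for (int i = 0; i < 3; ++i) {
            min[i] = std::min(min[i], b.min[i]);
            max[i] = std::max(max[i], b.max[i]);
        }
    }
};

struct Node {
    AABB bounds;
    int left  = -1;    // child indices; -1 marks a leaf
    int right = -1;
    int triangle = -1; // valid only for leaves
};

// Assumes children are stored after their parent in the array,
// so a reverse pass visits children before parents.
void refitBVH(std::vector<Node>& nodes, const std::vector<AABB>& triangleBounds)
{
    for (int i = (int)nodes.size() - 1; i >= 0; --i) {
        Node& n = nodes[i];
        if (n.left < 0) {                 // leaf: take the triangle's new box
            n.bounds = triangleBounds[n.triangle];
        } else {                          // inner node: union of the children
            n.bounds = nodes[n.left].bounds;
            n.bounds.grow(nodes[n.right].bounds);
        }
    }
}
```

A refit like this touches every node once, so it is linear in the node count; only large topology changes really force a full rebuild.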

 

3 hours ago, Yosef BenSadon said:

all the other features are handled by Unity's AR platform.

Thanks for telling me this; the last time I used Unity for AR it was really buggy. It looks much better now.

3 hours ago, Yosef BenSadon said:

So ray tracing would just be overkill for gaming today, even if it provides accurate global lighting?

Yes. With Unreal you can just use the input from the camera as a texture for your floor/wall* and take full advantage of the reflective environment with global lighting. Because AR scenes are often small, like a table top, you can really push the settings.

You don't NEED 1:1 ray tracing; most people won't even notice.

 

And as @swiftcoder said, what they did here could have been done without ray tracing. You could just have flipped the dinosaur upside down and made it a bit transparent; there are a million faster ways to do this.
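
For what it's worth, the "flip it upside down" trick is just an old-school planar reflection. A hypothetical sketch of the mirror matrix involved (the column-vector convention and the helper names are my assumptions, not any engine API):

```cpp
// Sketch: fake a planar reflection by drawing the model a second time,
// mirrored across the ground plane and blended with reduced opacity.
struct Mat4 { float m[4][4]; };

// Reflection across the plane y = planeHeight (column-vector convention).
Mat4 mirrorAcrossGround(float planeHeight)
{
    Mat4 r = {};
    r.m[0][0] = 1.0f;
    r.m[1][1] = -1.0f;                 // flip the Y axis
    r.m[2][2] = 1.0f;
    r.m[3][3] = 1.0f;
    r.m[1][3] = 2.0f * planeHeight;    // keep the plane itself fixed
    return r;
}

// Usage (pseudocode around a generic renderer):
//   drawModel(dino, worldMatrix);                                 // the real dino
//   drawModel(dino, mul(mirrorAcrossGround(0.0f), worldMatrix),   // fake reflection
//             /*opacity=*/0.3f, /*flipWinding=*/true);            // mirroring reverses winding
```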

Ray tracing would be nice for glass-like materials and other complex materials.

 

*Floor/wall - does it even have a name? I don't know, but it looks like this:

[Image: FloorWall.jpg]

The smooth corner prevents sudden changes in the shadows and reflections.

5 hours ago, Yosef BenSadon said:

So ray tracing would just be overkill for gaming today, even if it provides accurate global lighting?

Missing real-time global illumination is the main reason games don't look real, so any technique that solves this is welcome.

Path tracing can produce realistic results, and the simplicity of the algorithm makes it very attractive because it can handle all optical phenomena with one approach. We are just so used to the thought that path tracing is too expensive that we tend to act critical, but the revolution will happen - although probably in small steps and in combination with other promising techniques (that's what I think; others think path tracing will just replace current game graphics pretty soon).

Ray tracing is already used in games, e.g. for accurate shadows in The Division. Voxel cone tracing (The Tomorrow Children) is also a form of ray tracing and gives soft and sharp reflections at limited accuracy, so almost complete GI. (Infinite bounces are missing but theoretically possible.)

Path tracing needs a lot more rays - not just one (or a few) as for a point-light shadow or a perfect mirror reflection. It needs hundreds or thousands to capture the full environment. That's out of reach, but denoising can help: it borrows rays from nearby locations and from cached previous state. Naive example: a 16*16 pixel neighbourhood * 16 stored frames = 4096 rays, so enough. This is why we will change our minds about path tracing. (The downside is again inaccuracy; for instance, sharp reflections get blurry.)
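
As a rough illustration of that idea (a toy sketch, not a production denoiser), the spatial and temporal reuse can be as simple as averaging a 16x16 window of 1-sample-per-pixel results and blending it into a history buffer:

```cpp
// Sketch: naive spatiotemporal reuse of path-traced samples.
#include <vector>
#include <algorithm>

struct Color { float r = 0, g = 0, b = 0; };

Color denoisePixel(const std::vector<Color>& noisy,   // current frame, 1 sample/pixel
                   const std::vector<Color>& history, // accumulated past frames
                   int width, int height, int x, int y)
{
    // Spatial reuse: average a 16x16 window around the pixel.
    Color spatial; int count = 0;
    for (int dy = -8; dy < 8; ++dy)
        for (int dx = -8; dx < 8; ++dx) {
            int sx = std::clamp(x + dx, 0, width  - 1);
            int sy = std::clamp(y + dy, 0, height - 1);
            const Color& c = noisy[sy * width + sx];
            spatial.r += c.r; spatial.g += c.g; spatial.b += c.b;
            ++count;
        }
    spatial.r /= count; spatial.g /= count; spatial.b /= count;

    // Temporal reuse: blend with the history buffer.
    // alpha = 1/16 behaves like keeping roughly 16 frames of samples around.
    const float alpha = 1.0f / 16.0f;
    const Color& h = history[y * width + x];
    return { h.r + alpha * (spatial.r - h.r),
             h.g + alpha * (spatial.g - h.g),
             h.b + alpha * (spatial.b - h.b) };
}
```

A real denoiser would additionally weight neighbours by depth, normals and motion vectors so edges and disoccluded regions are not smeared - which is exactly where the blurry reflections come from.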

 

23 hours ago, JoeJ said:

Path tracing can produce realistic results, and the simplicity of the algorithm makes it very attractive because it can handle all optical phenomena with one approach.

Well, regular path tracing is not so great with caustics. Dispersion also usually requires some additional hacking.

 

23 hours ago, JoeJ said:

Path tracing needs a lot more rays - not just one (or a few) as for a point-light shadow or a perfect mirror reflection. It needs hundreds or thousands to capture the full environment. That's out of reach, but denoising can help: it borrows rays from nearby locations and from cached previous state. Naive example: a 16*16 pixel neighbourhood * 16 stored frames = 4096 rays, so enough. This is why we will change our minds about path tracing. (The downside is again inaccuracy; for instance, sharp reflections get blurry.)

 

While this is correct, the term "denoising" is usually applied to various post-processing effects for noise reduction, like intelligent blurring, etc. These can significantly reduce image noise in a relatively short amount of time and are better suited to the GPU.
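
A minimal sketch of the "intelligent blurring" part: a bilateral-style weight that only averages neighbours with similar depth and normal, so the blur doesn't leak across geometric edges (the G-buffer inputs and the constants here are my own assumptions):

```cpp
// Sketch: edge-aware weight for a post-process blur driven by the G-buffer.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

inline float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Weight for a neighbour sample relative to the centre pixel.
float bilateralWeight(float centerDepth, float sampleDepth,
                      const Vec3& centerNormal, const Vec3& sampleNormal)
{
    // Depth term: falls off quickly when the neighbour sits at a different depth.
    float depthDiff = std::fabs(centerDepth - sampleDepth);
    float wDepth = std::exp(-depthDiff * depthDiff / 0.01f);

    // Normal term: falls off when the surfaces face different directions.
    float wNormal = std::pow(std::max(0.0f, dot(centerNormal, sampleNormal)), 32.0f);

    return wDepth * wNormal;  // multiplied into the spatial blur kernel weight
}
```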

