
Archived

This topic is now archived and is closed to further replies.

Verd

Ray Tracing in the future?


Recommended Posts

In the coming years, say 5 to 10 years, will ray tracing become the prominent rendering method as opposed to polygonal rendering? With CPU speeds taking leaps and bounds, could it be in the next couple of years? Or will hardware based GPU polygonal rendering continue to be the main rendering method? I am doing a speech on Ray Tracing in the next coming weeks, and if it will become wide spread. Im just wondering what the experts think about this. What is the future of ray tracing?
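To make the question concrete: the core operation a ray tracer performs per pixel is an analytic ray-primitive intersection, with no polygon rasterization involved. The following is a generic textbook ray-sphere test, not code from any renderer mentioned in this thread.

```python
# Minimal illustration of what a ray tracer computes for each pixel:
# shoot a ray and solve for where it hits a sphere analytically.
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection distance
    return t if t > 0 else None

# A ray from the origin looking down -z hits a unit sphere 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

A full ray tracer repeats this test against every object (via an acceleration structure), then recurses for shadows and reflections, which is where the CPU cost discussed below comes from.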

yes. it will.




If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

davepermen.net

Speaking as one who is very qualified to comment on such matters, yes, raytracing will be the graphics rendering method of the future - specifically, realtime photon mapping.

There have been a lot of discussions about this topic here on GDnet; some of the better ones can be found here and here. There's also a bit on radiosity here, which includes a sometimes less-than-civil debate on how raytracing is better than radiosity.

Enjoy

Take a look at the RealStorm Raytracer. It's a realtime raytracer demo. Just take a look at the site and try out their demo.

My machine (XP2800+, 1GB DDR333) gets an average of 2.8 fps at a resolution of 640x360. Not quite usable for games or other applications just yet. Even more so if you take into account that most games also need things like physics and AI, which require their share of CPU power.
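Some back-of-the-envelope arithmetic on those numbers, assuming just one primary ray per pixel (no reflection, shadow, or anti-aliasing rays, so this is a lower bound on the required throughput):

```python
# Ray-budget arithmetic for the RealStorm benchmark quoted above,
# assuming one primary ray per pixel.
res_now = 640 * 360           # benchmark resolution
fps_now = 2.8                 # reported frame rate
rays_now = res_now * fps_now  # primary rays per second achieved

res_goal = 1024 * 768         # resolution target discussed in the thread
fps_goal = 100                # frame rate target
rays_goal = res_goal * fps_goal

print(f"current: {rays_now:,.0f} rays/s")              # ~645,120
print(f"goal:    {rays_goal:,.0f} rays/s")             # 78,643,200
print(f"speedup needed: {rays_goal / rays_now:.0f}x")  # ~122x
```

In other words, hitting 1024x768 at over 100 fps would need roughly two orders of magnitude more ray throughput than this machine delivers, before secondary rays are even counted.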

Then take a look at one of nvidia's current demos, like Last Chance Gas. If you don't own a GF-FX, take a look at the video they provide.

My idea is that cards like the Radeon or GeForce will slowly adopt more and more features that only raytracers had a few years before, increasing the visual realism to a level where you can't distinguish between raytracing and realtime rasterized rendering.

Also, why should you let all the processing power included in such cards go to waste and return to CPU-only raytracers? I don't think processor speed will take such a leap that it can run the same demo at 1024x768 @ >100 fps in the near future. And even if it does, GPUs will have developed too, providing visual results close to those of a raytracer and freeing up CPU cycles that can go towards more realistic physics, for example.

Don't get too excited just yet, though. I'm pretty sure the hybrid approach will remain far more usable than plain raytracing for quite a while.

- JQ

quote:
Original post by Wildfire
Also, why should you let all the processing power included in such cards go to waste and return to CPU-only raytracers? I don't think processor speed will take such a leap that it can do the same demo in 1024x768 @ >100 fps in the near future.


*hint* Hardware raytracing */hint*

SaarCOR

I dunno how long it will take, but I'm pretty sure HW raytracers will surpass rasterisers at some point.


[edited by - Eternal on November 17, 2003 4:43:18 AM]

[edited by - Eternal on November 17, 2003 4:44:12 AM]

quote:
Original post by Eternal
*hint* Hardware raytracing */hint*



That's what I was actually trying to get at with the paragraph above that.

quote:

My idea is that cards like the Radeon or GeForce will slowly adopt more and more features only found in raytracers a few years before, increasing the visual realism to a level where you can't distinguish between raytracing and realtime rendering.



Of course, it'll take a while before you see that. Or maybe not. I just doubt that we'll go back to single-CPU solutions as opposed to CPU/GPU solutions very soon. I think neither ATI nor nVidia want to go out of business, and thus they will keep improving their cards, in whatever direction gives better visual results than any software raytracer can manage in realtime at the same framerate and resolution.

//edit: My messing is spelled up. Me tired. %|

// edit2: I forgot to mention: the above benchmark results are with radiosity and depth of field disabled. I never had the patience to run the demo with those, because they slow things down by at least a factor of 10.

[edited by - Wildfire on November 17, 2003 5:09:45 AM]

I'd say about 5 years after the likes of Pixar and PDI stop using rasterization as their primary rendering method seems a reasonable guess (so 5-10 sounds about right).

Whether it will be done on CPUs or GPUs depends on where the market for PCs is going. CPUs designed primarily to run legacy code are lame ducks compared to more parallel designs such as GPUs (even with huge process and R&D advantages). If mainstream CPUs start being designed primarily for parallel processing and for legacy code second, then maybe the GPU's tasks can be reintegrated into the CPU.

Sony is moving their console in that direction with Cell, but PCs are not at the moment.

I won't be surprised if raytracing becomes the way of realtime graphics in the future.

I remember Yu Suzuki wanting to use raytracing to fully realize Shenmue on the Dreamcast. But as he said, at that time this was not possible. Still, I doubt it will replace rasterization in the next 5-10 years.

WildFire: you should get much better results, actually, I think.

WildFire: the nvidia demo is just a plain hack; there is nothing really usable about it. I hope you know that. (Guess why the file is that big..)

PinkyAndThaBrain: rasterization stopped being the only way Pixar and friends render quite a while ago. Ice Age was the first fully raytraced movie.

WildFire/Pinky and all others: RealStorm performs very badly on Pentiums, as it's really not written for them, so we can't see if hyperthreading could help. But if you read about the future of CPUs, you know that AMD's next plans, the K9, are slated for 2-3 years from now. The main difference will ONLY be that those K9s are just like the K8 (Athlon 64, Opteron), but with several of them in one chip - up to 32, they plan. Similar to hyperthreading, but more fully split, so behaving more like 32 real parallel CPUs.

Intel plans the same: hyperthreading gets extended to a real dual CPU in one chip, and then to 4, 8, ... CPUs in a chip. These are all plans for the next few years.

And RealStorm normally performs much better than what WildFire reports, actually; your numbers sound more like a 2000+ system. I'm still waiting for the first 3200+ A64 results.


With those next-gen CPUs, which are planned over the next few years, I think yes, in 5 years raytracing will be very doable on CPUs.
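The reason many-core CPUs map so naturally onto ray tracing is that every pixel's primary ray is independent, so an image splits trivially across cores. A minimal sketch of that idea, where `trace_pixel` is just a stand-in gradient shader rather than a real ray tracer:

```python
# Sketch: per-scanline parallelism, the property the multi-core CPU
# argument above relies on. trace_pixel() is a placeholder, not a
# real renderer.
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 48

def trace_pixel(x, y):
    # Placeholder for "shoot a ray through pixel (x, y) and shade it".
    return (x + y) % 256

def trace_scanline(y):
    # One scanline is an independent unit of work.
    return [trace_pixel(x, y) for x in range(WIDTH)]

if __name__ == "__main__":
    with Pool() as pool:  # one worker process per available core
        image = pool.map(trace_scanline, range(HEIGHT))
    assert len(image) == HEIGHT and len(image[0]) == WIDTH
```

Because there is no shared mutable state between pixels, speedup scales close to linearly with core count, which is exactly why 8- or 32-core designs change the arithmetic for software ray tracing.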





Oh, and I'm very much against helping today's hardware evolve further. The reasoning is simple: GPUs are splitting more and more completely away from being part of your system. Think about it: they have their own memory, their own chips, their own data management, and a very fast LAN-like connection to the system (or merely ADSL), called AGP. This setup is VERY inefficient. Just think how powerful your GPU could be if you could plug it into the second slot of a dual Opteron mainboard: direct memory access to the system, direct talking with the CPU, etc. Full memory bandwidth all the time, and no need to TRANSFER memory at all.

It would perform much better.

That's why I'm more for a multiprocessor system, with the GPU just as another processor, optimized for stream tasks.

There are cases where an IGP beats even today's leading GPUs - namely, when they have to share and co-work with the CPU.

The hardware GPU design is limiting - not only because it rasterizes, but also, of course, partly because it does.

Raytracing is the way to go, and it will not happen on GPUs, just because they are too proprietary anyway.


That's my guess.




If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

davepermen.net

