40 TFLOPS GPGPU in 2017, 128 Gb DRAM, 2 Tb/s bandwidth

7 comments, last by spek 11 years, 3 months ago

What effects could we get out of it (a single GPGPU)?

[Image: NVIDIA GPU roadmap slide through 2017 (nvidia+roadmap+2017.JPG)]


What effects?

http://sydlexia.com/imagesandstuff/snes100/snes17.png

I wouldn't expect too dramatic changes. I remember, around 2003, reading here about raytracers becoming the norm by 2012 and such. Well... unless I missed something last year... But if they can release such a system, there are some obvious improvements of course. Bigger numbers mean bigger, more detailed textures, and make it easier to combine advanced techniques such as realtime GI with other heavyweights. More room for multithreading / compute shaders means even more techniques such as particles, cloth, fluid simulations, other physics, or even certain AI routines moving over to the GPU. And... maybe more raytraced stuff as well, though that would be a very fundamental change that I don't see happening quickly in the games market.

Other than that, it's just guessing. The creativity of the programmers matters most, of course. It's still amazing what kind of smart new tricks people come up with on already existing hardware. So who knows what will happen. Games will get bigger and more immersive, that's for sure.

My worthless 50 cents :)

Rick

Dynamic BVH building (1.5 million polygons, ~350 ms on 1.5 TFLOPS) and ray tracing (Full HD at 30 fps on about 2.5 TFLOPS) become feasible, but that's still not enough for randomly sampled soft shadows.

At 40 TFLOPS: ~13 ms for BVH building (1.5 million polygons), ~16 ms for ray tracing (depth 8), and the rest of the frame for lighting, which gives 30 fps. (A rough check of the scaling is sketched below.)
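
Here is the back-of-the-envelope scaling behind those numbers. It simply assumes cost drops linearly with available FLOPS and ignores memory bandwidth, so treat it as a ballpark check rather than a benchmark (the 350 ms and 16 ms figures are the ones quoted above):

[code]
# Scale the quoted 1.5 TFLOPS BVH timing up to 40 TFLOPS and check the 30 fps budget.
bvh_ms_at_1_5_tflops = 350.0   # quoted: BVH build, 1.5 million polygons, on 1.5 TFLOPS
ray_ms_at_40_tflops = 16.0     # quoted: ray tracing, depth 8, Full HD, on 40 TFLOPS

# Assume the BVH build scales linearly with compute throughput.
bvh_ms_at_40_tflops = bvh_ms_at_1_5_tflops * (1.5 / 40.0)     # ~13.1 ms

frame_budget_ms = 1000.0 / 30.0                               # ~33.3 ms per frame at 30 fps
left_for_lighting_ms = frame_budget_ms - bvh_ms_at_40_tflops - ray_ms_at_40_tflops

print(bvh_ms_at_40_tflops)     # ~13.1 ms, matching the 13 ms above
print(left_for_lighting_ms)    # ~4 ms left for lighting and everything else
[/code]

So on paper it just barely fits into a 30 fps frame, with only a few milliseconds left over for shading.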

And we would have to choose: either ray tracing, or motion compensation (which also costs 350 ms on 1.5 TFLOPS). The latter looks very attractive.

And I don't see any research on heavy GPGPU-driven animation.

What could you do with 40 Tflops?

Well, I suppose all those graphics demos you see accompanied by white papers, the ones that look amazing but only run at 3 fps, would suddenly start showing up in real video games.

From the papers and YouTube videos I've watched, most chip architects think Moore's law has already reached its limits. You can't scale the transistors down any smaller without running into quantum effects and heat issues.

Our current GFLOPS/watt ratio is about 3 according to this article (it might be a little off, since it was written six months ago):

http://www.hpcwire.com/hpcwire/2012-07-26/the_2012_performance_per_watt_wars.html

That machine is specced at 43 GFLOPS/watt at double precision. That's more than 10x the current best. Doable in 5 years? A lot of people are working on the exascale computing problem, and it will take some serious breakthroughs to break that barrier.
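
For what it's worth, the ratio works out like this (using the 43 and 3 GFLOPS/watt figures above; the per-year rate is just my own framing, assuming steady improvement over 5 years):

[code]
# How big a jump in power efficiency the 2017 part would need, per the figures above.
target_gflops_per_watt = 43.0    # roadmap machine, double precision
current_gflops_per_watt = 3.0    # roughly today's best, per the HPCwire article

improvement_needed = target_gflops_per_watt / current_gflops_per_watt
yearly_rate = improvement_needed ** (1.0 / 5.0)   # assume 5 years of steady improvement

print(improvement_needed)   # ~14.3x, i.e. "more than 10x the current best"
print(yearly_rate)          # ~1.70, i.e. efficiency would have to improve ~70% every year
[/code]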

-ddn

I think we are currently seeing over 10 SP GFLOPS per watt from GPUs. I don't think pushing it to 40 by 2017 is that big of a stretch.

People have been saying we will hit the wall soon for decades, but Moore's Law is still going strong after 40 years. This is because CPU/GPU manufacturers have more tricks up their sleeves than just die shrinking.
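
Framed as a growth rate (my own back-of-the-envelope take, using the rough 10 and 40 SP GFLOPS/watt figures above and about four years from 2013 to 2017):

[code]
# Implied yearly efficiency growth to go from ~10 to ~40 SP GFLOPS/watt by 2017.
current_sp_gflops_per_watt = 10.0   # rough current figure
target_sp_gflops_per_watt = 40.0    # what the 2017 part would need
years = 4.0                         # roughly 2013 -> 2017

yearly_growth = (target_sp_gflops_per_watt / current_sp_gflops_per_watt) ** (1.0 / years)
print(yearly_growth)                # ~1.41, i.e. ~41% per year, a doubling every ~2 years
[/code]

Roughly a doubling every two years, which is the kind of pace this post is betting on.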

[quote name='spek' timestamp='1357468094' post='5018126']
Games will get bigger and more immersive, that's for sure.
[/quote]

I'd like to believe that, but it doesn't seem like games are any longer now than 15 years ago. The graphics are definitely shinier, but that doesn't automatically equal more immersion.

[size="2"]Currently working on an open world survival RPG - For info check out my Development blog:[size="2"] ByteWrangler

Postie, that was because of the "wow" effect: after forty thousand years of living on Earth, we were pioneers in a universe, that's what it was. I'm talking about the universe of the real world, the one that lives in our consciousness and connects us with other people; that was the real game, in the real universe. It could happen again if we can put living spirits into games.

@Postie

Maybe not longer in terms of gameplay length (I actually feel many games have become shorter because they're easier, but also because it's an insane amount of work to fill a game to today's detail norms). But if you compare the environments of, let's say, Crysis 2 with its ancestor Far Cry, we have come a long way. Near photo-realism is no longer just a dream, though pouring it all into a single scene might still be too much for today's hardware. That means we need to compromise (smaller worlds, lower-res textures, pre-baked solutions, not 100% realtime lighting, simplified reflections, fewer characters, ...).

Aside from making hardware faster, one may also wonder how the artists can keep up. The days of two programmers drawing their own sprites in the attic are over. Now large budgets are needed to assemble armies of artists to create those immersive environments that fully utilize their engines and hardware capabilities... Maybe nVidia & ATI should focus on making artist robots instead :p

This topic is closed to new replies.
