
Nvidia GI Hardware Support


I saw Nvidia's moon landing GI demo and the corresponding GI demo on the Nvidia YouTube channel.

 

They mention this is coming with their new GPU. Is this a hardware-supported feature? Does anyone know how it will integrate with other software, if that is the case?

Also, in the demo with the sphere, they show the voxel data. I'm assuming the pixel shader performs a raycast into the voxel data to find the closest voxel? I've never worked with voxels before, so I'm not sure how that works. Is the data on the GPU a 3D texture, and does the shader just ray-march from the fragment's world position along the normal until it hits a non-blank texel in the 3D texture, and then use that information?
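For what the question is guessing at, here is a minimal CPU sketch in Python, with a NumPy array standing in for the 3D texture: step a ray at a fixed increment from a start position until a non-empty voxel is found. The grid contents, step size, and step count are illustrative assumptions, not how VXGI actually samples.

```python
import numpy as np

def march_voxels(grid, origin, direction, max_steps=256, step=0.5):
    """Step a ray through a dense 3D occupancy grid until a non-empty
    voxel is hit. Returns the integer voxel coordinate, or None."""
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    for _ in range(max_steps):
        v = np.floor(pos).astype(int)
        if np.all(v >= 0) and np.all(v < grid.shape):
            if grid[tuple(v)]:
                return tuple(v)
        pos += d * step
    return None

# A 64^3 grid with a single "wall" of filled voxels at z = 32.
grid = np.zeros((64, 64, 64), dtype=bool)
grid[:, :, 32] = True

hit = march_voxels(grid, origin=(10.0, 10.0, 0.0), direction=(0.0, 0.0, 1.0))
```

A real shader would use a proper DDA traversal (one step per voxel boundary) rather than a fixed step, but the idea is the same.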

If so, wouldn't that 3D texture be pretty big? For that small scene it looks like about 64x64x64, if it is a 3D texture. Maybe I just don't know enough about voxels. I remember the latest Crysis tech PDF mentioning their GI work. Were they doing the same type of thing?
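The size question can be settled with quick arithmetic; this sketch assumes 4 bytes per voxel (e.g. RGBA8), which is an assumption about the format:

```python
# Back-of-the-envelope memory cost of a dense 3D texture.
def voxel_texture_bytes(dim, bytes_per_voxel=4):
    return dim ** 3 * bytes_per_voxel

small = voxel_texture_bytes(64)    # 64^3  -> 1 MiB: cheap
large = voxel_texture_bytes(512)   # 512^3 -> 512 MiB: why sparse storage matters
```

So a 64^3 grid is harmless, but memory grows cubically with resolution, which is why sparse storage comes up later in the thread.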


Edited by dpadam450


They use a sparse volume texture to store a virtually huge texture with a lot of "blank" space inside. It's a DX12 feature.


  

Actually, they are using Voxel Global Illumination (VXGI), which seems to be very similar to Sparse Voxel Octree Global Illumination (SVOGI), first published by Cyril Crassin of NVIDIA in 2011.

 

From the NVIDIA page on the technology behind VXGI:

 

In 2011, NVIDIA engineers developed and demonstrated an innovative new approach to computing a fast, approximate form of global illumination dynamically in real time on the GPU. This new GI technology uses a voxel grid to store scene and lighting information, and a novel voxel cone tracing process to gather indirect lighting from the voxel grid. NVIDIA’s Cyril Crassin describes the technique in his paper on the topic and a video from GTC 2012 is available here. Epic’s ‘Elemental’ Unreal Engine 4 tech demo from 2012 used a similar technique. Since that time, NVIDIA has been working on the next generation of this technology—VXGI—that combines new software algorithms and special hardware acceleration in the Maxwell architecture.

 

 

VXGI is not a Direct3D 12-specific feature; in fact, SVOGI was first implemented in OpenGL 4.x and Direct3D 11. Though the technology has (mostly) been available for years, NVIDIA's Maxwell graphics architecture is the first to have "built-in" support for it.
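The "voxel cone tracing" step in the quoted NVIDIA description can be sketched on the CPU. The idea: prefilter the voxel grid into mip levels, then, as a cone widens with distance, sample progressively coarser mips instead of touching many fine voxels. Everything here (box-filter prefiltering, the aperture value, the step rule) is a simplifying assumption for illustration, not NVIDIA's actual implementation.

```python
import numpy as np

def build_mips(grid, levels):
    """Prefilter an occupancy grid into averaged mip levels
    (a simple 2x box filter stands in for the real prefiltering)."""
    mips = [grid.astype(float)]
    for _ in range(levels - 1):
        g = mips[-1]
        n = g.shape[0] // 2
        mips.append(g.reshape(n, 2, n, 2, n, 2).mean(axis=(1, 3, 5)))
    return mips

def cone_trace_occlusion(mips, origin, direction, aperture=0.5, max_dist=32.0):
    """Front-to-back accumulation: pick the mip whose texel size
    roughly matches the cone's diameter at each sample point."""
    occ, t = 0.0, 1.0
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    while t < max_dist and occ < 0.99:
        diameter = max(1.0, aperture * t)           # cone widens with distance
        level = min(len(mips) - 1, int(np.log2(diameter)))
        g = mips[level]
        p = (np.asarray(origin) + d * t) / (2 ** level)
        v = np.floor(p).astype(int)
        if np.all(v >= 0) and np.all(v < g.shape[0]):
            occ += (1.0 - occ) * g[tuple(v)]        # front-to-back alpha blend
        t += diameter * 0.5                         # step proportional to footprint
    return occ

# 16^3 scene with a solid wall of voxels at z = 8.
scene = np.zeros((16, 16, 16), dtype=bool)
scene[:, :, 8] = True
mips = build_mips(scene, levels=3)

occ_toward = cone_trace_occlusion(mips, (4.0, 4.0, 0.0), (0.0, 0.0, 1.0))
occ_away = cone_trace_occlusion(mips, (4.0, 4.0, 0.0), (0.0, 0.0, -1.0))
```

A cone aimed at the wall accumulates occlusion while one aimed away gathers nothing, which is the basic mechanism behind gathering indirect light and ambient occlusion from the voxelized scene.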


1)They mention this is coming with their new gpu, is this some hardware supported feature?

2)Does anyone know how it will integrate with other software if that is the case?

2) nVidia has become a middleware vendor now. Your business people get in touch with their business people, you sign NDAs, then a lot of money and/or contracts change hands and then you get access to a little middleware library...

As for (1), my guess would be what vlj said -- that it's restricted to their new GPUs, as these are their first to implement sparse volume textures. Their library will use some vendor hack to make use of this feature in D3D11.

@Josh - yes, SVOGI runs on current hardware, but nVidia's new middleware library (VXGI) apparently doesn't... which is what makes me think they're relying on sparse volume texture support (listed as a new D3D12 feature by Microsoft, but implemented in these new cards via a GL extension) in their algorithm.

@dpadam450 - look up the sparse voxel octree cone tracing global illumination papers to understand the algorithm.
If my guess is correct, VXGI is basically the same, but replaces the octree with a plain 3D grid (or maybe a hierarchy/cascade of grids), using a hardware feature where empty parts of the world consume no memory.
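To illustrate why "empty parts consume no memory" matters, here is a hypothetical sketch of brick-based sparse storage: the grid is a dict of small dense bricks, allocated only where voxels exist. The 8^3 brick size and float32 payload are assumptions; real tiled/sparse resources use fixed hardware page sizes and the allocation is done by the driver, not a hash map.

```python
import numpy as np

BRICK = 8  # voxels per brick edge (illustrative choice)

class SparseVoxelGrid:
    def __init__(self):
        self.bricks = {}  # (bx, by, bz) -> dense 8^3 float32 array

    def set(self, x, y, z, value):
        key = (x // BRICK, y // BRICK, z // BRICK)
        brick = self.bricks.setdefault(key, np.zeros((BRICK,) * 3, dtype=np.float32))
        brick[x % BRICK, y % BRICK, z % BRICK] = value

    def get(self, x, y, z):
        brick = self.bricks.get((x // BRICK, y // BRICK, z // BRICK))
        return 0.0 if brick is None else float(brick[x % BRICK, y % BRICK, z % BRICK])

    def bytes_used(self):
        return len(self.bricks) * BRICK ** 3 * 4  # float32 payload only

grid = SparseVoxelGrid()
grid.set(100, 200, 300, 1.0)  # one occupied voxel in a 512^3 virtual volume
```

One occupied voxel costs a single 2 KiB brick instead of the 512 MiB a dense 512^3 float32 volume would need, which is the same win hardware sparse textures give you without the indirection cost.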


I'm confused. Nvidia supports GL_ARB_sparse_texture with both Kepler and Fermi, so why is this a Maxwell only feature?

Edited by Chris_F


I'm confused. Nvidia supports GL_ARB_sparse_texture with both Kepler and Fermi, so why is this a Maxwell only feature?

They're probably using GL_EXT_sparse_texture2.

 

Who would actually use this middleware though? If VXGI only supports the very latest nVidia GPUs, then you still need to implement your own GI system yourself to support older nVidia cards, and ATI cards, and Intel users... Bit of a poisoned chalice really.

Edited by Hodgman


 

I'm confused. Nvidia supports GL_ARB_sparse_texture with both Kepler and Fermi, so why is this a Maxwell only feature?

They're probably using GL_EXT_sparse_texture2.

 

Who would actually use this middleware though? If VXGI only supports the very latest nVidia GPUs, then you still need to implement your own GI system yourself to support older nVidia cards, and ATI cards, and Intel users... Bit of a poisoned chalice really.

 

 

My understanding is that it does work on other hardware, just at much lower speed (3x slower without the new hardware features).

it's not a maxwell only feature, it runs on fermi hardware already. it was previously called giworks: https://developer.nvidia.com/gi-works

maxwell got optimized hardware for some of the voxel stuff. in their whitepaper nvidia mentions 'conservative rasterization' and 'multi viewport rendering'. both are possible on older hardware using geometry shaders, yet we all know how geometry shaders perform


I'm confused. Nvidia supports GL_ARB_sparse_texture with both Kepler and Fermi, so why is this a Maxwell only feature?

 

The GL_ARB_sparse_texture extension only concerns 2D textures.


 

I'm confused. Nvidia supports GL_ARB_sparse_texture with both Kepler and Fermi, so why is this a Maxwell only feature?

 

The GL_ARB_sparse_texture extension only concerns 2D textures.

 

 

Read the spec. It supports 2D textures, array textures, cube maps, cube map arrays, 3D textures and rectangular textures.
