Quadtree entirely on GPU

A while back, someone commented on one of my planet videos asking whether I was managing the quadtree for LOD on the CPU, or entirely on the GPU via the geometry shader. At the time I didn't give it much thought, because I still needed CPU-side chunk generation.

Now that tessellation shaders are here, I want to revisit this and see whether I can move all of the LOD management and terrain synthesis to the GPU - but I can't find any references to this technique, and I can't quite hit on a feasible implementation.

My initial idea was to maintain a vertex buffer containing just the leaves of the quadtree. Each frame, a geometry shader runs over the buffer, checks each node against the desired LOD, and splits any quad that is too coarse, with the results streamed to a second vertex buffer via transform feedback/stream out. While this handles node splits, it can't handle node merges: by the time four children need combining, there's no information left about which parent node should be re-created.
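For what it's worth, here's roughly what I have in mind for the split pass - just a sketch, assuming each node is packed into a vec4 (centre in xy, half-size in z, depth in w), the terrain sits in the XZ plane, and a crude distance-based split metric; splitFactor, maxLevel and the node encoding are placeholders, not anything final:

    #version 150
    layout(points) in;
    layout(points, max_vertices = 4) out;

    // One quadtree leaf per input point:
    // xy = node centre, z = half-size, w = tree depth.
    in vec4 nodeIn[];

    // Captured into the second vertex buffer via transform feedback.
    out vec4 nodeOut;

    uniform vec3 cameraPos;
    uniform float splitFactor;  // split when size * factor > distance
    uniform float maxLevel;

    void main()
    {
        vec4 node = nodeIn[0];
        // Assumes terrain in the XZ plane; a planet would need a
        // cube-face mapping here instead.
        float dist = distance(cameraPos, vec3(node.x, 0.0, node.y));
        bool wantSplit = (node.z * splitFactor > dist) && (node.w < maxLevel);

        if (wantSplit)
        {
            // Replace this leaf with its four children.
            float h = node.z * 0.5;
            for (int i = 0; i < 4; ++i)
            {
                vec2 off = vec2(((i & 1) == 0) ? -h : h,
                                ((i & 2) == 0) ? -h : h);
                nodeOut = vec4(node.xy + off, h, node.w + 1.0);
                EmitVertex();
                EndPrimitive();
            }
        }
        else
        {
            // Keep the leaf as-is. Note there is no case that re-creates
            // a parent: each invocation sees exactly one node, which is
            // why merges fail with this scheme.
            nodeOut = node;
            EmitVertex();
            EndPrimitive();
        }
    }

On the CPU side this would just ping-pong between two buffers each frame with glBeginTransformFeedback/glEndTransformFeedback, with GL_RASTERIZER_DISCARD enabled since nothing is actually drawn during the update.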

My second thought is to store the entire quadtree in the vertex buffer, run the geometry shader over it to split/delete nodes as needed, and then run a second geometry shader pass that outputs just the leaf nodes for rendering. Does this sound feasible, or am I going to consume far too much GPU power checking and discarding parent nodes every frame?
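The second pass could then be as simple as re-evaluating the same split predicate - again just a sketch under the same assumptions as above. Since the metric is deterministic, any node that survives the update pass in a consistent tree is a leaf exactly when it doesn't want to split further:

    #version 150
    layout(points) in;
    layout(points, max_vertices = 1) out;

    in vec4 nodeIn[];
    out vec4 leafOut;  // captured leaves, later drawn as tessellated patches

    uniform vec3 cameraPos;
    uniform float splitFactor;
    uniform float maxLevel;

    // Same predicate as the update pass.
    bool wantsSplit(vec4 node)
    {
        float dist = distance(cameraPos, vec3(node.x, 0.0, node.y));
        return (node.z * splitFactor > dist) && (node.w < maxLevel);
    }

    void main()
    {
        // Internal nodes are silently dropped - these wasted invocations
        // are the per-frame cost I'm asking about.
        if (!wantsSplit(nodeIn[0]))
        {
            leafOut = nodeIn[0];
            EmitVertex();
            EndPrimitive();
        }
    }

The discarded invocations are what worry me, although a quadtree only has roughly one internal node for every three leaves, so at worst it's about a third more geometry shader work than processing the leaves alone.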
