Rendering alpha and non-alpha objects

Started by
2 comments, last by Muncher 18 years, 3 months ago
To render alpha and non-alpha objects properly I've created two render lists: one of alpha meshes and one of non-alpha meshes. I've also created an alpha material list and a non-alpha material list. In my render loop, I first go through my non-alpha materials, set the texture, and then loop through my non-alpha mesh list and render the meshes. However, this approach won't work with alpha meshes, will it? I would, in fact, have to reverse the process: first go through my alpha meshes (depth sorted) and then loop through my alpha-material list and set the corresponding texture, right? Or is there a better way to go about this?

EDIT: I forgot to ask this one: on what basis do you sort the alpha meshes? I read somewhere that I need to sort them in eye space. What on earth does that mean? I do know that I need to sort them according to the direction my camera is facing, but I'm not entirely sure how to go about doing this...

[Edited by - Specchum on January 12, 2006 6:43:21 AM]
"There is no dark side of the moon." - Pink Floyd
You are essentially correct.

In order to get alpha blended polygons to render properly you need to first render all the opaque meshes (preferably sorted by material type), and then render all the alphaed polygons sorted by depth. The simplest way to do the sorting is probably to transform the position of the centre of each alphaed polygon into camera space, and then sort them by depth (typically the z-coordinate) from furthest (largest z) to nearest (smallest z). Polygons with negative z are behind the camera and don't need to be rendered; similarly, ones with very small z are very close to the camera and may be clipped out.
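As a sketch of that centre-of-polygon sort on the CPU (the `Vec3`/`Mat4` types and the camera-looks-down-+z convention are my illustrative assumptions, not from the post):

```cpp
// Sort alpha polygons back-to-front by camera-space depth.
// Vec3, Mat4, and the +z-forward camera convention are assumptions.
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

// Minimal 4x4 view matrix applied to a point (w assumed 1).
struct Mat4 {
    float m[4][4];
    Vec3 transformPoint(const Vec3& p) const {
        return {
            m[0][0]*p.x + m[0][1]*p.y + m[0][2]*p.z + m[0][3],
            m[1][0]*p.x + m[1][1]*p.y + m[1][2]*p.z + m[1][3],
            m[2][0]*p.x + m[2][1]*p.y + m[2][2]*p.z + m[2][3],
        };
    }
};

struct AlphaPoly {
    Vec3 centre;      // centre of the polygon in world space
    float viewDepth;  // recomputed every frame
};

// Transform each polygon centre into camera space, then sort far-to-near.
// Polygons with viewDepth <= 0 are behind the camera and can be skipped
// at render time.
void sortAlphaPolys(std::vector<AlphaPoly>& polys, const Mat4& view) {
    for (AlphaPoly& p : polys)
        p.viewDepth = view.transformPoint(p.centre).z;
    std::sort(polys.begin(), polys.end(),
              [](const AlphaPoly& a, const AlphaPoly& b) {
                  return a.viewDepth > b.viewDepth; // largest z first
              });
}
```

Note this resorts every frame, since depth order changes as the camera moves.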

Note that ideally all the alphaed polygons should be sorted together, including both those in meshes and those due to particle effects. The sorting should be done on a per-poly basis rather than just sorting the meshes.

Now what this means is that there will potentially be lots of material changes when rendering the alphaed polygons, as they can't be batched by material type. The best way to minimise this is to try to reduce the number of alphaed polygons and to squeeze as many of the alphaed textures as possible onto the same texture page.

In order to reduce the number of alpha blended polygons in the second render list, note that 1-bit alphaed textures can be included in the first render list along with all the opaque textures. This works so long as the alpha test is enabled and alpha blending is disabled for the 1-bit alphas. Since most foliage, metal grilles, railings etc. can use 1-bit alpha, this leaves only glass and particle effects in the second render list.
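One way to decide which list a texture belongs in is to scan its alpha channel. A minimal sketch, assuming 8-bit alpha; the enum and function names are illustrative, not from the post:

```cpp
// Classify a texture by its alpha channel to pick a render list.
// Assumes an 8-bit alpha channel; all names are illustrative.
#include <cstdint>
#include <vector>

enum class RenderList {
    Opaque,      // no transparency: first pass, batched by material
    AlphaTested, // 1-bit alpha: also first pass, alpha test on, blending off
                 // (foliage, metal grilles, railings)
    Blended      // fractional alpha: depth-sorted second pass (glass, etc.)
};

RenderList classifyTexture(const std::vector<std::uint8_t>& alphaChannel) {
    bool anyTransparent = false;
    for (std::uint8_t a : alphaChannel) {
        if (a != 0 && a != 255)
            return RenderList::Blended;   // partial transparency found
        if (a == 0)
            anyTransparent = true;        // fully cut-out texel
    }
    return anyTransparent ? RenderList::AlphaTested : RenderList::Opaque;
}
```

Anything that lands in `AlphaTested` can be drawn with the opaque batch in any order, since the alpha test just discards fully transparent texels and the z-buffer handles the rest.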

Martin
Thanks for the answers Martin. Makes mucho senso! Time to get cracking then!
"There is no dark side of the moon." - Pink Floyd
hi,

i implemented a transparent mesh renderer not that long ago and ran into the dreaded z-sorting issue you've come across :)


One solution that requires no sorting, and is done entirely on the graphics hardware, is depth peeling. Basically, you render the scene multiple times, using the z-buffer to slice the scene into layers for you, then you composite the rendered layers from back to front.
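The idea can be illustrated with a CPU simulation of a single pixel. On hardware each "pass" is a full scene render that uses the previous pass's depth buffer to reject already-peeled fragments; here a fragment list stands in for that, and all names are mine, not from the article:

```cpp
// CPU simulation of depth peeling for one pixel: repeatedly extract the
// nearest fragment beyond the last peeled depth, then composite the
// layers back-to-front. Illustrative only.
#include <limits>
#include <vector>

struct Fragment {
    float depth;       // smaller = nearer the camera
    float r, g, b, a;  // a is opacity in [0, 1]
};

struct Color { float r, g, b; };

Color depthPeel(const std::vector<Fragment>& frags, const Color& background) {
    // Peel layers nearest-first. Each iteration mimics one render pass
    // that only accepts fragments strictly behind the previous peel.
    std::vector<Fragment> layers;
    float lastDepth = -std::numeric_limits<float>::infinity();
    for (;;) {
        const Fragment* next = nullptr;
        for (const Fragment& f : frags)
            if (f.depth > lastDepth && (!next || f.depth < next->depth))
                next = &f;
        if (!next) break;           // nothing left to peel
        layers.push_back(*next);
        lastDepth = next->depth;
    }
    // Composite back-to-front with the standard "over" blend:
    // out = a * src + (1 - a) * dst.
    Color out = background;
    for (auto it = layers.rbegin(); it != layers.rend(); ++it)
        out = { it->a * it->r + (1 - it->a) * out.r,
                it->a * it->g + (1 - it->a) * out.g,
                it->a * it->b + (1 - it->a) * out.b };
    return out;
}
```

The cost is one scene pass per transparency layer, so in practice you cap the number of peels rather than running until nothing is left.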

The article is here: http://developer.nvidia.com/object/Interactive_Order_Transparency.html

