Particle RenderStates

Started by
3 comments, last by Hibbs 18 years, 10 months ago
Hi, I've been writing my first particle engine and am nearing completion. I'm using quad billboards with alpha textures, positioned using a shader program. I had set ZWriteEnable to FALSE while I tested the code. Now that I'm rendering with several emitters, I find that the particles don't blend together in the scene correctly, i.e. the last-rendered emitter's particles always overwrite others, even if they are further from the camera. I set ZWriteEnable back to TRUE, but it seems that the transparent parts of the billboards get written to the Z-buffer and block out other particles behind them. Does anyone know how to overcome this obstacle?
You will have to manually pre-sort the particle emitters each frame and render them in back-to-front order, with Z-buffer writing disabled, after all other geometry has been rendered.

As each particle emitter is an infinitely small point, sorting them simply depends on the distance between the camera position and each emitter's world-space position. This can be optimised if you employ a hierarchical spatial structure such as a quadtree or BSP tree.
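A minimal sketch of that per-frame sort, assuming a hypothetical `Emitter` struct that stores only the world-space position (emitters are treated as points, so comparing squared distances to the camera is enough):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical minimal emitter: only the world-space position matters
// for the sort, since each emitter is treated as a single point.
struct Emitter {
    float x, y, z;
    int   id; // only used to identify emitters in this sketch
};

static float distSq(const Emitter& e, float cx, float cy, float cz) {
    float dx = e.x - cx, dy = e.y - cy, dz = e.z - cz;
    return dx * dx + dy * dy + dz * dz; // squared distance, no sqrt needed
}

// Sort emitters back-to-front: farthest from the camera is drawn first.
void sortBackToFront(std::vector<Emitter>& emitters,
                     float camX, float camY, float camZ) {
    std::sort(emitters.begin(), emitters.end(),
              [=](const Emitter& a, const Emitter& b) {
                  return distSq(a, camX, camY, camZ) >
                         distSq(b, camX, camY, camZ);
              });
}
```

You would then draw each emitter's particles in the sorted order, with Z writes off but Z testing still on, so opaque scene geometry still occludes them correctly.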

- Oscar [smile]
Doesn't that mean that I may have to make multiple draw calls on the same particles if other particles with different textures intersect the same area?
Yes, this is still a problem if more than one emitter overlaps; in that case you need to interleave the two particle systems.

Basically, the only way to render interleaved particles with different textures is to combine all the textures into a texture atlas (a single texture containing a number of smaller textures), for example:

0 ----U coord----> 1
|  [t1][t2][t3][t4]
|  [t5][t6][t7][t8]
v  [t9][..][..][..]
1  [  ][  ][  ][  ]


thus allowing you to use different texturing for different particles whilst still using a single vertex array.
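As a sketch of the UV remapping involved: given a grid atlas like the one above, each particle quad's texture coordinates are scaled and offset into one tile's sub-rectangle. The grid dimensions and top-left UV origin here are assumptions to match the diagram:

```cpp
#include <cassert>
#include <cmath>

// One tile's sub-rectangle within the atlas, in normalised UV space.
struct UVRect { float u0, v0, u1, v1; };

// Map a zero-based tile index into a gridCols x gridRows atlas.
// Assumes tiles are numbered left-to-right, top-to-bottom, with
// (0,0) at the top-left corner, matching the diagram above.
UVRect atlasUV(int tileIndex, int gridCols, int gridRows) {
    int col = tileIndex % gridCols;
    int row = tileIndex / gridCols;
    float tw = 1.0f / gridCols; // tile width in UV space
    float th = 1.0f / gridRows; // tile height in UV space
    return { col * tw, row * th, (col + 1) * tw, (row + 1) * th };
}
```

Each particle just stores its tile index, and the quad's corner UVs are built from the returned rectangle, so particles with different images can share one draw call.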

Also, aside from combining particle systems with different textures, you can combine additive blending and alpha-channel-based blending by using textures with premultiplied alpha.

[additive blending (sparkly glowing things typically)]
[alpha channel blending (things like smoke that are more solid)]

to do the combined blending you'll need to set the blend function to the following:

glBlendFunc( GL_ONE, GL_ONE_MINUS_SRC_ALPHA );

and the textures need to be created as follows:

Say you want a texture of a red circular gradient: solid red at the centre, fading to transparent at the edge. The initial RGB channels would be pure red, and the alpha would be a radial gradient, black at the edge and white at the centre. To make this a premultiplied texture, the RGB values are multiplied by the alpha, meaning that the RGB channels would now represent a red-to-black radial gradient. I believe that Paint Shop Pro and Photoshop both do this if you simply use their default layer transparency.

http://www.td-grafik.de/ext/xfrog/alpha/ might be a good read if you feel this idea would be useful for you.

Then, to render a particle with additive blending, you simply set the quad colour to have an alpha of 0: the destination is left at full strength and a plain additive blend is performed. Leave the alpha at 1 and you get normal alpha ("over") blending instead.

Twitter: [twitter]CaffinePwrdAl[/twitter]

Website: (Closed for maintenance and god knows what else)

I like the idea of combining the textures into a single tiled texture atlas (so to speak), although my emitters can contain several types of particles, each with changing variables (colour, alpha, blend type, rotation, size, texture, velocity, age, etc.), and each of these variables can vary over the particle's life and be randomly generated, so I don't think premultiplied alpha will work for me.

Is it possible to create these atlas textures at runtime as I create the emitters and load the individual textures, or will I have to create them externally?

This topic is closed to new replies.
