Particle Systems & Handling Transparency

Started by
7 comments, last by Madhed 9 years, 4 months ago

Hey everyone,

I've been studying OpenGL/Graphics for a short while now and I'm in the process of working on a simple rendering pipeline for my (basic) game engine. From what I understand (and please correct me if my understanding is not entirely accurate), the high level overview of a simple rendering pipeline might look like this:

- Add all non-transparent objects to a queue, and sort them front to back based on their distance from the camera (as fragments are discarded if they fail the depth test).

- Or instead sort by which shader and textures objects use, to minimize OpenGL state changes. (I'm not entirely sure which sorting method gives the best performance, but that isn't the main issue here.)

- Add transparent objects to a 2nd queue, and sort them back to front based on their distance from the camera (so the proper effects show up on screen as a result of the transparency).

- Cull any objects (as they are added to the queues) that won't be seen anyway, so there aren't any wasted draw calls. (As far as I know, primitives outside of screen space are discarded anyway, but by that point you've already spent processing time on the draw call and the vertex shader, so culling earlier ends up being worth it.)

- Render all of the non-transparent objects in the queue, and then render all of the transparent objects in the other queue.
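The two-queue setup above can be sketched roughly like this (a minimal sketch; `DrawItem`, `sortOpaque`, and `sortTransparent` are hypothetical names, with depth standing in for distance from the camera):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical per-object draw record; "depth" is distance from the camera.
struct DrawItem {
    float depth;
    int   objectId;  // placeholder for shader/texture/mesh handles
};

// Opaque queue: front to back, so hidden fragments fail the depth test early.
inline void sortOpaque(std::vector<DrawItem>& q) {
    std::sort(q.begin(), q.end(),
              [](const DrawItem& a, const DrawItem& b) { return a.depth < b.depth; });
}

// Transparent queue: back to front, so blending composites in the right order.
inline void sortTransparent(std::vector<DrawItem>& q) {
    std::sort(q.begin(), q.end(),
              [](const DrawItem& a, const DrawItem& b) { return a.depth > b.depth; });
}
```

You'd then draw everything in the opaque queue first, followed by the transparent queue.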

I understand there are some issues with drawing transparent objects, and that's what's led me to post this today. My question is as follows:

- How do you handle rendering multiple particle systems that are working with different textures? It seems like a clean (object oriented) solution to have one particle system responsible for a single type of particle, but then sorting the particles properly becomes a problem when you introduce multiple particle systems. If you had every single particle in one container, then it seems like you have even more problems because different particle systems may have different data requirements.

I'm sure there's a simple and elegant solution here that I can't see due to my lack of experience with graphics programming. Any ideas/suggestions?


Once you start looking for it, you'll see incorrectly sorted particles in a lot of very high quality AAA games, it's surprising how much incorrectness you can get away with.

One solid solution which I favour is to have all your particle textures on a single texture sheet. It's a bit limiting in terms of texture space (less so if your target platforms support texture arrays), but it means you can have a single correct sort for all your particles and render the lot with a single draw call (it is possible to draw additive and alpha blended particles in a single draw call). Particles can be perfectly sorted with one another, but aren't sorted with other transparent objects in the scene.
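As a rough illustration of the texture-sheet idea (assuming, hypothetically, a square sheet divided into a fixed grid of tiles), each particle only needs to store its tile index, and the UVs fall out of that, so every particle type can share one texture bind:

```cpp
#include <cassert>

// UV rectangle inside the sheet.
struct UvRect { float u0, v0, u1, v1; };

// Hypothetical layout: a square sheet of tilesPerRow x tilesPerRow tiles.
// Every particle stores just its tile index, so all particles can share
// one texture bind and be submitted in one draw call.
inline UvRect tileUvs(int tileIndex, int tilesPerRow) {
    const float size = 1.0f / static_cast<float>(tilesPerRow);
    const int   col  = tileIndex % tilesPerRow;
    const int   row  = tileIndex / tilesPerRow;
    return { col * size, row * size, (col + 1) * size, (row + 1) * size };
}
```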

The more common solution is to just make do with a coarse sort. Rather than managing your particles purely by their type, you separate them according to the instance, sort the instances, and render the instances according to their depth. So the particles from explosion #1 are managed separately from the particles from explosion #2. Arranging things this way might make more sense when you want to manipulate the particles, e.g. immediately destroy particles from explosion #1, or attach explosion #2 to a moving object. Plus, by managing by instance this way, it becomes feasible to frustum cull particle effects.
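A minimal sketch of that coarse, per-instance sort (names like `EmitterInstance` and `coarseSort` are made up for illustration): cull whole instances against the frustum, then order the survivors back to front by the distance to each emitter's center.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// One emitter instance (e.g. one explosion). Particles inside an instance are
// drawn in whatever order the emitter keeps them; only whole instances are
// sorted against each other.
struct EmitterInstance {
    float camDistance;  // distance from the camera to the emitter's center
    bool  visible;      // result of frustum-culling the emitter's bounds
    int   id;
};

// Coarse sort: drop culled instances, then order back to front.
inline std::vector<EmitterInstance>
coarseSort(std::vector<EmitterInstance> instances) {
    instances.erase(std::remove_if(instances.begin(), instances.end(),
                                   [](const EmitterInstance& e) { return !e.visible; }),
                    instances.end());
    std::sort(instances.begin(), instances.end(),
              [](const EmitterInstance& a, const EmitterInstance& b) {
                  return a.camDistance > b.camDistance;
              });
    return instances;
}
```

Particles within one instance may still sort incorrectly against another overlapping instance, which is the "incorrectness you can get away with" mentioned above.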

PS: Your summary of the rest of the pipeline seems pretty good. I tend to make a distinction between three types of transparency: decals (i.e. transparent stuff that's rendered just over opaque stuff, like blob shadows and bullet holes), 1-bit (e.g. foliage that uses alpha discard but no translucency), and 8-bit (e.g. particles). Then you render in the order opaque -> decals -> 1-bit -> 8-bit. Only the 8-bit stuff needs to be sorted by depth.

A strategy I've tried is to use polymorphism to create the different types of particles (if you need different types), then load them all into the same vector and sort that instead. So, essentially, just one container for all particles.

For some of the more nuanced particle problems, like soft particles and such, I've found this tutorial helpful, though I haven't implemented many of its strategies myself, so I can't really speak to their usefulness. Scroll down to the section on "handling multiple particle systems."

Usually, though, as Columbo mentioned, I just let it be. Personally, I haven't gotten a game to a level of polish where conflicting particles are really an issue.

Beginner here <- please take any opinions with a grain of salt


One solid solution which I favour is to have all your particle textures on a single texture sheet. It's a bit limiting in terms of texture space (less so if your target platforms support texture arrays), but it means you can have a single correct sort for all your particles and render the lot with a single draw call (it is possible to draw additive and alpha blended particles in a single draw call). Particles can be perfectly sorted with one another, but aren't sorted with other transparent objects in the scene.

I might look into this later if I have more time and I've managed to set up a basic system for everything. But for now, I'll probably be using multiple draw calls and individual textures for simplicity's (and prototyping's) sake.


PS: Your summary of the rest of the pipeline seems pretty good. I tend to make a distinction between three types of transparency: decals (i.e. transparent stuff that's rendered just over opaque stuff, like blob shadows and bullet holes), 1-bit (e.g. foliage that uses alpha discard but no translucency), and 8-bit (e.g. particles). Then you render in the order opaque -> decals -> 1-bit -> 8-bit. Only the 8-bit stuff needs to be sorted by depth.

That's a good point. I'll keep that in mind.


For some of the more nuanced particle problems, like soft particles and such, I've found this tutorial helpful, though I haven't implemented many of its strategies myself, so I can't really speak to their usefulness. Scroll down to the section on "handling multiple particle systems."

Yeah, the information on that website was partly what led me to investigate how I'm going to handle these systems before trying to implement them. I also stumbled upon this, and it seems like another viable option.

Thanks for the input guys. I have a much better idea as to what my options are now.

(it is possible to draw additive and alpha blended particles in a single draw call).

How do you do that? (I'm an OpenGL beginner)

(it is possible to draw additive and alpha blended particles in a single draw call).

How do you do that? (I'm an OpenGL beginner)

I suppose with the blend function: Dest * (1 - SrcAlpha) + Src

Aka premultiplied-alpha.

If SrcAlpha is 0, it simply reduces to additive blending.

If SrcAlpha is between 0 and 1 you can achieve alpha blending but you have to premultiply your textures by the alpha value first.

This also allows you to seamlessly blend from explosion to smoke and other cool effects.
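To make the math above concrete, here's a tiny numeric sketch of the premultiplied-alpha blend equation (the function name is made up; in OpenGL the equivalent state is `glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)`, with "rgb *= a" baked into the texture):

```cpp
#include <cassert>

// Evaluates Dest * (1 - SrcAlpha) + Src for one color channel, i.e. the
// premultiplied-alpha blend set up with
//   glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
// srcPremul is the source channel already multiplied by its alpha.
inline float blendChannel(float dst, float srcPremul, float srcAlpha) {
    return dst * (1.0f - srcAlpha) + srcPremul;
}
```

With `srcAlpha = 0` the destination is untouched and the source is simply added (pure additive blending); with `srcAlpha = 1` the destination is fully replaced, and values in between give ordinary alpha blending.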

(it is possible to draw additive and alpha blended particles in a single draw call).

How do you do that? (I'm an OpenGL beginner)

I suppose with the blend function: Dest * (1 - SrcAlpha) + Src

Aka premultiplied-alpha.

If SrcAlpha is 0, it simply reduces to additive blending.

If SrcAlpha is between 0 and 1 you can achieve alpha blending but you have to premultiply your textures by the alpha value first.

This also allows you to seamlessly blend from explosion to smoke and other cool effects.

Yep, that's it.

Thanks gents, that's interesting. I'm not trying to derail the topic too much, but wouldn't that mean you can't use alpha with additive blending? How then do you fade out your additive particles? The vertex color's alpha component?

Tween the vertex color to RGBA(0,0,0,0); this gets rid of both terms.
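A quick sketch of why that works (assuming, hypothetically, that the shader outputs the premultiplied texel modulated by the vertex color): as the vertex color goes to RGBA(0,0,0,0), the source term goes to zero and the source alpha goes to zero, so Dest * (1 - SrcAlpha) + Src collapses to Dest and the particle vanishes, whether it was additive or alpha blended.

```cpp
#include <cassert>

struct Rgba { float r, g, b, a; };

// Hypothetical shader output: premultiplied texel modulated by vertex color.
inline Rgba shade(Rgba texel, Rgba vertex) {
    return { texel.r * vertex.r, texel.g * vertex.g,
             texel.b * vertex.b, texel.a * vertex.a };
}

// Dest * (1 - SrcAlpha) + Src for one channel (premultiplied-alpha blend).
inline float blendChannel(float dst, float src, float srcAlpha) {
    return dst * (1.0f - srcAlpha) + src;
}
```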
