# OpenGL: Accelerating Interleaved Particle Systems

## Recommended Posts

Hey All,

------------------------------------------------------------
Quick apology - I decided to move this to OpenGL rather than the 'Graphics' forum, since I am using OpenGL and I need info that is specific to the API.
------------------------------------------------------------

I am wondering how to render particles where two (or more) systems are intersecting. I was going to implement my system using vertex buffers with one buffer per material, but I have thought of a use case in my game where I am likely to have a number of particle systems intersecting, and unfortunately some of the effects will not be possible with additive blending alone. A particular example is a vehicle explosion (the game is a C&C-style 3D affair) where there might be a fireball, a central smoke column, and smoke trails for the various parts blown off.

While trying to think of a solution I did consider just leaving it as is, whereby I sort each particle system back to front and render the buffers one by one; the actual order of rendering the individual buffers is arbitrary. The problem is that, for example, flames on the floor several yards behind a column of heavy smoke should not be visible (or should at least be partially obscured) from the camera, but depending on the order the vertex buffers are rendered, the flame material may end up being rasterised on top of the smoke particles.

I realise that if I detect the intersection of two particle systems with a bounding box test, I can then prepare them differently so that the two systems are interleaved, sorted, and rendered together, but I believe this would prevent me from using any sort of fast rendering. The only way I can think to achieve this is glBegin ... glEnd, but there is no way that would give me any sort of acceptable performance.

Anyone know of a way to either get round the situation, or a way to render them at a decent speed?

I have been having a look around for info but have not been able to find much that addresses this problem. I had looked back through the forums, but I apologise if this has been addressed and I missed it.

Thanks,
Alex
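As a minimal sketch of the bounding-box intersection test mentioned above (the struct and function names are illustrative, not from any particular engine), two axis-aligned boxes overlap exactly when they overlap on every axis:

```cpp
#include <cassert>

// Axis-aligned bounding box for a particle system.
// Illustrative helper; the post only says "bounding box test".
struct AABB {
    float min[3];
    float max[3];
};

// Two boxes intersect if their extents overlap on all three axes;
// a gap on any single axis means no intersection.
bool systemsIntersect(const AABB& a, const AABB& b) {
    for (int i = 0; i < 3; ++i) {
        if (a.max[i] < b.min[i] || b.max[i] < a.min[i])
            return false;
    }
    return true;
}
```

Only systems that pass this test would need the slower interleaved path; everything else renders buffer-by-buffer as before.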

cross post

##### Share on other sites
Hence the apology at the beginning...

I am beginning to think that the only way to get good results is to render occluded systems individually and then render the remaining systems grouped by material, building the scene up from back to front. Or just give it up as a bad idea and hope that the resulting scene won't look too bad with a few overlaps.

----------------------------------------------

Just found a way of doing it nicely: http://www.gamedev.net/community/forums/topic.asp?topic_id=216773

Basically, if I use premultiplied-alpha images then I don't have to change blend modes, and if I place all the particle images into a single texture (either at run-time or beforehand) then I can pass all the particles as a single vertex array. Cushty!
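A rough sketch of the atlas half of this idea (the 4x4 grid size and names are assumptions for illustration): with premultiplied alpha, one blend mode - glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) - covers both normal and additive-looking particles (an additive image just stores zero alpha), so the only per-particle variation left is which tile of the atlas its texture coordinates point at:

```cpp
#include <cassert>

// UVs for one tile in a square texture atlas: lower-left corner
// plus tile size, in normalised [0,1] texture coordinates.
struct TileUV {
    float u, v, size;
};

// Map a tile index to its UVs in an atlas with tilesPerSide x
// tilesPerSide tiles, laid out row by row.
TileUV atlasTile(int index, int tilesPerSide) {
    TileUV t;
    t.size = 1.0f / tilesPerSide;
    t.u = (index % tilesPerSide) * t.size;
    t.v = (index / tilesPerSide) * t.size;
    return t;
}
```

Each particle's quad then gets UVs (u, v) to (u + size, v + size), and every particle in the scene can live in the same vertex array with the same texture bound.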

[Edited by - aleks_1661 on October 5, 2004 5:05:03 AM]

##### Share on other sites
Why not have a particle render manager to do the rendering? The particle manager would just represent the sorts of particle systems at work (smoke, flames), and they would pass the positions, sizes, and textures of the billboards to the particle renderer. This would then sort the particles by depth value (since you're inputting the particles one by one you can use insertion sort, or in any case some fast sorting method) and then render them.
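A minimal sketch of the sorting step in such a manager (the Particle fields are illustrative, and std::sort stands in for whatever fast sort you prefer):

```cpp
#include <algorithm>
#include <vector>
#include <cassert>

// A flattened particle as the manager might receive it from each
// system. Fields are illustrative, not from the original post.
struct Particle {
    float viewDepth;  // distance along the camera's view axis
    int   textureId;  // which particle image this billboard uses
};

// Sort back to front so that nearer particles blend over farther
// ones when the whole batch is drawn in order.
void sortForRendering(std::vector<Particle>& particles) {
    std::sort(particles.begin(), particles.end(),
              [](const Particle& a, const Particle& b) {
                  return a.viewDepth > b.viewDepth;  // farthest first
              });
}
```

Because every system feeds into the same list before the sort, intersecting systems come out correctly interleaved by depth.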

##### Share on other sites
All that is implemented, but the problem I want to try and address is what to do when two particle systems are intersecting. Unless you use additive blending (as most systems do for coronas, sparks, etc.) you have to mix the two sets of particles together to avoid one type incorrectly overdrawing the other.

i.e. imagine looking vertically down at a set of billboarded polys:
```
         --   --      --
   ==  ==             ==  ==
 --        ==  ==    --
      --   --   --            -- particle type 1
       ==    ==               == particle type 2

         ^
         |
       Camera

 =-====--==-----    < this shows what type would be visible
                      in each column
```

Without rendering the particles all mixed into one vertex array, you will essentially be doing the following:
```
         --   --      --          <= particle type 1 rendered first
             --      --   --

   ==  ==     ==  ==              then render the second particle type
           ==  ==                 <= particle type 2 will overdraw type 1
          ==    ==

         ^
         |
       Camera

 ======-===-==--    < the visible particles
 =-====--==-----    < this is what should be seen
```

If using vertex arrays you can only have one material per buffer, which means I cannot change material mid-render. And using glBegin/glEnd is slow.

If I could combine a number of particle textures into a single texture, then by varying the texture coords I can effectively render particles with more than one 'material' in a single buffer. Extending that with the premultiplied-alpha technique mentioned above, I can actually give the appearance of more than one material and more than one blend mode per buffer. Though I have read up on premultiplied alpha, and it can give particles a dark edge, because the colours become darker as the alpha value of a pixel decreases.
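The premultiply step itself is a one-off pass over the image at load time (or offline): scale each colour channel by the pixel's alpha. A minimal sketch, assuming RGBA8 pixel data:

```cpp
#include <cstdint>
#include <cassert>

// Premultiply an RGBA8 image in place: scale each colour channel
// by the pixel's alpha, leaving alpha itself untouched. Integer
// division by 255 is the usual 8-bit approximation and is where
// the dark-edge loss of colour precision at low alpha comes from.
void premultiplyAlpha(uint8_t* rgba, int pixelCount) {
    for (int i = 0; i < pixelCount; ++i) {
        uint8_t* p = rgba + i * 4;
        unsigned a = p[3];
        p[0] = static_cast<uint8_t>(p[0] * a / 255);
        p[1] = static_cast<uint8_t>(p[1] * a / 255);
        p[2] = static_cast<uint8_t>(p[2] * a / 255);
    }
}
```

Run this on each particle image before packing it into the atlas, then render the whole batch with a single premultiplied blend mode.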

OK, I think that ASCII diagram is going to confuse people; I'll have to try to make an image to show what I mean.
