There are many different ways to depict destruction, but by far the most common these days are particle systems. Particles are something I'd been putting off for a long time, so this was as good an excuse as any to do some background research and knock something up. As with most of the stuff I do, I was looking for a way to make them as flexible and extensible as I could - ideally so that you can assemble new effects by hooking up a graph of objects.
My current design is something like this:
There's a Particle object which defines a type of particle. You attach a series of Behaviours to the Particle object to govern what the particle does (e.g. you add Decay to make it lose energy and eventually die, the MooveParticle behaviour adds Newtonian motion, Gravity makes it fall, Flock makes particles attract each other, Collide makes them test and collide with the scenery, etc). Then you add one or more Shapes to render the particle (e.g. a Sprite would give you billboard quads, a Mesh would give you solid particles, and I eventually hope to add a Fluid renderer which does some metaball rendering). The whole point of a particle system is that the individual particles themselves are very cheap - so this object acts more as a template or description. The actual particle data is stored in bulk somewhere else.
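To make the composition idea concrete, here's a minimal sketch of what that template-style Particle object could look like. The class names Decay, Gravity and Sprite come from the post; everything else (the base classes, method names, the builder-style add calls) is my own invention for illustration:

```cpp
#include <memory>
#include <string>
#include <vector>

// A Behaviour describes one aspect of what a particle type does.
struct Behaviour {
    virtual ~Behaviour() = default;
    virtual std::string name() const = 0;
};

struct Decay : Behaviour { std::string name() const override { return "Decay"; } };
struct Gravity : Behaviour { std::string name() const override { return "Gravity"; } };

// A Shape describes one way of rendering the particle type.
struct Shape {
    virtual ~Shape() = default;
};
struct Sprite : Shape {};  // billboard quads

// The Particle object is a template/description, not an instance:
// it owns a list of behaviours and shapes but no per-particle data.
struct Particle {
    std::vector<std::unique_ptr<Behaviour>> behaviours;
    std::vector<std::unique_ptr<Shape>> shapes;

    Particle& addBehaviour(std::unique_ptr<Behaviour> b) {
        behaviours.push_back(std::move(b));
        return *this;
    }
    Particle& addShape(std::unique_ptr<Shape> s) {
        shapes.push_back(std::move(s));
        return *this;
    }
};
```

An explosion type would then be assembled by chaining addBehaviour and addShape calls, and the same Behaviour classes get reused across effects.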
Once you have a Particle set up, you can attach it to a ParticleEmitter object, which will spawn and initialise particles according to its shape and settings. The emitter injects the newly created particles into a ParticleSystem, which is where all the particle instance data lives in a big honking stream of data which gets batch updated and batch rendered (it's the batch-ness which allows you to have bucket loads of them floating around).
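The emitter/system split might look something like the sketch below - this is my guess at the shape of it, not the actual implementation, and I've reduced the emitter's "shape" to a single point. The instance data lives in flat arrays on the system so the update is one tight batch loop:

```cpp
#include <cstddef>
#include <vector>

// All particle instance data lives here in bulk, batch updated in one pass.
struct ParticleSystem {
    std::vector<float> posX, posY, posZ;
    std::vector<float> energy;

    void spawn(float x, float y, float z, float e) {
        posX.push_back(x); posY.push_back(y); posZ.push_back(z);
        energy.push_back(e);
    }

    std::size_t count() const { return energy.size(); }

    // Batch update: a single tight loop over the whole stream
    // (here just a simple energy decay as a stand-in).
    void update(float dt) {
        for (std::size_t i = 0; i < energy.size(); ++i)
            energy[i] -= dt;
    }
};

// The emitter spawns and initialises particles from its own settings.
struct ParticleEmitter {
    float x = 0, y = 0, z = 0;  // emitter position (point "shape")

    void emit(ParticleSystem& sys, int n) {
        for (int i = 0; i < n; ++i)
            sys.spawn(x, y, z, 1.0f);  // full energy at birth
    }
};
```

Because the emitter only ever appends to the system's arrays, several emitters can feed the same system and everything still updates and renders as one batch.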
One of the things I've seen in many particle systems that I don't particularly like is that they hardcode the particle data structure. This makes it hard to add new data that a given particle type might need, and means that even simple particles have to carry around all the data for the most complex ones. To address this, I've got "channels" of particle data (like position, orientation, energy, size, colour, etc), and the behaviours on the particle object will generate the list of particle data they require. The particle system then assembles a stream of exactly that data.
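A sketch of how that channel assembly could work: each behaviour declares the channels it needs, and the system takes the union so the stream carries exactly that data and nothing more. The behaviour descriptions and channel names here are illustrative:

```cpp
#include <set>
#include <string>
#include <vector>

// Each behaviour on the particle template declares the per-particle
// data channels it requires.
struct BehaviourDesc {
    std::string name;
    std::vector<std::string> channels;
};

// The system assembles the union of all requested channels, so a simple
// particle type never pays for data only complex ones need.
std::set<std::string> assembleChannels(const std::vector<BehaviourDesc>& behaviours) {
    std::set<std::string> channels;
    for (const auto& b : behaviours)
        channels.insert(b.channels.begin(), b.channels.end());
    return channels;
}
```

For example, a type with only Decay would get just an energy channel, while adding a motion behaviour would pull in position and velocity as well.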
Here's the first prototype I got working. It's a simple explosion particle system using Newtonian motion equations, with some damping to simulate air resistance and some gravity. The particles are rendered with a simple billboard sprite whose colour is picked based on the energy left in the particle. The colour ramp cycles from yellow to red to clear black (the red gives the nice plasma type halo around the explosion core). I played around a bit with blending the particle rendering between modulative and additive blending, but in the end decided to stick with the additive rendering for a nice bit of colour burn.
There are some pretty nasty blending artefacts where the sprites intersect the scene geometry - and if anything, the MPEG compression actually hides them a bit. I know of at least one trick for fixing this, but it's going to need a much better rendering pipeline than the simple one I have right now, so for now, this will have to do. The explosions come and go pretty quickly, so it's not like the player can sit and stare at the rendering artefacts for too long.
I want to add some secondary particle emitters for flares and smoke, but first I'll need to tidy up a few of the temporary hacks I've been using for testing and flesh out the emitter object a bit.