I'm wondering if anyone has any information on how some of the latest games handle their particle systems. Many of today's games have huge numbers of particle effects, which seem to spawn hundreds of thousands of particles every second. With today's GPUs the answer seems obvious: run them on the GPU, either with some sort of parametric function or using stream-out to flip between buffers. Both of these solutions require no data to be passed from CPU to GPU each frame and run very fast. The other, slower method is to update a buffer on the CPU side and send it to the GPU before drawing.
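To make the "parametric function" option concrete, here's a minimal sketch of the idea (all names are mine, not from any real engine): each particle's state is a pure function of its spawn parameters and the current time, so nothing needs to be re-uploaded per frame — a vertex shader could evaluate this on its own.

```cpp
// Hypothetical stateless particle: everything needed to place it is baked in
// at spawn time, so its position is a pure function of elapsed time.
struct ParticleParams {
    float px, py, pz;   // spawn position
    float vx, vy, vz;   // initial velocity
    float birthTime;    // when it was emitted (seconds)
};

struct Vec3 { float x, y, z; };

// p(t) = p0 + v0*t + 0.5*g*t^2 -- simple ballistic motion under gravity.
Vec3 evalParticle(const ParticleParams& p, float now, float gravity = -9.8f) {
    float t = now - p.birthTime;
    return { p.px + p.vx * t,
             p.py + p.vy * t + 0.5f * gravity * t * t,
             p.pz + p.vz * t };
}
```

In shader form the same expression runs per vertex with `now` as a uniform, which is why no CPU-side update is needed at all.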
But I'm wondering what the real engines do. They all seem to have particle system editors that let artists build up their particle effects through some kind of node-based editor. That sounds like it could only work with the final method, since the CPU would have to process the chain of events the artist created every frame. Either that, or the developers somehow manage to map those events to shader code.
Bit of both. Most decent particle effects combine several emitters, sometimes spawning new ones periodically, and so on.
The CPU is used to schedule and generally control the emitters, which are each individually fairly simple. Then you switch to the GPU to actually move each emitter's particles forward a frame.
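A rough sketch of that split (my own naming, assuming a rate-based emitter): the CPU only decides when and how many particles each emitter spawns this frame; actually integrating the particles forward would happen on the GPU, e.g. via stream-out or a compute pass.

```cpp
#include <vector>

// Hypothetical CPU-side emitter: the CPU tracks spawn rates and carries
// fractional particles between frames; the heavy per-particle work stays
// on the GPU.
struct Emitter {
    float rate;        // particles per second
    float accumulator; // fractional particles carried between frames
};

// Returns spawn counts per emitter for this frame. These small requests are
// all the CPU has to upload -- the bulk of the data never leaves the GPU.
std::vector<int> scheduleSpawns(std::vector<Emitter>& emitters, float dt) {
    std::vector<int> counts;
    counts.reserve(emitters.size());
    for (auto& e : emitters) {
        e.accumulator += e.rate * dt;
        int n = static_cast<int>(e.accumulator);
        e.accumulator -= n;
        counts.push_back(n);
    }
    return counts;
}
```

The accumulator matters: without it, an emitter whose per-frame spawn count rounds below 1 would never emit anything.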
But you'd also be surprised how much you can get away with on a CPU-only system that uses billboards. Our current system, for example, I'd class as advanced old school: the emitters have lots of options, but it's all controlled by the CPU, even moving the particles (to maintain compatibility with weaker hardware). My own effects have always been a bit rammel, but once I'd built a decent editor and handed it over to an artist, some of the effects he got out of even this were jaw-dropping.
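For anyone unfamiliar with the billboard part: the CPU expands each particle into a camera-facing quad. A minimal sketch (my own helper names, assuming the camera's right and up vectors are already available):

```cpp
// Hypothetical CPU-side billboarding: expand a particle into a quad that
// always faces the camera, using the camera's right and up axes.
struct Vec3 { float x, y, z; };

Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Fills the four corners of a quad centred on `center`, `size` units wide,
// oriented to face the camera whose right/up axes are given.
void buildBillboard(Vec3 center, Vec3 camRight, Vec3 camUp,
                    float size, Vec3 out[4]) {
    float h = size * 0.5f;
    out[0] = add(center, add(scale(camRight, -h), scale(camUp, -h)));
    out[1] = add(center, add(scale(camRight,  h), scale(camUp, -h)));
    out[2] = add(center, add(scale(camRight,  h), scale(camUp,  h)));
    out[3] = add(center, add(scale(camRight, -h), scale(camUp,  h)));
}
```

Four verts per particle times a few thousand particles is a perfectly manageable CPU-side buffer to rebuild each frame.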
I'm talking about particle effects created with editors offering all sorts of options, though: many, many configurations. I'm wondering if they just perform the update on the CPU, where all the options can be processed, and send the result down to a shader that draws it, or if they have some sort of configurable shader that does the whole lot.
Take the Unreal editor, for example: in their effect editor you can stack up different processes for the emitter to perform.
I have no idea how the Unreal one works, but it could use either method I described. Each part of a complex effect can be a different emitter with a different shader, or it could be done on the CPU. Either way, the complex effect is just a combination of simpler individual ones layered on top of each other with different start times and parameters.
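That "combination of simpler emitters" idea can be sketched with a few lines (purely illustrative structures, not from any actual editor's file format): an effect is just a list of emitter descriptions with their own start times and lifetimes, which is roughly what a node-based editor would serialize out.

```cpp
#include <vector>

// Hypothetical serialized effect: simple emitters layered with their own
// start times and durations relative to the start of the effect.
struct EmitterDesc {
    float startTime;   // seconds after the effect begins
    float duration;    // how long this emitter runs
};

struct Effect {
    std::vector<EmitterDesc> emitters;
};

// Which emitters should be running `t` seconds into the effect?
std::vector<int> activeEmitters(const Effect& fx, float t) {
    std::vector<int> active;
    for (int i = 0; i < static_cast<int>(fx.emitters.size()); ++i) {
        const EmitterDesc& e = fx.emitters[i];
        if (t >= e.startTime && t < e.startTime + e.duration)
            active.push_back(i);
    }
    return active;
}
```

Each active emitter then runs its own simple simulation (CPU or GPU), and the layering is what makes the overall effect look complex.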