Currently I'd like to introduce particles into my deferred renderer. Here's what I do:
- accumulate particles into an off-screen, low-resolution buffer,
- after drawing all objects into the deferred buffer, "inject" the accumulated particles into the scene (outputting depth from the particles into the main buffer), using stipple patterns during this step,
- resolve lights on the deferred buffer,
- resolve the final transparency for the particles.
Yay, now I have fully lit, shadow-receiving (but not shadow-casting) particles with little effort. But:
Because I'm using point sprites, the normals always point in one direction, so when light comes from behind the particles, they aren't lit at all. When the light is in front of the particles, everything is fine.
I would like to blur the depth values in the accumulated particle depth buffer and compute surface normals from them to adjust each particle's normal somehow,
so the particles become volumes instead of flat sprites.
But for this I need to blur the particle depth in linear space, which I can do with no problem.
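For the blur-and-reconstruct-normals idea, here is a minimal CPU-side sketch in Python/NumPy (as a stand-in for the shader version): a separable box blur over a linear-depth image, then normals via central differences. The `texel_size` parameter (the assumed view-space width of one texel) and the box blur itself are my assumptions, not your pipeline's actual filter.

```python
import numpy as np

def box_blur(depth, radius=2):
    """Separable box blur of a 2D linear-depth image (edge-clamped)."""
    k = 2 * radius + 1
    pad = np.pad(depth, radius, mode="edge")
    # horizontal pass: average k shifted column slices
    h = sum(pad[:, i:i + depth.shape[1]] for i in range(k)) / k
    # vertical pass: average k shifted row slices
    return sum(h[i:i + depth.shape[0], :] for i in range(k)) / k

def normals_from_depth(depth, texel_size=1.0):
    """Estimate normals from linear depth via central differences.
    texel_size (assumed) converts texel steps to view-space units."""
    dzdx = np.gradient(depth, axis=1) / texel_size
    dzdy = np.gradient(depth, axis=0) / texel_size
    # surface normal of the depth heightfield, normalized per texel
    n = np.dstack([-dzdx, -dzdy, np.ones_like(depth)])
    return n / np.linalg.norm(n, axis=2, keepdims=True)
```

In a shader you would do the same thing with two blur passes and a few depth taps per fragment; the point is only that both steps operate on linear depth.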
But when I'm "injecting" the particles, I need projective (post-projection) depth, so the particles are correctly occluded by scene objects and the lighting also works correctly.
Is there any way to convert linear depth back into projective depth using the camera parameters (near and far planes)?
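For a standard perspective projection this is a simple rational function of view-space depth. A sketch, assuming an OpenGL-style projection matrix with depth remapped to [0,1] via 0.5*z_ndc + 0.5 (the D3D convention differs, so verify against your own projection matrix):

```python
def linear_to_device_depth(d, near, far):
    """Positive view-space depth d in [near, far] -> non-linear [0,1]
    depth-buffer value of an OpenGL-style perspective projection."""
    return far * (d - near) / (d * (far - near))

def device_to_linear_depth(z, near, far):
    """Inverse mapping: [0,1] depth-buffer value -> view-space depth."""
    return far * near / (far - z * (far - near))
```

In the injection pass you would apply the first function to the blurred linear depth and write the result to `gl_FragDepth` (or the D3D equivalent), so the blurred particle surface still depth-tests correctly against the scene.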