racarate

Particles in idTech 666


This is a beginner question, but in this slideshow when they talk about computing particle lighting...

 

http://advances.realtimerendering.com/s2016/Siggraph2016_idTech6.pdf

 

... where do the particle normals come from?  Are the particle normals just the plane normals of the grid?  Or is there something else going on, like a normal map from Houdini or maybe calculating a normal based on the particle being a sphere or the system being a cylinder?
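On the sphere idea: a common trick for billboard particles (I can't confirm it's what idTech 6 does) is to fake a sphere normal from the quad's UVs, so the flat quad lights as if it were a sphere cap. A minimal sketch in Python; the function name and UV convention are my own:

```python
import math

def round_particle_normal(u, v):
    """Fake a sphere normal for a camera-facing quad from its UVs.

    (u, v) lie in [0, 1] across the quad; the center gets a normal
    pointing straight at the camera, and the edges bend outward as if
    the quad were the front cap of a sphere.
    """
    nx = 2.0 * u - 1.0
    ny = 2.0 * v - 1.0
    r2 = nx * nx + ny * ny
    if r2 >= 1.0:
        # Outside the inscribed circle: clamp to the sphere's silhouette.
        inv = 1.0 / math.sqrt(r2)
        return (nx * inv, ny * inv, 0.0)
    nz = math.sqrt(1.0 - r2)  # reconstruct z so the normal is unit length
    return (nx, ny, nz)
```

In a real engine this would live in the particle shader, but the math is the same.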

 

I'm still trying to wrap my head around the basics of particle lighting!

 

 

Thanks,

Nick

Edited by racarate


I don't know about DOOM specifically, but many engines (including ours) will expose different options for what kind of normals you want. That way you can choose between flat normals, "round" normals, normal maps, etc., depending on what kind of particle system you're authoring. I'm not familiar with the processes that our FX artists use for baking out normal maps, but I can ask them if you're curious.
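For the normal-map option specifically, the texture just stores the normal with the standard [0, 1] to [-1, 1] remap, so decoding a texel back into a usable vector is engine-agnostic. A small sketch (nothing here is specific to any particular engine):

```python
import math

def decode_normal(r, g, b):
    """Decode an RGB normal-map texel (each channel in [0, 1]) into a
    unit tangent-space normal, using the standard 2*c - 1 remap."""
    nx = 2.0 * r - 1.0
    ny = 2.0 * g - 1.0
    nz = 2.0 * b - 1.0
    # Renormalize: filtering/compression can shorten the stored vector.
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)
```

The flat-blue color you see in most normal maps, (0.5, 0.5, 1.0), decodes to (0, 0, 1): a normal pointing straight out of the surface.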


Yes, absolutely, that would be great.  I'm really baffled by the basics of particle lighting; I feel like I should probably start figuring out some of these offline tools.

 

Nick


Sounds like the normals come from a normal map.  But instead of computing lighting per pixel at full resolution, they render the lighting at a much lower resolution.  So, for a particle that covers a 128x128 pixel region on screen, the engine will only render lighting for it at 32x32, 16x16, or 8x8, depending on how far the particle is from the camera.  Then they composite (blend) that "light map" onto the full-res particle.
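The distance-to-resolution step is just a band lookup. A sketch of how it might work: the 32/16/8 tile sizes are from the slides, but the distance thresholds here are invented for illustration:

```python
def lighting_resolution(distance, bands=((10.0, 32), (30.0, 16)), far_res=8):
    """Pick the per-particle lighting tile size from camera distance.

    `bands` is a sequence of (max_distance, resolution) pairs tried in
    order; particles beyond the last band fall back to `far_res`.
    The thresholds here are made up, not taken from idTech 6.
    """
    for max_dist, res in bands:
        if distance <= max_dist:
            return res
    return far_res
```

A real engine would more likely derive this from projected screen size rather than raw distance, but the idea is the same: cheaper lighting for particles that cover fewer pixels.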

 

BTW, this is a great breakdown of how an idTech6 frame is rendered:

http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/

Edited by DANNER


Sorry, I forgot to follow up on this. I talked to our lead FX artist, and at our studio they often just generate normal maps directly from their color maps using a tool like CrazyBump. There are a few cases where they will run a simulation in Maya, in which case they may have Maya render out both color and normal maps to use for the particle system.


I've been thinking about lighting our particles in this manner as well. 
Anyone have ideas on a fast way to:

- Allocate a location for the quad in the atlas. I can easily see how this would be done on the CPU, but let's say you want to do it while rendering out the atlas itself. Ideas?

- If you want to have a normal map for the particle, you'd probably still have to render out something like the HL2 basis. They don't seem to do this ... it seems like the lighting resolution would be too low. Thoughts?
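For the CPU-side baseline in the first bullet, a simple shelf packer is enough when the tiles are square power-of-two sizes; it's the GPU-during-rendering variant that's the open question. A hedged sketch (function name and atlas size are made up):

```python
def allocate_tiles(requests, atlas_size=256):
    """Greedy shelf allocator for square lighting tiles in an atlas.

    `requests` is a list of tile sizes in pixels (assumed power-of-two).
    Returns a list of (x, y) origins in request order; raises when the
    atlas runs out of room. Purely a CPU-side illustration.
    """
    x = y = shelf_height = 0
    origins = []
    for size in requests:
        if x + size > atlas_size:           # row full: start a new shelf
            x, y = 0, y + shelf_height
            shelf_height = 0
        if y + size > atlas_size:
            raise RuntimeError("atlas full")
        origins.append((x, y))
        x += size
        shelf_height = max(shelf_height, size)
    return origins
```

Sorting the requests largest-first before packing reduces waste. Doing this on the GPU while rendering the atlas would presumably need something like a prefix sum or atomic counter per shelf, which is exactly the hard part the bullet is asking about.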

Thanks!
