Particles in idTech 666


This is a beginner question, but in this slideshow, when they talk about computing particle lighting...

http://advances.realtimerendering.com/s2016/Siggraph2016_idTech6.pdf

... where do the particle normals come from? Are the particle normals just the plane normals of the grid? Or is there something else going on, like a normal map from Houdini or maybe calculating a normal based on the particle being a sphere or the system being a cylinder?

I'm still trying to wrap my head around the basics of particle lighting!

Thanks,

Nick


The particles have a dedicated texture for normals.

I don't know about DOOM specifically, but many engines (including ours) expose different options for what kind of normals you want. That way you can choose between flat normals, "round" normals, normal maps, etc., depending on what kind of particle system you're authoring. I'm not familiar with the processes that our FX artists use for baking out normal maps, but I can ask them if you're curious.
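
To make the "round" option concrete: the idea is to fake a sphere's shading on a flat, camera-facing quad by deriving a normal from the quad's local UV. A minimal sketch in plain C++ (shader code would look almost identical; the function name is mine, not from any particular engine):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // u, v in [0,1] across the particle quad. Returns a unit view-space normal
    // as if the quad were the camera-facing half of a sphere.
    Vec3 roundParticleNormal(float u, float v)
    {
        float x = 2.0f * u - 1.0f;          // remap UV to [-1, 1]
        float y = 2.0f * v - 1.0f;
        float r2 = x * x + y * y;
        if (r2 > 1.0f) {                    // outside the inscribed disc: clamp to the rim
            float inv = 1.0f / std::sqrt(r2);
            x *= inv; y *= inv; r2 = 1.0f;
        }
        float z = std::sqrt(1.0f - r2);     // hemisphere height toward the camera
        return Vec3{ x, y, z };
    }

Flat normals would just return (0, 0, 1) in the same space, i.e. the quad's facing direction.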

Yes, absolutely, that would be great. I'm still baffled by the basics of particle lighting; I feel like I should probably start figuring out some of these offline tools.

Nick

There is also the bitsquid presentation on particle lighting which might interest you: http://roxlu.com/downloads/scholar/008.rendering.practical_particle_lighting.pdf

Sounds like the normals come from a normal map. But instead of computing lighting for every pixel at full resolution, they render the lighting at a much smaller resolution. So for a particle that covers a 128x128 pixel region on screen, the engine will only render lighting for it at 32x32, 16x16, or 8x8, depending on how far the particle is from the camera. Then they composite (blend) that "light map" onto the full-res particle.
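
Roughly, the resolution selection could look like the sketch below. To be clear, the band width, scale, and clamps here are my guesses for illustration, not numbers from the slides:

    #include <algorithm>
    #include <cmath>

    // coveragePx: on-screen footprint of the particle quad (e.g. 128 for 128x128 px).
    // bandWidth:  world units per resolution halving -- an assumed tuning knob.
    int lightingTileRes(float coveragePx, float distance, float bandWidth = 20.0f)
    {
        // Start at a quarter of the screen footprint (128 px -> 32x32 tile up close)...
        float res = coveragePx * 0.25f;
        // ...and halve once per distance band, so a distant particle ends up at 8x8.
        res /= std::exp2(std::floor(distance / bandWidth));
        // Snap down to a power of two so tiles pack cleanly into the atlas, then clamp.
        int snapped = 1 << static_cast<int>(std::floor(std::log2(std::max(res, 1.0f))));
        return std::clamp(snapped, 8, 64);
    }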

BTW, this is a great breakdown of how an idTech6 frame is rendered:

http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/

Sorry, I forgot to follow up on this. I talked to our lead FX artist, and at our studio they often just generate normal maps directly from their color maps using a tool like CrazyBump. There are a few cases where they will run a simulation in Maya, in which case they may have Maya render out both color and normal maps to use for the particle system.
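
For the curious: as far as I understand, the core of what those height-from-color tools do is treat the image's luminance as a height field and turn its slopes into normals with finite differences. A rough sketch of that general technique (not CrazyBump's actual algorithm; names and the strength knob are illustrative):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct RGB { unsigned char r, g, b; };

    // h: per-texel height in [0,1] (e.g. the luminance of the color map), row-major.
    // Returns a tangent-space normal map encoded to [0,255] RGB.
    std::vector<RGB> normalsFromHeight(const std::vector<float>& h,
                                       int width, int height, float strength = 2.0f)
    {
        auto at = [&](int x, int y) {       // clamp sampling at the borders
            x = std::min(std::max(x, 0), width - 1);
            y = std::min(std::max(y, 0), height - 1);
            return h[y * width + x];
        };
        std::vector<RGB> out(width * height);
        for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            // Central differences give the slope of the height field.
            float dx = (at(x + 1, y) - at(x - 1, y)) * strength;
            float dy = (at(x, y + 1) - at(x, y - 1)) * strength;
            float inv = 1.0f / std::sqrt(dx * dx + dy * dy + 1.0f);
            // Normal = normalize(-dx, -dy, 1), remapped from [-1, 1] to [0, 255].
            out[y * width + x] = {
                static_cast<unsigned char>((-dx * inv * 0.5f + 0.5f) * 255.0f),
                static_cast<unsigned char>((-dy * inv * 0.5f + 0.5f) * 255.0f),
                static_cast<unsigned char>(( inv * 0.5f + 0.5f) * 255.0f) };
        }
        return out;
    }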

I've been thinking about lighting our particles in this manner as well.
Anyone have ideas on a fast way to:

- Allocate a location for the quad in the atlas. I can easily see how this would be done on the CPU. But let's say you want to do it while rendering out the atlas itself. Ideas?

- If you want a normal map for the particle, you'd probably still have to render out something like the HL2 basis. They don't seem to do this ... it seems like the lighting resolution would be too low. Thoughts? (A quick sketch of the HL2 basis reconstruction is below.)
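
For anyone who hasn't run into it, the HL2 basis is Valve's trick of storing lighting along three fixed tangent-space directions and blending them with the normal map at runtime. A minimal C++ sketch of the reconstruction, following Valve's published squared-and-normalized weighting (the basis constants are the standard ones; the function and variable names are mine):

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // The three HL2 basis directions in tangent space: orthonormal, each tilted
    // the same amount toward +Z (the surface normal).
    static const Vec3 kHL2Basis[3] = {
        { -0.40824829f, -0.70710678f, 0.57735027f },   // (-1/sqrt(6), -1/sqrt(2), 1/sqrt(3))
        { -0.40824829f,  0.70710678f, 0.57735027f },   // (-1/sqrt(6),  1/sqrt(2), 1/sqrt(3))
        {  0.81649658f,  0.0f,        0.57735027f } }; // ( sqrt(2/3),  0,         1/sqrt(3))

    // light[i]: lighting pre-integrated along basis direction i.
    // n: tangent-space normal sampled from the particle's normal map.
    float evaluateHL2(const float light[3], Vec3 n)
    {
        float w[3], sum = 1e-6f;                       // epsilon avoids divide-by-zero
        for (int i = 0; i < 3; ++i) {
            float d = std::max(0.0f, dot(n, kHL2Basis[i]));
            w[i] = d * d;                              // squared weights, as in Valve's shader
            sum += w[i];
        }
        float result = 0.0f;
        for (int i = 0; i < 3; ++i)
            result += (w[i] / sum) * light[i];         // normalized blend of the three samples
        return result;
    }

Note that this needs three lighting values per atlas texel instead of one, which is the extra storage cost behind the resolution worry above.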

Thanks!

Check out my project @ www.exitearth.com

This topic is closed to new replies.
