
3D Depth Transition Map for particle rendering

Recommended Posts

Posted (edited)

I'm looking at the following paper, which describes low-res particle rendering in Destiny.

http://advances.realtimerendering.com/s2013/Tatarchuk-Destiny-SIGGRAPH2013.pdf

The paper mentions rendering a separate buffer, alongside particle color, that contains the start and end depths at which a pixel goes from transparent to fully opaque. Getting the start depth is just a matter of reading the resulting depth buffer, since I'm rendering particles back to front. What I'm not sure about is how to get the depth value for full opacity. Is there a generally accepted way of rendering this value? I'm wondering if I would need some kind of UAV operation when rendering the particles to get that final depth.
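
To make what I'm after concrete, here's a rough CPU-side sketch of the per-pixel logic I have in mind (this is just my own illustration, not anything from the paper; the 0.98 saturation threshold and the sample values are placeholders):

#include <stdio.h>

/* One particle's contribution at a given pixel. */
typedef struct {
    float depth;  /* view-space depth of the particle at this pixel */
    float alpha;  /* particle opacity at this pixel */
} ParticleSample;

int main(void)
{
    /* Samples already sorted back to front (largest depth first). */
    ParticleSample samples[] = {
        { 40.0f, 0.6f },
        { 32.0f, 0.8f },
        { 25.0f, 0.9f },
        { 18.0f, 0.5f },
    };
    const int count = (int)(sizeof(samples) / sizeof(samples[0]));
    const float saturationThreshold = 0.98f;  /* "fully opaque" cut-off (placeholder) */

    float accumAlpha  = 0.0f;
    float startDepth  = -1.0f;  /* nearest contributing particle, what the depth buffer gives me */
    float opaqueDepth = -1.0f;  /* depth where accumulated opacity saturates; the value I'm after */

    for (int i = 0; i < count; ++i) {
        /* Standard "over" compositing of opacity, back to front. */
        accumAlpha = accumAlpha + (1.0f - accumAlpha) * samples[i].alpha;

        /* Back to front, the last particle drawn is the nearest one,
           so this ends up matching what the depth buffer already holds. */
        startDepth = samples[i].depth;

        /* Record the depth at which the pixel first becomes "fully" opaque. */
        if (opaqueDepth < 0.0f && accumAlpha >= saturationThreshold)
            opaqueDepth = samples[i].depth;
    }

    printf("start depth %.2f, full-opacity depth %.2f (accumulated alpha %.3f)\n",
           startDepth, opaqueDepth, accumAlpha);
    return 0;
}

On the GPU I'm guessing this maps to something like a per-pixel UAV holding the accumulated alpha, with the depth written the first time the threshold is crossed, but that's exactly the part I'm unsure about.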

 

[attached image: pic1.png]

[attached image: pic2.png]

[attached image: pic3.png]

Edited by Keith P Parsons
