About FreneticPonE

  1. DX11 Dynamic IBL

    Not really, this leads into the "local lighting" infinite-bounce trap. Light won't "travel" through the level correctly unless you iterate over every single cubemap, which you don't really want to do. So you end up with pockets of extreme brightness where light bounces around, right next to pockets of extreme darkness. You also get iteration-time lag: when you start it's very dark, and the longer you hang around the brighter it gets (with diminishing returns) as each iteration bounces more light. It can look very odd, as there's a literal "lag" to the light and it travels very slowly somehow.

    The general idea is doable, however. The only fully shipped version I'm aware of is Call of Duty: Infinite Warfare, with their fast filtering of reflection probes and the accompanying rendering talk. There are several strategies you could choose from, but all of them ditch the idea of taking the previous cubemap lighting results and re-applying them infinitely and recursively. One is lighting each probe at runtime using only local lights and the sun. You'd only get one "bounce", but you could add an ambient term as well. Another is rendering the ambient term into the reflection probes, then using only the reflection probes in the final pass, with no separate ambient. But this can lead to odd color-bleeding results that don't look good.

    A hack could go like this: light your cubemap with an ambient term, take the resulting HDR cubemap, and re-light the original, unlit cubemap with it once. This should approximate multiple light bounces and smooth out any weird color/light-bleeding artifacts that come from doing only one "ambient" bounce. As long as you smoothly blend between cubemaps for both specular and diffuse, I'd suspect there wouldn't be many boundary artifacts where inappropriate, dramatic lighting changes happen.

    That said, check out the rendering talk's separate spherical-harmonic ambient-occlusion-like term. The idea is to take a higher-resolution, precomputed sample of global illumination results, then bake the difference from the sparser cubemap information into a greyscale spherical harmonic, so naturally dark areas don't get lit up inappropriately just because the cubemap isn't correct, and vice versa. It's a hack, but an effective one.

    Edit - The Witcher 3 also does some sort of dynamic cubemap thing, but I'm not entirely sure how it works and I don't think they ever said.
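The "re-light once" hack above can be sketched numerically. This is a toy model, not engine code: the cubemap is reduced to a handful of scalar texels, `ambient_term` stands in for whatever cheap irradiance average you'd actually use, and all names and values are illustrative assumptions.

```python
import numpy as np

def ambient_term(radiance):
    """Cheap ambient approximation: average radiance over the cubemap."""
    return radiance.mean()

def relight_once(albedo, direct):
    # Pass 1: light the cubemap with direct light plus a flat ambient term.
    first_pass = albedo * (direct + ambient_term(albedo * direct))
    # Pass 2: re-light the ORIGINAL unlit cubemap using the HDR result of
    # pass 1 as the ambient source, approximating a second bounce while
    # smoothing the color bleeding a single "ambient" bounce produces.
    return albedo * (direct + ambient_term(first_pass))

albedo = np.array([0.8, 0.5, 0.2, 0.7])   # per-texel surface albedo
direct = np.array([1.0, 0.0, 2.0, 0.5])   # per-texel direct lighting

lit = relight_once(albedo, direct)
```

The key point is that pass 2 always reads from the original unlit data, so the result never feeds back into itself recursively.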
  2. Depth-only pass

    Eh, never mind.
  3. Depth-only pass

    I doubt they do that anymore; Z-prepasses have been phased out since the transition to modern consoles. It was worth it on the previous generation, where polycount could scale far better than memory bandwidth, but that's no longer the case today.
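The tradeoff above comes down to simple arithmetic: a prepass submits the geometry twice to guarantee each visible pixel shades once, while skipping it risks shading hidden pixels. A back-of-envelope cost model (all numbers are illustrative units, not measurements):

```python
def frame_cost(tri_cost, shade_cost, overdraw, prepass):
    """Rough frame cost in arbitrary units."""
    if prepass:
        # Geometry submitted twice; early-Z then rejects hidden pixels,
        # so each visible pixel is shaded exactly once.
        return 2 * tri_cost + shade_cost
    # No prepass: hidden surfaces may be shaded too (overdraw factor).
    return tri_cost + overdraw * shade_cost

# Cheap geometry, expensive shading (last-gen situation): prepass wins.
no_pre_cheap = frame_cost(tri_cost=1.0, shade_cost=10.0, overdraw=3.0, prepass=False)
pre_cheap    = frame_cost(tri_cost=1.0, shade_cost=10.0, overdraw=3.0, prepass=True)

# Heavy geometry (modern polycounts): doubling geometry costs more than
# the overdraw it saves.
no_pre_heavy = frame_cost(tri_cost=20.0, shade_cost=10.0, overdraw=1.5, prepass=False)
pre_heavy    = frame_cost(tri_cost=20.0, shade_cost=10.0, overdraw=1.5, prepass=True)
```

Once triangle cost dominates, the prepass line loses, which is the phase-out argument in miniature.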
  4. This would depend entirely on the studio and project. If your engine uses approximate ACES, you may well want art-production viewports to use the same, to give your artists a more accurate in-game preview. Then again, maybe you don't care, or you're expecting to create custom curves and so on as you go along, and so don't have an accurate way to preview it anyway. The best thing to do really depends on your specific situation.
  5. R&D [PBR] Renormalize Lambert

    Aye, Hodgman has the right of it. My personal favorite diffuse term comes from Respawn and Titanfall 2. They got the diffuse to be not only energy conserving but reciprocal with GGX, as well as height-field correlated and, blah blah blah, reference-tested, physically based, etc. Take a looksie.
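For the "renormalize" part specifically, here's a quick Monte Carlo check of why Lambert needs its 1/π: integrating f·cos(θ) over the hemisphere should return exactly the albedo, never more. (This only demonstrates the normalization idea; the Respawn diffuse adds roughness and GGX-reciprocity terms not modeled here.)

```python
import numpy as np

rng = np.random.default_rng(0)
albedo = 0.75
n = 200_000

# Uniform hemisphere sampling: cos(theta) = z is uniform in [0, 1),
# and the pdf over solid angle is 1 / (2*pi).
cos_theta = rng.random(n)

# Directional albedo estimate: E[ f * cos(theta) / pdf ]
# With f = albedo / pi this converges to exactly `albedo`.
directional_albedo = np.mean((albedo / np.pi) * cos_theta * (2.0 * np.pi))

# The same integral for UN-normalized "Lambert" (f = albedo, no 1/pi)
# converges to pi * albedo: the surface reflects more energy than it
# receives whenever albedo > 1/pi.
unnormalized = np.mean(albedo * cos_theta * (2.0 * np.pi))
```

The second estimator blowing past 1.0 is exactly the non-energy-conserving behavior the renormalization fixes.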
  6. I'm confused about why you'd want to cast rays over an entire sphere. Unless you're sampling translucency, lighting should only be incoming over a hemisphere; sampling over the entire sphere would produce over-darkening artifacts, i.e. shadows being cast onto objects from behind them.
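The over-darkening claim is easy to quantify: with uniform full-sphere sampling, roughly half the rays leave below the surface (n·d < 0), where an opaque surface receives no light, so those samples read as pure occlusion. A small sketch (the normal and sample count are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(1)
normal = np.array([0.0, 0.0, 1.0])  # shading-point surface normal

# Uniform directions on the full sphere via normalized Gaussians.
v = rng.normal(size=(100_000, 3))
d = v / np.linalg.norm(v, axis=1, keepdims=True)

# Fraction of samples pointing into the surface: wasted rays that only
# darken the result instead of gathering incoming light.
below = np.mean(d @ normal < 0.0)
```

Restricting the samples to the upper hemisphere (or cosine-weighting them) removes that darkening term entirely.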
  7. The black edges look like you're hitting the edge of screen space without falling back on any other reflection source, such as whatever the normal game uses there.
  8. IBL Diffuse wrong color

    What's the point of PBR if you're not using an HDR pipeline?
  9. Horizon: Zero Dawn Cloud System

    Yaaaaaaaaaaayyyy! Looking forward to it. *Excited Kermit voice*

    Related: does anyone know how Reset does their variable-distance volume marching for their precipitation effect / large-scale god rays? E.g. near-camera volume marching is fine: you set your z-slices and go. Clouds are fine if they're on a defined plane/sphere. But they also seem to use it to represent rain, a very cool effect. I ask because the most dramatic scattering (god rays) often comes between clouds and distant mountains, and somehow that seems to be accomplished here. A logarithmic march? If everything (including the clouds) were in some sort of shadow-map-like depth buffer, you could build a summed-area table, then do a gradient-domain-like march, e.g. skip empty space until sunlight. Any other ideas?
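On the logarithmic-march idea: the usual trick is to space the z-slices exponentially, so step size grows with distance and you keep near-camera detail while still reaching distant clouds and mountains in a fixed slice budget. A minimal sketch, with made-up range and slice count:

```python
import numpy as np

def log_slices(z_near, z_far, count):
    """Exponentially spaced march depths from z_near to z_far, inclusive."""
    t = np.arange(count) / (count - 1)          # 0 .. 1
    return z_near * (z_far / z_near) ** t       # geometric progression

slices = log_slices(z_near=1.0, z_far=10_000.0, count=64)
steps = np.diff(slices)   # per-step march distance, strictly growing
```

Sixty-four linear slices over 10 km would be ~156 m apart everywhere; here the first steps are sub-meter and only the distant ones stretch out.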
  10. Deferred texturing

    That would probably be because CLEAN mapping doesn't take anisotropic filtering into account, so error will increase as the view becomes parallel to the surface, i.e. towards the horizon for your water. And if you want a non-deferred-texturing route to better tessellation culling (not tessellating non-visible triangles), run an async compute pass over the geometry and discard non-visible triangles before the tessellation stage. Regardless, looking forward to the demo; the crap public wifi I'm on definitely won't download it in any reasonable time, but I'll check it out later. Glad your performance is going up!
  11. Performance is also super sad :( 47 ms on a 1080 (at 1080p) for anything close to a real game outside a "single room" type scenario. And it doesn't support dynamically moving objects. It's the classic "hey, over a defined sphere, complexity grows more two-dimensionally than three-dimensionally!" global illumination trap. As long as you keep things local, GI is actually easy, right? Infinite bounces, no light leaking, easy easy easy. Then you realize the whole "global" part is important, that most games aren't rendering a Cornell box, and the farther away you calculate GI, the more your performance starts to tank geometrically.
  12. Ah, well then here is a more step-by-step guide with example code. DICE's stochastic SSR is more of a "what you can do once you've got the basics up and running" thing. Hope that helps.
  13. Nope, it's all screen-space raytracing. There are a dozen variations on sampling, but it's all the same basic thing. Why, any goal in particular?
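The "same basic thing" is just marching a ray against the depth buffer. Here it is collapsed to 1D for clarity: step along a scanline and report the first texel where the ray passes behind the stored depth. All buffer contents, step sizes, and names are made-up example values, not any engine's actual implementation.

```python
import numpy as np

def ss_march(depth_buffer, x0, z0, dx, dz, max_steps=64):
    """March a ray (x, z) in fixed steps; return hit texel or None."""
    x, z = x0, z0
    for _ in range(max_steps):
        x += dx
        z += dz
        xi = int(round(x))
        if not (0 <= xi < len(depth_buffer)):
            return None             # left the screen: the "black edge" case
        if z >= depth_buffer[xi]:   # ray went behind the depth surface
            return xi
    return None

depth = np.full(32, 10.0)
depth[20:] = 4.0                    # a wall starting at texel 20

hit = ss_march(depth, x0=0, z0=0.0, dx=1.0, dz=0.25)   # slopes into the wall
miss = ss_march(depth, x0=0, z0=0.0, dx=1.0, dz=0.0)   # runs off-screen
```

Every SSR variant dresses this loop up with hierarchical stepping, jitter, or binary-search refinement, but the core march is the same.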
  14. It's certainly some sort of virtual 64-bit precision, two 32-bit depth buffers or some such thing. A single 32-bit buffer wouldn't give any benefit; it's already common to flip it for more precision over a range. And native 64-bit is sloooooow even on most "professional" cards, let alone average consumer ones. Star Citizen does virtual 64-bit; I never learned the details, but I'd bet it's pretty much the same.
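The "flip it for more precision" aside refers to reversed-Z: float32 spacing (ulp) shrinks toward zero, so mapping the far plane to 0.0 puts the densest float values where projective depth is flattest. A two-line demonstration (the sample depths are arbitrary example values):

```python
import numpy as np

# Conventional depth: distant geometry lands near 1.0, where float32
# steps are coarse. Reversed-Z: distant geometry lands near 0.0, where
# steps are hundreds of times finer.
ulp_near_one  = np.spacing(np.float32(0.999))   # far depths, conventional
ulp_near_zero = np.spacing(np.float32(0.001))   # far depths, reversed-Z
```

That asymmetry is why a single flipped 32-bit buffer already helps, and why the extra buffer in a virtual 64-bit scheme is about range, not just raw precision.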
  15. Deferred texturing

    Oh! In that case you want to do something completely different. What you want to do is simplify your ocean rendering as it gets farther away: tessellate less and less, and let normal maps, and eventually a BRDF, take over. Take a look here: There are other papers on the same thing, including LEAN mapping etc., that tackle similar problems.
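The handoff described above can be reduced to a distance-based selector: real displaced geometry up close, flat geometry with waves folded into the normal map at mid range, and waves folded into the BRDF's roughness in the distance. The thresholds here are made-up tuning values, not anything from a shipped renderer.

```python
def ocean_lod(distance):
    """Pick an ocean surface representation by camera distance (meters)."""
    if distance < 100.0:
        return "tessellated"    # displaced wave geometry
    if distance < 1000.0:
        return "normal_mapped"  # flat mesh, waves live in the normal map
    return "brdf_only"          # waves folded into an increased roughness term
```

The point of LEAN-style techniques is making those transitions seamless, by converting filtered-out normal detail into roughness instead of just dropping it.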