

Community Reputation

3307 Excellent

About FreneticPonE

Personal Information

  • Role
    Creative Director

  1. There's even dithering in offline stuff. It ends up being a fairly standard thing to do.
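To make that concrete, here's a minimal sketch of what dithering-before-quantization buys you, in plain Python with a triangular-PDF dither (the 8-level quantizer and the test value are made up for illustration):

```python
import random

def quantize(value, levels):
    """Quantize a [0, 1] value to a fixed number of levels (the source of banding)."""
    return round(value * (levels - 1)) / (levels - 1)

def dithered_quantize(value, levels, rng):
    """Add triangular-PDF noise of +/- one quantization step before rounding,
    so the average over many samples converges to the true value instead of
    snapping to the nearest band."""
    step = 1.0 / (levels - 1)
    noise = (rng.random() - rng.random()) * step  # triangular PDF in (-step, step)
    return min(1.0, max(0.0, quantize(value + noise, levels)))

rng = random.Random(0)
samples = [dithered_quantize(0.5004, 8, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
# plain quantization always lands on the same band; dithering recovers the mean
```

Per pixel the dithered result is noisier, but across an area (or across frames) the banding disappears, which is why even offline renderers do it.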
  2. FreneticPonE

    Have you ever had your game idea stolen?

    Yes, but I don't care. Another guy and I came up with modern survival games as a mod for Oblivion way back when, and it rolled on up from there; great for them. We didn't do much of anything beyond that first half-assed mod, and I can't even remember what we did with the mod itself. It's the work beyond that first idea that also counted here: the execution. Just look at Day Z King of the Hill, the first modern battle royale multiplayer out there. The devs didn't put in the work, didn't put in the thousand other ideas that really build upon the first to make it into something great. So now PUBG and Fortnite are the winners, because they did.

    Don't get me wrong, the initial idea is important. A lot of people, especially people that "do the work" for a living, vastly underestimate how important that founding idea is. You can put all the hard work you want into an idea, but if it's stupid at its base it's going nowhere. You also have to put in the work to get something out. You need both, so even if someone "steals" your idea and succeeds, it's not like they didn't do any work themselves.

    So trust me when I say all your plans are going to look silly at some point. You're going to get partway into building the game and realize this thing isn't working, so you have to rearrange this and change that, and eventually it's a mess and you wonder how anyone ever ships a game at all. That's the work part. Worry about getting over that part before worrying about how great your initial idea is.
  3. FreneticPonE

    DirectX12 adds a Ray Tracing API

    Welp, here's about all you need to know: https://www.remedygames.com/experiments-with-directx-raytracing-in-remedys-northlight-engine/ "Single Ray Per Pixel, 5ms @1080p on very high end graphics card, single sample termination, ambient occlusion with geometry sampling only" Which is of course really noisy, so then you get to add denoising overhead on top of that. Oh, and it's all static too. So performance is definitely not realtime for today, and probably not for tomorrow or next gen either. I really don't understand why DirectX needs its own raytracing API in the first place.
  4. FreneticPonE

    Light Shafts

    This is a very, very old paper and is not recommended anymore. I'd take a look here: https://www.slideshare.net/DICEStudio/physically-based-and-unified-volumetric-rendering-in-frostbite and here: https://www.slideshare.net/BenjaminGlatzel/volumetric-lighting-for-many-lights-in-lords-of-the-fallen The second is one of the bases for modern techniques; the first has plenty of previous citations and links as well as a solid presentation.
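As a companion to those links, here's the core single-scattering raymarch both presentations build on, sketched in plain Python (homogeneous fog and no shadow-map occlusion test, so this is an assumption-laden toy, not either engine's implementation):

```python
import math

def light_shaft(ray_length, sigma_t, light_intensity, steps=64):
    """Single-scattering raymarch through a homogeneous medium.
    At each sample, in-scattered light is attenuated back toward the eye by
    Beer-Lambert transmittance. Assumes the light reaches every sample point
    unoccluded (a real light-shaft pass samples a shadow map here)."""
    dt = ray_length / steps
    scattered = 0.0
    transmittance = 1.0
    for _ in range(steps):
        # light in-scattered at this sample, attenuated back to the camera
        scattered += light_intensity * sigma_t * transmittance * dt
        transmittance *= math.exp(-sigma_t * dt)
    return scattered, transmittance

scattered, transmittance = light_shaft(10.0, 0.1, 1.0)
```

For a homogeneous medium the loop converges to the analytic answer, L * (1 - exp(-sigma_t * d)), which is a handy sanity check when debugging.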
  5. Precision (t-junction) issues might be at fault. Is there a debug view showing the triangles, so you can see if the cracks line up with triangle edges? Basically, the internal mesh rendering of these specific GPUs could be throwing whatever precision it thinks best at your triangles, altering them slightly through quantization error and letting tiny gaps bleed through.
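If it turns out to be nearly-coincident vertices that drifted apart (rather than true t-junctions), a common fix is welding vertices on a quantized grid. A small Python sketch; the cell size is an arbitrary choice, and genuine t-junctions additionally need the long edge split at the junction:

```python
def weld_vertices(vertices, cell=1e-4):
    """Snap vertex positions to a uniform grid and merge duplicates.
    Vertices that were meant to be coincident (but drifted through separate
    transforms) end up bit-identical, so rasterization produces the same
    edge on both triangles and the crack closes.
    Returns (unique_vertices, remap) where remap[i] is the new index of
    original vertex i."""
    unique = []
    index_of = {}
    remap = []
    for v in vertices:
        key = tuple(round(c / cell) for c in v)
        if key not in index_of:
            index_of[key] = len(unique)
            unique.append(tuple(k * cell for k in key))
        remap.append(index_of[key])
    return unique, remap

# two vertices that should coincide but differ by a tiny quantization error
verts = [(0.0, 0.0, 0.0), (1.00000002, 0.0, 0.0), (1.0, 0.0, 0.0)]
unique, remap = weld_vertices(verts)
```

After welding, rebuild the index buffer through `remap` so both triangles reference the identical vertex.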
  6. The Nvidia paper is... unreliable. Cone tracing is potentially fast; the problem is that light leak makes it hard to implement reliably. By cone tracing's nature, the farther you trace the more light leak you get, but the shorter a cone you trace the less light you get. Overall it was an idea that seemed like the future two-plus years ago but has since fallen out of favor due to those weaknesses. There are a lot of other GI techniques to consider depending on your requirements. E.g. is the environment static, highly deformable, or runtime generated? Does light need to move fast, or can it move slowly (e.g. a slow time of day)? That being said, signed distance field tracing plus some version of lightcuts/many lights looks like it could, potentially, do what cone tracing once promised in realtime. Here's a nice presentation on signed distance fields, which are essentially the sparse voxel octree from cone tracing except you "sphere trace" instead of marching a cone; the benefit is no light leak. Lightcuts/VPLs/"many lights" would be the other half of the equation. Here's a nice presentation from Square Enix, wherein the biggest cost in their test scene is their choice of "adaptive imperfect shadow maps", which is a really hacky and slow way to do what SDF tracing can do easier and faster.
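To make the sphere-trace idea concrete, here's a minimal sketch in plain Python (the scene, a single analytic sphere SDF, is made up for illustration):

```python
import math

def sphere_sdf(p, center, radius):
    """Signed distance from point p to a sphere: negative inside, zero on the surface."""
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_dist=100.0, eps=1e-4):
    """March along the ray by the SDF value at each step: that value is the
    radius of a sphere guaranteed empty of geometry, so the march never skips
    through a surface and, unlike a widening cone, never gathers light from
    behind an occluder. Returns the hit distance, or None for a miss."""
    t = 0.0
    while t < max_dist:
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t
        t += d
    return None

# unit sphere centered 5 units down +z; the ray should hit its near side
hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                   lambda p: sphere_sdf(p, (0.0, 0.0, 5.0), 1.0))
```

The same loop works against a baked SDF volume texture; the "no light leak" property comes from the step size never exceeding the distance to the nearest surface.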
  7. You don't need to do virtual texturing with one master texture for the whole world. You'll need to do blending again, but you can use it almost exactly like traditional texturing, without worrying about texel density or disk space at all. The latest Trials game does this (at least, that's what their GDC presentation indicated).
  8. FreneticPonE

    Blending local-global envmaps

    Remedy also had voxelized pointers toward which probes are relevant where. Heck, you could go a step further (or does Remedy do this already?) and store an SH probe with channels pointing toward the relevant probes to blend. It'd be great for windows and the like; blending the relevant outdoor probes would work well there. You could even make the entire system realtime, or near to it: Infinite Warfare used deferred probe rendering for realtime GI, and Shadow Warrior 2 had procedurally generated levels lit at creation time. I seriously hope those are the right links; I'm on slow public wifi at the moment. Regardless, a nice trick is to use SH probes carrying, say, ambient occlusion or static lighting info to correct cubemap lighting. That way you can use cubemaps for both specular and diffuse, and then at least somewhat correct them afterwards.
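A sketch of the per-voxel blending idea in plain Python. The probe ids, weights, and two-coefficient "SH" vectors are all made up for illustration; real probes carry more coefficients per color channel:

```python
def blend_probes(probes, cell):
    """Blend the SH coefficient vectors of the probes a voxel points at.
    `probes` maps probe id -> list of SH coefficients; `cell` is the list of
    (probe_id, weight) pairs stored per voxel, as in the scheme described
    above. Weights are renormalized so the blend stays energy-conserving."""
    total = sum(w for _, w in cell)
    n = len(next(iter(probes.values())))
    out = [0.0] * n
    for pid, w in cell:
        for i, c in enumerate(probes[pid]):
            out[i] += (w / total) * c
    return out

# a window voxel leaning 3:1 toward the outdoor probe
probes = {"outdoor": [1.0, 0.0], "indoor": [0.2, 0.1]}
blended = blend_probes(probes, [("outdoor", 3.0), ("indoor", 1.0)])
```

Because SH coefficients blend linearly, the weighted sum of probes is itself a valid probe, which is what makes storing just ids and weights per voxel workable.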
  9. FreneticPonE

    Spherical Harmonics and Lightmaps

    Oof, I remember that second one. At that point more traditional pathtracing is just as fast or faster, doesn't have any missing data problems, and would probably use less memory as there'd be no multiple copies of the same data.
  10. FreneticPonE

    Spherical Harmonics and Lightmaps

    Cubemaps only offer low frequency spatial data: ultra low frequency, no matter how much angular frequency they offer. Invariably, the farther you get from the sample point, or if the shading point is just behind a pole or something, the less correct the data will be, no matter how high the resolution. Lightmaps are ultra high frequency spatial data; even if their angular data is low frequency, they can still be more correct than a cubemap, no matter how many tricks you pull. And SSAO only works with onscreen data, and only for darkening things. Most modern SH/SG lightmaps are used to somewhat correct or supplement cubemaps.
  11. FreneticPonE

    Spherical Harmonics and Lightmaps

    Cubemaps are only sampled from one spatial point, maybe two or so if you're blending across. An H-basis lightmap, by contrast, samples light at each texel. You contribute whatever specular response you can from your spherical harmonics to compensate for the fact that the cubemap is almost certainly going to be incorrect to some degree. For rough surfaces the entire specular response can come from the lightmap, and thus (except for dynamic stuff) be entirely correct, position-wise. Doing all this helps correlate your diffuse color with your specular response, which become uncorrelated the more incorrect your cubemaps are. BTW, if you're curious, I'd consider "state of the art" to be Remedy's sparse SH grid used in Quantum Break: https://users.aalto.fi/~silvena4/Publications/SIGGRAPH_2015_Remedy_Notes.pdf The idea is to voxelize your level into a sparse voxel grid, then place SH (or SG/whatever) probes at each relevant grid point. The overall spatial resolution is lower than a lightmap's, but it's much easier to change the lighting in realtime, and it uses the exact same lighting terms for static and dynamic objects. It might not seem intuitive, but a uniform lighting response across all objects gives a nice look compared to the disjointed look you get when high detail lightmaps sit right next to dynamic objects with less detailed indirect lighting.
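For reference, evaluating a linear (L1) SH probe, the kind of per-grid-point probe described above, looks roughly like this in plain Python (coefficient layout and constants follow the standard real SH basis; treat it as a sketch, and note a real probe stores one such set per color channel):

```python
def eval_sh_l1(coeffs, normal):
    """Evaluate a linear (L1) spherical harmonics probe in a direction.
    coeffs = [c00, c1m1, c10, c11]: one band-0 (ambient) coefficient plus
    three band-1 (directional) coefficients. Static and dynamic objects
    evaluating the same probes is what gives the uniform look described above."""
    x, y, z = normal
    Y00 = 0.282095          # band-0 basis constant, 1 / (2 * sqrt(pi))
    Y1 = 0.488603           # band-1 basis constant, sqrt(3 / (4 * pi))
    return (coeffs[0] * Y00
            + coeffs[1] * Y1 * y
            + coeffs[2] * Y1 * z
            + coeffs[3] * Y1 * x)
```

An ambient-only probe returns the same value in every direction, while a band-1 term tilts the response toward the dominant light, which is exactly the low-frequency angular data being discussed.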
  12. Not randomly; there are patterns and such. Here's a nice tutorial instead of explaining it here. As for variance shadow maps: they're very fast at low resolutions but scale badly, 4x the resolution = 4x the cost. They also have light leak, and fixing that sends you down an infinite hole of ever longer papers. I'd definitely stick with PCF first and try playing around with offset biases to clean up artifacts.
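A bare-bones PCF sketch in plain Python (integer texel coordinates and a tiny cross-shaped tap pattern, both for illustration; real implementations use hardware comparison samplers and a precomputed Poisson-disk pattern):

```python
def pcf_shadow(shadow_map, x, y, receiver_depth, offsets, bias=0.002):
    """Percentage-closer filtering: compare the receiver's depth against
    several nearby shadow-map depths and average the BINARY results, giving
    a soft 0..1 shadow factor. The offsets are a small fixed pattern, not
    fresh random numbers per tap."""
    lit = 0.0
    for ox, oy in offsets:
        occluder = shadow_map[y + oy][x + ox]
        lit += 1.0 if receiver_depth - bias <= occluder else 0.0
    return lit / len(offsets)

# toy 4x4 shadow map: the left half holds a near occluder at depth 0.3
depth_map = [[0.3, 0.3, 1.0, 1.0] for _ in range(4)]
taps = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # tiny fixed pattern for the sketch
shadow = pcf_shadow(depth_map, 1, 1, 0.5, taps)
```

The `bias` term is the offset-bias knob mentioned above: too small gives shadow acne, too large detaches shadows from their casters ("peter-panning").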
  13. FreneticPonE


    Ergh, Fallout 4 has a threshold and it's glaring. Thresholded bloom with proper HDR values introduces weird, non-physical results. The idea behind bloom is that you're simulating light being scattered as it passes through a lens, which happens in real life since your eye has a lens (well, an entire lens stack, really): the brighter a part of your vision is, the more of its light gets scattered by the lens and the more obvious that scattering becomes. So scattering the whole image is correct for both your eye and a camera. You can still introduce a cutoff, since it's rendering and nothing has to be physically based, but I found it a bit glaring and annoying in Fallout 4, for example, while the non-cutoff method has never bothered me. At least personally.
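A toy 1D illustration of the threshold-free approach in plain Python (the box blur and the 5% scatter fraction are arbitrary choices for the sketch; real bloom uses a wide multi-pass Gaussian over the HDR buffer):

```python
def bloom_composite(pixels, scatter=0.05, radius=2):
    """Threshold-free bloom on a 1D strip of HDR values: blur the WHOLE
    buffer and mix a small scattered fraction back in. Bright pixels dominate
    the blur simply because they carry more energy; with no cutoff there is
    no pop when a value crosses a threshold between frames."""
    n = len(pixels)
    blurred = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        blurred.append(sum(pixels[lo:hi]) / (hi - lo))
    return [(1.0 - scatter) * p + scatter * b for p, b in zip(pixels, blurred)]

# a single very bright HDR pixel in a dark strip
out = bloom_composite([0.0] * 5 + [100.0] + [0.0] * 5)
```

The bright pixel bleeds into its neighbors in proportion to its intensity, while dim regions receive a scattered contribution so small it is invisible, which is why no cutoff is needed in the first place.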
  14. FreneticPonE

    FXAA and normals/depth

    Not exactly. FXAA only blurs final colors; the basic idea is that it hides jaggies at the cost of being a bit blurry. For normals you do want AA, but what you actually want is for normals and roughness to correlate, so that normal map detail blends into your roughness. This helps prevent "sparklies", the bright shiny pixels you get from HDR plus reflective PBR materials. I know that's not the best explanation, but the full explanation is here. Code, demo, etc.
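One standard way to make normals and roughness correlate is Toksvig-style filtering: use the length of the mip-averaged normal as a measure of normal variance, and widen the roughness accordingly. A rough Python sketch; the roughness-to-exponent mapping used here is one common convention, not the only one:

```python
import math

def toksvig_roughness(avg_normal_len, roughness):
    """Fold normal-map variance into roughness (Toksvig-style). A mipmapped
    normal shortens where the averaged normals disagree; that shortening
    measures the detail lost to minification. Widening the specular lobe to
    match kills the bright 'sparkly' pixels, instead of letting a minified
    sharp normal pretend the surface is mirror-smooth."""
    # convert roughness to a Blinn-Phong-style exponent
    power = 2.0 / max(roughness * roughness, 1e-4) - 2.0
    s = max(avg_normal_len, 1e-4)
    ft = s / (s + power * (1.0 - s))      # Toksvig attenuation factor
    adjusted_power = max(ft * power, 1e-4)
    # convert the widened exponent back to roughness
    return math.sqrt(2.0 / (adjusted_power + 2.0))
```

A unit-length averaged normal (no variance) leaves roughness untouched; the shorter the averaged normal gets down the mip chain, the rougher the surface reads, which is the normal-to-roughness blending described above.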
  15. FreneticPonE

    FXAA and normals/depth

    Depth AA? What would this even be?