yoshi_lol

Member
  • Content Count

    30
  • Joined

  • Last visited

Community Reputation

1356 Excellent

About yoshi_lol

  • Rank
    Crossbones+


  1. yoshi_lol

    d3d11 frustum culling on gpu

    You can take a look at http://www.wihlidal.ca/Presentations/GDC_2016_Compute.pdf
  2. yoshi_lol

    BRDFs in shaders

    If I had to recommend one book it'd probably be PBRT (click). If you just care about shader code you might as well google for that. Plenty of code out there - most notably the Unreal Engine 4 source code on GitHub (click).

    My two cents about this: I remember spotting a bug in the PDF (probability density function) for one of their BRDFs in that codebase, so trying to decipher the math by looking at code is probably not the best way to tackle this. People make mistakes. Programmers make a lot of mistakes. The chance of running into something you don't understand simply because there's a mistake in there is much higher than for stuff that was written down on paper.
  3. yoshi_lol

    BRDFs in shaders

    1) This is part of "Radiometry". (click)

    Irradiance is the amount of incoming light (power per unit area) arriving at a surface patch. You calculate it by collecting (integrating) all of the incoming lighting over the hemisphere around the surface normal.

    Radiance, on the other hand, is a directional quantity and is defined per solid angle. Basically, incoming radiance tells you how much light passes through a certain part of the hemisphere and hits the surface. Outgoing radiance is the opposite: how much of the reflected light goes out through a certain part of the hemisphere. People usually mean outgoing radiance when they don't say incoming or outgoing.

    (Edit) About the plane-perpendicular thing: say you have a light that is shining in direction L. Now imagine a plane with the normal -L. That plane is, by definition, perpendicular to the light. You can move that plane back and forth along the L vector and measure how much irradiance arrives from that light. What makes that plane so special? The light hits it straight on, so the cosine in the denominator (the bottom part of the fraction) would be 1 for that plane, i.e. 100% of the light's outgoing radiance ends up as irradiance on the plane.

    2) f = (Outgoing Radiance) / (Irradiance). So f (the BRDF) says how much of the incoming light (irradiance) is reflected in a certain direction (radiance). The light comes in, gets reflected, and goes out; f tells you how much of that exchange happens for the two input directions. It's not a probability, so f can actually be greater than 1 for certain view directions, but the integral of f times the cosine of the outgoing angle over the unit hemisphere should never be greater than 1 - otherwise the BRDF would violate energy conservation (create new energy).

    3) The Radiometry chapter is usually not that long in computer graphics books. I hope that no one here will tell you not to read it.
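    To make the numbers in 2) concrete, here is a tiny self-contained C++ sketch (my own illustration, not from any book or engine) that numerically checks the energy-conservation statement for a Lambertian BRDF, f = albedo / pi. Uniformly sampling the hemisphere and averaging f * cos(theta) / pdf should converge to the albedo, which stays <= 1:

        #include <cmath>
        #include <cstdio>
        #include <random>

        int main()
        {
            const double pi = 3.14159265358979323846;
            const double albedo = 0.8;        // surface reflectivity, must be <= 1
            const double f = albedo / pi;     // Lambertian BRDF is a constant

            // Monte Carlo estimate of the integral of f * cos(theta) over the hemisphere.
            std::mt19937 rng(1234);
            std::uniform_real_distribution<double> u01(0.0, 1.0);
            const int N = 1000000;
            const double pdf = 1.0 / (2.0 * pi);   // uniform hemisphere sampling
            double sum = 0.0;
            for (int i = 0; i < N; ++i)
            {
                // For uniform hemisphere samples, cos(theta) is itself uniform in [0,1].
                double cosTheta = u01(rng);
                sum += f * cosTheta / pdf;
            }
            std::printf("integral of f*cos over hemisphere ~ %f (albedo = %f)\n", sum / N, albedo);
            return 0;
        }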
  4. yoshi_lol

    If you had a magic button, what would it do?

    I would want it to close this thread.
  5. yoshi_lol

    Quadro K1000M

    1. In the NVIDIA Control Panel, which can be found 2. here.   /closed
  6. Great, looking forward to digging into that!   (Off-topic) About the Maya integration - what are your thoughts on that in hindsight? I only dabbled in MEL/Python to do small tasks. I'm assuming most of the heavy lifting was written with the Maya C++ SDK?   Also: you two should feel free to join the gd.net chat from time to time if you find yourself yearning for an extremely long and one-sided conversation about this. Milk and cookies are ready...at dawn.
  7. yoshi_lol

    Volumetric Lighting with SDSMs

    Yeah, I'll look into it after trying the second part of 1). Appreciate your opinion on this.
  8. Thanks for clearing that up, David! I didn't even know you were on GD.net - I was giving too much credit to MJP here. :)   I might have some follow-up questions about the ASG later - I don't have the slides open atm.   Not really related to the talk: did you bake the lightmaps inside Maya (with something like a custom mental ray version) or did you use a ray tracer outside of Maya (Embree, OptiX)?   Congratulations to you and your team on the (imo) best-looking game out there.
  9. I'm going through the slides of our MVP MJP about spherical Gaussian lightmaps in The Order: 1886, but I'm having trouble following the talk in a few places without additional commentary, so I'll quickly try to recap what I think is going on:

    Link: https://readyatdawn.sharefile.com/share#/download/s9979ff4b57c4543b

    1) A set of random, uniformly spaced directions is picked for the SGs (e.g. 9). They are hard-coded inside the shader, plus one width value that is shared among all the SGs (27 floats + 1 float constant). The lightmap only stores the color (float3) for each of the 9 directions (27 scalars per lightmap texel).

    Question #1: what's the reasoning behind the golden ratio spiral argument? (Slide 43)

    2) When converting the GGX NDF to 3 SGs: what does the approximation (equation) look like? Assuming it looks like this (leaving out details): A * e^p + B * e^q + C * e^r.

    Question #2: how do you rotate this function to the light direction? Does p = q = r?
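    For my own understanding of the layout in 1), here's a minimal C++ sketch of how I'd expect such a lightmap texel to be evaluated: fixed axes and one shared sharpness hard-coded in the shader, per-texel float3 amplitudes, and the incoming radiance in a direction n reconstructed as a sum of Gaussian lobes. The names and the exact form are my assumptions - the talk additionally folds in the cosine term (and an ASG for specular):

        #include <cmath>
        #include <cstdio>

        struct float3 { float x, y, z; };

        static float dot3(const float3& a, const float3& b)
        {
            return a.x * b.x + a.y * b.y + a.z * b.z;
        }

        // Evaluate a sum of spherical Gaussian lobes in direction n.
        // axes[]      : the 9 hard-coded, shared lobe directions (unit length)
        // sharpness   : the single width value shared by all lobes
        // amplitudes[]: the per-texel float3 colors stored in the lightmap
        static float3 EvaluateSGLightmapTexel(const float3 axes[9],
                                              float sharpness,
                                              const float3 amplitudes[9],
                                              const float3& n)
        {
            float3 result = { 0.0f, 0.0f, 0.0f };
            for (int i = 0; i < 9; ++i)
            {
                // SG basis function: exp(sharpness * (dot(axis, n) - 1))
                float w = std::exp(sharpness * (dot3(axes[i], n) - 1.0f));
                result.x += amplitudes[i].x * w;
                result.y += amplitudes[i].y * w;
                result.z += amplitudes[i].z * w;
            }
            return result;
        }

        int main()
        {
            // Dummy data just to exercise the function
            // (the real axes come from the golden-ratio spiral in the talk).
            float3 axes[9];
            float3 amps[9];
            for (int i = 0; i < 9; ++i)
            {
                axes[i] = { 0.0f, 0.0f, 1.0f };
                amps[i] = { 0.1f, 0.1f, 0.1f };
            }
            float3 n = { 0.0f, 0.0f, 1.0f };
            float3 L = EvaluateSGLightmapTexel(axes, 4.0f, amps, n);
            std::printf("radiance ~ (%f, %f, %f)\n", L.x, L.y, L.z);
            return 0;
        }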
  10. yoshi_lol

    Volumetric Lighting with SDSMs

    Do you mean stopping after the (log-)partitioning and using those partition bounds to render the shadow maps? If so, that's basically what I meant in 2), maybe I phrased that poorly.
  11. Looking for ideas on how to combine Sample Distribution Shadow Maps with ray-marching-based volumetric lighting. I can't just sample the SDSM cascades, since they only contain valid information for the fragments being shaded, not arbitrary points in 3D space like regular CSMs.

    I could only come up with two "fixes" that sound kinda terrible: 1) render another standard shadow map covering the entire frustum, or 2) widen the bounding boxes of the cascades to include the entire 3D space, essentially trading shadow map quality for arbitrary sample locations.

    Tried 1) already and it simply doesn't work well when the near/far distance goes up. I haven't tried limiting the shadow map to the max ray distance yet, though.

    Haven't tried 2) yet. Trading shadow map quality sounds pretty bad, though.
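    For context, this is the kind of loop I mean - a bare-bones ray march from the camera towards the shaded point, accumulating in-scattered light wherever the shadow lookup says the sample is lit. Every step needs a valid shadow lookup at an arbitrary 3D position, which is exactly what tightly fitted SDSM cascades don't guarantee. The vector type and the shadow callback here are placeholders I made up for the sketch:

        #include <cstdio>
        #include <functional>

        struct Vec3 { float x, y, z; };

        static Vec3 Lerp(const Vec3& a, const Vec3& b, float t)
        {
            return { a.x + (b.x - a.x) * t,
                     a.y + (b.y - a.y) * t,
                     a.z + (b.z - a.z) * t };
        }

        // Returns the accumulated (monochrome) in-scattering along the view ray.
        // shadowVisibility: 1.0 if the world-space position is lit, 0.0 if shadowed;
        // with SDSMs this lookup is only reliable for positions that were actually
        // covered by the fitted cascades.
        static float RayMarchScattering(const Vec3& cameraPos,
                                        const Vec3& surfacePos,
                                        int numSteps,
                                        float scatteringPerStep,
                                        const std::function<float(const Vec3&)>& shadowVisibility)
        {
            float inscattered = 0.0f;
            for (int i = 0; i < numSteps; ++i)
            {
                float t = (i + 0.5f) / numSteps;           // sample at step centers
                Vec3 p = Lerp(cameraPos, surfacePos, t);   // arbitrary point inside the view frustum
                inscattered += shadowVisibility(p) * scatteringPerStep;
            }
            return inscattered;
        }

        int main()
        {
            Vec3 cam  = { 0.0f, 2.0f, 0.0f };
            Vec3 surf = { 0.0f, 0.0f, 50.0f };
            // Toy visibility function: everything above y = 1 counts as lit.
            float s = RayMarchScattering(cam, surf, 64, 1.0f / 64.0f,
                                         [](const Vec3& p) { return p.y > 1.0f ? 1.0f : 0.0f; });
            std::printf("in-scattering ~ %f\n", s);
            return 0;
        }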
  12.   I think this point in particular is extremely important and should be emphasized more - not necessarily here, but in general. Exceptions are not meant to replace a set of "unusual" return values like some Java exceptions would like you to believe. TimeoutException for example does not sound like an exception to me. Not being able to open a configuration file on the other hand does. If you think that your application is in a fubar state then exceptions can be a perfectly good way to signal that while keeping in mind all the details that SmkViper already gave.
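    A minimal C++ illustration of that distinction (my own example, not from SmkViper's post): failing to open a required config file leaves the application in an unusable state, so throwing is reasonable, whereas a timeout is an expected outcome that a return value expresses just fine:

        #include <cstdio>
        #include <fstream>
        #include <iterator>
        #include <stdexcept>
        #include <string>

        // Exceptional: without its configuration the application cannot run at all.
        std::string LoadConfigOrThrow(const std::string& path)
        {
            std::ifstream file(path);
            if (!file)
                throw std::runtime_error("failed to open config file: " + path);
            return std::string(std::istreambuf_iterator<char>(file),
                               std::istreambuf_iterator<char>());
        }

        // Not exceptional: a timeout is a normal, expected outcome - report it in the return value.
        enum class WaitResult { Ready, TimedOut };

        WaitResult WaitForServer(int timeoutMs)
        {
            // ... poll the connection for up to timeoutMs ...
            (void)timeoutMs;
            return WaitResult::TimedOut;
        }

        int main()
        {
            try
            {
                std::string config = LoadConfigOrThrow("settings.cfg");
                std::printf("config loaded (%zu bytes)\n", config.size());
            }
            catch (const std::exception& e)
            {
                std::printf("fatal: %s\n", e.what());
                return 1;
            }
            return 0;
        }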
  13. yoshi_lol

    orthonormal rotation and a quaternion bijection?

    Well, you need to look at the trace of the matrix when going from matrix to quaternion, and there's an invsqrt in there. If that equals zero then you are in big doo-doo-land of course, but it can also be near-zero and become unstable (because of floating-point errors) so you can do what JMP van Waveren did for Doom 3 and look for the largest component divisor first (click).
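    For reference, here's the plain branching version of that "check the diagonal / pick the largest component first" idea in C++ (a straightforward textbook variant, not Waveren's branch-free SIMD code). The matrix is assumed to be a pure rotation stored as m[row][column]:

        #include <cmath>
        #include <cstdio>

        struct Quat { float w, x, y, z; };

        // Convert a 3x3 rotation matrix (m[row][col]) to a quaternion by first
        // selecting the largest of w, x, y, z so the divisor stays well away from zero.
        Quat MatrixToQuaternion(const float m[3][3])
        {
            Quat q;
            float trace = m[0][0] + m[1][1] + m[2][2];
            if (trace > 0.0f)
            {
                float s = std::sqrt(trace + 1.0f) * 2.0f;                         // s = 4 * w
                q.w = 0.25f * s;
                q.x = (m[2][1] - m[1][2]) / s;
                q.y = (m[0][2] - m[2][0]) / s;
                q.z = (m[1][0] - m[0][1]) / s;
            }
            else if (m[0][0] > m[1][1] && m[0][0] > m[2][2])
            {
                float s = std::sqrt(1.0f + m[0][0] - m[1][1] - m[2][2]) * 2.0f;   // s = 4 * x
                q.w = (m[2][1] - m[1][2]) / s;
                q.x = 0.25f * s;
                q.y = (m[0][1] + m[1][0]) / s;
                q.z = (m[0][2] + m[2][0]) / s;
            }
            else if (m[1][1] > m[2][2])
            {
                float s = std::sqrt(1.0f + m[1][1] - m[0][0] - m[2][2]) * 2.0f;   // s = 4 * y
                q.w = (m[0][2] - m[2][0]) / s;
                q.x = (m[0][1] + m[1][0]) / s;
                q.y = 0.25f * s;
                q.z = (m[1][2] + m[2][1]) / s;
            }
            else
            {
                float s = std::sqrt(1.0f + m[2][2] - m[0][0] - m[1][1]) * 2.0f;   // s = 4 * z
                q.w = (m[1][0] - m[0][1]) / s;
                q.x = (m[0][2] + m[2][0]) / s;
                q.y = (m[1][2] + m[2][1]) / s;
                q.z = 0.25f * s;
            }
            return q;
        }

        int main()
        {
            // Identity rotation should give (w, x, y, z) = (1, 0, 0, 0).
            float identity[3][3] = { { 1, 0, 0 }, { 0, 1, 0 }, { 0, 0, 1 } };
            Quat q = MatrixToQuaternion(identity);
            std::printf("w=%f x=%f y=%f z=%f\n", q.w, q.x, q.y, q.z);
            return 0;
        }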
  14. Basically two options: measure the time for a given task yourself, or use external profilers / tools like Intel VTune, which can help you find hotspots and also show you CPU counters that can guide you in optimizing your code.   You can use the C++11 std::chrono library to measure time, or use an OS function like QueryPerformanceCounter directly.   The two methods are not mutually exclusive. You can (and should) do both.
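    A small std::chrono example of the manual-timing route (the workload here is just a dummy loop to have something to measure):

        #include <chrono>
        #include <cstdio>

        int main()
        {
            using Clock = std::chrono::steady_clock;   // monotonic clock, good for measuring intervals

            auto start = Clock::now();

            // Dummy workload - replace with the task you want to measure.
            volatile double sum = 0.0;
            for (int i = 0; i < 1000000; ++i)
                sum = sum + i * 0.5;

            auto end = Clock::now();
            auto micros = std::chrono::duration_cast<std::chrono::microseconds>(end - start).count();
            std::printf("task took %lld us\n", static_cast<long long>(micros));
            return 0;
        }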