
n00body

Member Since 20 Oct 2006

Topics I've Started

[SOLVED] Area Light, Representative Point, Blinn-Phong

26 August 2014 - 08:10 PM

Background

Okay, so I have seen Epic's paper that covers, among other things, how they did area lights in UE4 using the "representative point" method:
http://blog.selfshadow.com/publications/s2013-shading-course/karis/s2013_pbs_epic_notes_v2.pdf

 

Problem

They specifically talk about how they needed to modify the normalization factor of their BRDF to make the intensity look close to correct. However, the factor they list is for GGX, and I am using Normalized Blinn-Phong. Sadly, I don't understand the math behind the normalization.
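
For concreteness, here is my rough attempt at adapting their trick to Blinn-Phong, which may well be wrong. The alpha-to-power conversion and the final ratio are my guesses, not anything from the paper, and the exact widening term should be checked against the notes:

```c
#include <math.h>

/* Sketch of the representative-point normalization, adapted from the GGX
 * version in the Karis notes to Normalized Blinn-Phong. The alpha <-> power
 * mapping and the final ratio are my assumptions; the exact widening term
 * should be taken from the paper itself. */
float sphere_normalization_blinn_phong(float spec_power,
                                       float source_radius,
                                       float distance)
{
    /* Usual Beckmann-style mapping: n = 2/a^2 - 2  =>  a = sqrt(2/(n + 2)) */
    float alpha = sqrtf(2.0f / (spec_power + 2.0f));

    /* Widen the lobe by the light's subtended angle (the notes use a term
     * on the order of sourceRadius / distance), clamped to fully rough. */
    float alpha_wide = fminf(alpha + source_radius / (2.0f * distance), 1.0f);

    /* Convert the widened alpha back to a specular power. */
    float power_wide = 2.0f / (alpha_wide * alpha_wide) - 2.0f;

    /* Ratio of the Blinn-Phong normalization constants (n + 2) / (2*pi),
     * analogous to Epic's (alpha / alpha')^2 for GGX; always <= 1, so it
     * dims the peak as the lobe widens. */
    return (power_wide + 2.0f) / (spec_power + 2.0f);
}
```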

 

Question

Is there a nice, simple way to find the normalization factor needed to use Normalized Blinn-Phong with this area-light trick?

 

Thanks. 


Toksvig AA and normal map precision issues

23 January 2014 - 08:23 PM

Background

I am working on a physically based shading system in Unity. Among the many components this system needs is a way to compensate for specular aliasing. Our chosen method involves calculating the Toksvig factor from our normal maps, pre-correcting our specular powers, and storing the result for use at runtime.
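
For context, the pre-correction step is essentially the standard Toksvig computation; a minimal sketch of what we bake offline (function and parameter names are mine, not Unity API):

```c
#include <math.h>

/* Offline pre-correction: compute the Toksvig factor from the (possibly
 * shortened) filtered normal and bake the corrected specular power. */
float toksvig_corrected_power(float nx, float ny, float nz, float spec_power)
{
    /* |Na| <= 1 after filtering; exactly 1 only for a perfectly flat,
     * perfectly unit-length normal. */
    float len = sqrtf(nx * nx + ny * ny + nz * nz);

    /* Toksvig factor: 1 for a unit normal, falling toward 0 as the
     * averaged normal shortens (i.e. as variance increases). */
    float ft = len / (len + spec_power * (1.0f - len));

    return ft * spec_power;  /* pre-corrected power stored for runtime */
}
```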

 

Problem

We are dealing with ARGB32-precision normal maps as inputs, and some of our artists' tools don't quite output unit-length normals at this precision. Unfortunately, this has dramatic consequences when it feeds into our pre-correction of specular powers, even for "flat" normals: materials that are supposed to be smooth end up dramatically rougher.
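
To put illustrative numbers on it: if quantization leaves |Na| = 0.999 instead of 1, then for a specular power of 1024 the Toksvig factor is 0.999 / (0.999 + 1024 * 0.001) ≈ 0.49, so the baked power drops to roughly 506. A mere 0.1% length error on a supposedly flat normal halves the highlight sharpness.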

 

Questions

1.) Is there a way to deal with this besides just using higher precision storage for our normal maps? 

2.) One tactic we are trying is to offer artists a bias factor that adjusts the calculated variance down slightly to compensate. Is this a valid solution, or will it break horribly down the line?

3.) Is there some other trick I could try that I have overlooked?

 

Thanks for any help you can provide.


[SOLVED] HDR PBR, exceeded FP16 range

09 January 2014 - 10:00 PM

Background

I've been developing a physically-based shader framework for Unity3D. I am using the Normalized Blinn-Phong NDF and an approximation to the Schlick visibility function, with an effective specular power range of [4, 8192] for direct illumination. I have also developed a translucent shader that uses premultiplied alpha to make only the diffuse term translucent, while preserving the specular intensity based on Fresnel.
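
For reference, the premultiplied-alpha trick looks roughly like this; this is a sketch of my reconstruction rather than the exact shader, and the vector types are stand-ins for the shader equivalents:

```c
typedef struct { float r, g, b; }    float3;
typedef struct { float r, g, b, a; } float4;

/* With the blend mode framebuffer = src * 1 + dst * (1 - src.a), alpha only
 * attenuates what is behind the surface. Pre-multiplying just the diffuse
 * term by alpha therefore fades the diffuse while the Fresnel-weighted
 * specular is always added at full strength. */
float4 translucent_output(float3 diffuse, float3 specular, float alpha)
{
    float4 o;
    o.r = diffuse.r * alpha + specular.r;
    o.g = diffuse.g * alpha + specular.g;
    o.b = diffuse.b * alpha + specular.b;
    o.a = alpha;  /* consumed by the One / OneMinusSrcAlpha blend */
    return o;
}
```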

 

For all my testing, I am doing everything in Linear HDR mode, which gives me an FP16 render target for my camera.

 

Situation

So this is a highly contrived scenario, but my team's artist managed to make it happen. He has a scene with a directional light whose intensity is effectively 1.0 (0.5 input in Unity) shining on a glass bottle surrounding a smooth metallic liquid. The two substances' highlights overlapped, and their combined intensity seems to have exceeded the range of the FP16 render target. This resulted in weird artifacts where the highest-intensity color component went to black while the other two just looked really bright (see the example image below).

 
Attached File: ExceededPrescisionOfHDR.jpg (97.5 KB)

 

Upon further testing, I found I could remove the artifact by making the surface rougher, thus reducing the intensity of the highlight. However, the visual error still appeared even for relatively rough overlapping materials.

 

Questions

1.) Is there any way to prevent this programmatically, without clamping the light values to an upper limit or otherwise harming the visual quality?

2.) Is it just something that falls to the artist to avoid doing?

3.) Even so, this means I either can't have multiple overlapping translucent objects or have to be careful about what passes behind them. Am I missing something here?

4.) Just for future reference, what is the actual upper limit value of FP16?

 

Thanks for any help you can provide.


[SOLVED] For Physical BRDFs, why use such high specular powers (e.g. 8192)?

20 December 2013 - 06:02 PM

Background

So I have been looking into physically-based BRDFs for some time now and have observed a perplexing trend. The companies that publish papers always seem to use ridiculously high specular powers (e.g., Treyarch in COD: Black Ops uses 8192 for lights and 2048 for environment maps) with a Normalized Blinn-Phong NDF. This seems strange to me, especially since I have read that such powers correspond to extremely rare, super-polished mirror surfaces. It also means they have to store a high-resolution mip level in their prefiltered environment maps that takes up a lot of space but almost never gets used.
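
Part of my storage worry comes from the glossiness-to-mip mapping. Assuming the common convention that each mip doubles the filter footprint, so the representable power drops roughly 4x per level (the exact mapping varies by engine, and this one is my assumption):

```c
#include <math.h>

/* Assumed mapping: specular power falls ~4x per mip level, so
 * mip = 0.5 * log2(n_max / n). With n_max = 2048, a power of 256 already
 * lands at mip 1.5, meaning the full-resolution mip 0 is reserved for the
 * rare near-mirror materials. */
float power_to_mip(float spec_power, float max_power)
{
    return 0.5f * log2f(max_power / spec_power);
}
```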

 

Up until now, I have usually settled for a maximum specular power of 1024 for my analytical lights and 256 for my environment maps. For bloom, I have been using a threshold of 1.0 with additive blending (what Unity provides).
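
As far as I can tell, that bloom is a standard bright-pass plus additive blend, roughly like the sketch below (names are mine; I have not verified Unity's exact implementation). Note that with a threshold of 1.0, an HDR highlight of intensity 50 feeds 49 units into the blur, which is presumably why tight, intense highlights flare so hard:

```c
#include <math.h>

/* Bright-pass extraction: everything above the threshold is blurred and
 * then added back on top of the scene. */
float bright_pass(float c, float threshold)
{
    return fmaxf(c - threshold, 0.0f);
}
```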

 

Problem

Whenever I test these higher specular powers myself, the resulting specular highlight intensity shoots up so high that my bloom goes crazy. In particular, as my camera moves, I get very distracting flashes of bright spots on sharp model edges that appear and disappear in a very annoying way.

 

Questions

1.) What is the incentive for using such high powers?

2.) Am I doing my bloom threshold wrong, and should I move it up?

3.) Is this a consequence of the Bloom blending mode?

 

Any help would be most appreciated.


Order of post-processing passes in pipeline?

20 February 2011 - 10:41 AM

Background:
For my rendering pipeline, I am using encoded RGBA8 buffers wherever possible. Currently, I am considering this order for the passes:
  • G-Buffer prepass
  • Light accumulation
  • SSAO
  • Material (output motion vectors with MRT)
  • Refractive (written to a separate buffer, then overwrites material buffer)
  • EdgeAA
  • Depth of Field
  • Motion Blur
  • Translucency (accumulated in a separate buffer, applies its own fog)
  • Fog (also applies translucency buffer)
  • Generate Bloom
  • Apply Bloom, Tonemapping, & Color Correction

Problems:

Right now, I have concerns about ordering post-passes that involve blurring (Motion Blur, Depth of Field, EdgeAA, etc.) or that overwrite other parts of the buffer (volume lighting, bloom, translucency, etc.):
  • Blur passes introducing artifacts into each other
    • ex. Motion Blur then Depth of Field blurring the streaks
  • Overwrite passes happening before blur passes
    • ex. Translucency incorrectly blurred
    • ex. Fog blurring artifacts with EdgeAA
  • Overwrite passes happening after blurring passes
    • ex. Translucency being sharp on blurred background
Questions:
  • Can you comment on my proposed ordering of passes? What potential problems or artifacts might it introduce?
  • Can you offer your own proposed ordering, along with any potential problems or artifacts it might introduce?
Any insights from prior work would be most appreciated. Thanks. ;)
