Hodgman

Member Since 14 Feb 2007
Offline Last Active Today, 06:49 AM

Posts I've Made

In Topic: Default light ambient, diffuse and specular values?

Today, 06:43 AM

There's no answer to this; it's a complete hack for artists to play with.

It makes no physical sense to begin with. The ambient value on a light source is the amount that the light affects every object everywhere from every direction - Jesus photons. The diffuse/specular light scales say how bright the light is for refractions/reflections respectively - as if you could emit a photon, wait to see whether its first event is a refraction (diffuse) or a reflection (specular), and then change the intensity of your light source after the fact.
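For reference, a minimal sketch of that legacy per-light model, in the style of old fixed-function GL/D3D lighting - all type and helper names here are illustrative, not from any particular engine:

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 mul(Vec3 a, Vec3 b)    { return { a.x*b.x, a.y*b.y, a.z*b.z }; }
static Vec3 add(Vec3 a, Vec3 b)    { return { a.x+b.x, a.y+b.y, a.z+b.z }; }
static Vec3 scale(Vec3 a, float s) { return { a.x*s, a.y*s, a.z*s }; }

struct Light    { Vec3 ambient, diffuse, specular; };
struct Material { Vec3 ambient, diffuse, specular; float shininess; };

// nDotL/nDotH are the usual Lambert and Blinn terms, computed elsewhere.
Vec3 shadeOneLight(const Light& l, const Material& m, float nDotL, float nDotH)
{
    Vec3 ambient  = mul(l.ambient, m.ambient); // affects everything, everywhere
    Vec3 diffuse  = scale(mul(l.diffuse,  m.diffuse),  std::max(nDotL, 0.0f));
    Vec3 specular = scale(mul(l.specular, m.specular),
                          std::pow(std::max(nDotH, 0.0f), m.shininess));
    return add(add(ambient, diffuse), specular);
}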

If you're trying to emulate another program that uses this completely fictional lighting model, then the answer is - the same values that the artist was using in that program.
If you don't know, my advice would be for the per-light ambient to be very low or zero, and the per-light diffuse/specular values to be equal.

If you're not trying to emulate another program, then you're free to choose a more sensible lighting model. In such cases, art is typically made specifically for a particular game, and artists will preview their work within that game to tweak the light/material values appropriately.

If you just want to see the shapes of models clearly, I'd try the also-completely-fake half-Lambert diffuse model, with 2 light sources of contrasting colours - e.g. a pink and a teal directional light coming from top-left and top-right. The half-Lambert model ensures the gradients wrap all the way to the back, avoiding the flat look that plain ambient gives you.
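A rough sketch of that preview setup - the vector helpers and light colours are my own illustrative choices. Half-Lambert just remaps N.L from [-1,1] to [0,1], which is what lets the gradient wrap to the back:

#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  normalize(Vec3 v)
{
    float k = 1.0f / std::sqrt(dot(v, v));
    return { v.x*k, v.y*k, v.z*k };
}

// Half-Lambert: remap N.L from [-1,1] to [0,1]. Squaring is the Valve
// variant; plain 't' also works.
static float halfLambert(Vec3 n, Vec3 l)
{
    float t = dot(n, l) * 0.5f + 0.5f;
    return t * t;
}

// Two contrasting directional lights: pink from top-left, teal from top-right.
Vec3 previewShade(Vec3 n)
{
    const Vec3 pink { 1.0f, 0.5f, 0.7f }, teal { 0.3f, 0.9f, 0.9f };
    const Vec3 lA = normalize({ -1.0f, 1.0f, 0.0f });
    const Vec3 lB = normalize({  1.0f, 1.0f, 0.0f });
    float a = halfLambert(n, lA), b = halfLambert(n, lB);
    return { pink.x*a + teal.x*b, pink.y*a + teal.y*b, pink.z*a + teal.z*b };
}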

More physically based lighting models do not have separate ambient/diffuse/specular ratios per light because, as above, it's nonsense; they just have a single colour/intensity per light, and then the interesting ratios are part of the materials.

In Topic: DirectX to OpenGL matrix issue

Today, 06:30 AM

Off topic from your actual problem, but there's one slight difference in D3D/GL matrices - D3D's final NDC z coords range from 0 to 1, and GL's from -1 to 1.
So without any changes, your game will end up wasting 50% of your depth buffer precision in the GL version. To fix this, you just need to modify the projection matrix to scale in z by 2 and offset by -1 so you're projecting into the full z range.
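As a sketch - assuming a column-vector convention (clip = P * view) and a row-indexed float[4][4]; adapt to your own math library - the adjustment is just rewriting the z row so that clip.z becomes 2*z - w, which after the perspective divide maps D3D's [0,1] onto GL's [-1,1]:

void d3dToGlProjection(float p[4][4])
{
    for (int c = 0; c < 4; ++c)
        p[2][c] = 2.0f * p[2][c] - p[3][c]; // zRow = 2*zRow - wRow
}
// The opposite direction (GL-style matrix on D3D) is zRow = 0.5*(zRow + wRow).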

If you're using a GL-oriented math library, it will construct its projection matrices like this by default, so you'd have to make the opposite scale/bias z adjustments to get D3D to work (the error would be a misbehaving near-clip plane, appearing too far out).

In Topic: Bad code or usefull ...

Today, 06:14 AM

I learned this trick from Washu ... That is a bit verbose and tricky to read, but other than that I think it has all the nice features one would want, including being correct according to the C++ standard.

That's really cute, but I'd hate to see the code that's produced in the hypothetical situation where the original implementation-defined approach isn't valid and/or the optimizer doesn't realize the equivalence...

I would however add a static assert to be immediately notified about the need for intervention, probably something like this: static_assert(sizeof(Vec3) == 3 * sizeof(float), "unexpected packing");

Definitely. When you're making assumptions, they need to be documented with assertions. The OP's snippet is implementation-defined, so you're making assumptions about your compiler. You need to statically assert that &y==&x+1 && &z==&x+2 (or the sizeof one above is probably good enough), and at runtime assert that index>=0 && index<3.
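To make that concrete, here's a hedged sketch, using offsetof as the practical stand-in for the &y==&x+1 style checks (pointer comparisons like that can't go directly in a static_assert):

#include <cassert>
#include <cstddef>

struct Vec3
{
    float x, y, z;

    float& operator[](int index)
    {
        // Compile-time: verify this compiler packs x/y/z contiguously.
        static_assert(sizeof(Vec3) == 3 * sizeof(float), "unexpected packing");
        static_assert(offsetof(Vec3, y) == 1 * sizeof(float) &&
                      offsetof(Vec3, z) == 2 * sizeof(float), "unexpected layout");
        // Run-time: verify the index is in range.
        assert(index >= 0 && index < 3);
        return (&x)[index]; // still implementation-defined, but now guarded
    }
};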

In Topic: Bad code or usefull ...

Today, 06:07 AM

On the dozen-ish console games that I've worked on, they've all gone beyond the simple Debug/Release dichotomy.
The most common extension is:
• Debug -- all validation options, no optimization. Probably does not run at target framerate. Extra developer features such as a TCP/IP file-system, memory allocation tracking, intrusive profiler, reloading of assets/code/etc.
• Release/Dev -- as above, but optimized. Almost runs at target framerate.
• Shipping/Retail -- all error handling for logic errors is stripped, all developer features stripped, fully optimized (LTCG/etc). Runs at target framerate or above.

There's often more, such as Retail with logging+assertions enabled for QA, or Dev minus logging+assertions for profiling, etc...
Point is that every frame there are thousands of error checks done to ensure the code is valid. Every assumption is tested thoroughly and repeatedly, by every dev working on the game and every QA person trying to break it.

When it comes to the gold master build, you KNOW that all of those assertions are no longer necessary, so it's fine to strip them. Furthermore, they're just error-detection, not error-handling, so there's no real use in shipping them anyway 8-)
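As an illustration only - the macro and function names here are hypothetical, not from any particular engine - the stripping is usually just conditional compilation:

void HandleAssertFailure(const char* expr, const char* file, int line); // hypothetical reporting hook

#if defined(BUILD_DEBUG) || defined(BUILD_DEV)
    // Debug + Release/Dev: detect the error, report it, break into the debugger.
    #define GAME_ASSERT(expr) \
        do { if (!(expr)) HandleAssertFailure(#expr, __FILE__, __LINE__); } while (0)
#else
    // Shipping/Retail: error *detection* is stripped entirely.
    #define GAME_ASSERT(expr) ((void)0)
#endif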

In Topic: Good reading material on different BDRF's

Yesterday, 07:04 AM

Maybe - http://digibug.ugr.es/bitstream/10481/19751/1/rmontes_LSI-2012-001TR.pdf

