Koobazaur

DX9 Uneven lighting


I am working with the fixed-function DirectX 9 pipeline, and the vertex lighting is producing some really shoddy results. That is, where one brush ends and another begins, the two surfaces are lit completely differently, giving some pretty bad-looking results. Now, I understand why this happens (for instance, in the picture, the floor in the hallway is narrower than in the room, so its vertices are closer to the light source and end up brighter), but is there any way around this? Thanks.

Quote:
Original post by perkbrian
try Diffuse lighting using shaders...


I have not yet learned how to use shaders and I wanted to hold off for a bit longer; is there any way I can fix this problem without resorting to them?

The link doesn't work.

Anyway, you could always use a flat normal map and DOT3 bump mapping, which will give you per-pixel lighting. With the added bonus of being able to use bump maps as well, of course ;)
No idea how you would go about transforming the light vector into tangent space without shaders, but I'm sure it can be done (hopefully without one draw-call per triangle).
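Roughly, the fixed-function DOT3 setup looks like the sketch below (untested here; device, normalMap and baseTexture are placeholder names). Stage 0 dots the normal map against a light vector packed into the vertex colour, and stage 1 modulates the result with the base texture. A flat normal map is simply every texel set to RGB(128, 128, 255), i.e. a normal pointing straight out of the surface.

#include <d3d9.h>

void SetupDot3(IDirect3DDevice9* device,
               IDirect3DTexture9* normalMap,    // flat map: every texel RGB(128,128,255)
               IDirect3DTexture9* baseTexture)
{
    // Stage 0: per-pixel N.L via DOTPRODUCT3 between the normal map texel and the
    // light vector stored in the vertex diffuse colour (biased into 0..255).
    device->SetTexture(0, normalMap);
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: multiply the lighting term with the ordinary base texture.
    device->SetTexture(1, baseTexture);
    device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    // No further stages.
    device->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
}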

Oh, I just remembered what I used to do. I calculated the light vector for each vertex, transformed it into texture space, and stored it in the vertices.
As long as the light source doesn't move, you can simply transform the stored vector the same way as the normal. When the light source moves, you have to recalculate it.
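In code it was something along these lines. Just a sketch; the vertex layout and field names are assumptions, the important bits are the three dot products against the tangent frame and the bias into a colour:

#include <d3dx9.h>

// Hypothetical vertex format with a per-vertex tangent frame and a colour slot
// for the packed tangent-space light vector.
struct BumpVertex
{
    D3DXVECTOR3 pos;
    D3DXVECTOR3 normal;
    D3DXVECTOR3 tangent;
    D3DXVECTOR3 binormal;
    DWORD       colour;   // packed tangent-space light vector
    float       u, v;
};

void StoreLightVectors(BumpVertex* verts, int count, const D3DXVECTOR3& lightPos)
{
    for (int i = 0; i < count; ++i)
    {
        // Vector from the vertex towards the light, in object space.
        D3DXVECTOR3 L = lightPos - verts[i].pos;
        D3DXVec3Normalize(&L, &L);

        // Project onto the tangent frame to move it into texture (tangent) space.
        D3DXVECTOR3 tsL(D3DXVec3Dot(&L, &verts[i].tangent),
                        D3DXVec3Dot(&L, &verts[i].binormal),
                        D3DXVec3Dot(&L, &verts[i].normal));
        D3DXVec3Normalize(&tsL, &tsL);

        // Bias from [-1,1] into [0,255] so DOTPRODUCT3 can unpack it later.
        verts[i].colour = D3DCOLOR_XRGB((BYTE)((tsL.x * 0.5f + 0.5f) * 255.0f),
                                        (BYTE)((tsL.y * 0.5f + 0.5f) * 255.0f),
                                        (BYTE)((tsL.z * 0.5f + 0.5f) * 255.0f));
    }
}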

I was looking into this and it looks like a very useful technique. Could you, by any chance, point me to some tutorials on bump mapping/DOT3? I was googling it and all I could find was poorly documented source code for a program at code-sampler. I could go ahead and steal that, but I'd rather know how the thing actually works :)

If you check the papers on Nvidia's developer pages, there are tons :)

I would really encourage you to get started with shaders, though. Using HLSL and the D3DX Effect interface it's quite easy, and it lets you do things like this MUCH more easily.
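For what it's worth, the C++ side is only a handful of calls. A rough sketch — "lighting.fx", "PerPixelDiffuse" and "WorldViewProj" are made-up names that would come from whatever .fx file you write, device/worldViewProj/mesh stand in for what you already have, and error checking is left out:

#include <d3dx9.h>

ID3DXEffect* effect = NULL;
ID3DXBuffer* errors = NULL;
// Compile the effect file; on failure, errors holds the compiler messages.
D3DXCreateEffectFromFile(device, "lighting.fx", NULL, NULL, 0, NULL, &effect, &errors);

effect->SetTechnique("PerPixelDiffuse");
effect->SetMatrix("WorldViewProj", &worldViewProj);   // parameter defined in the .fx file

UINT passes = 0;
effect->Begin(&passes, 0);
for (UINT i = 0; i < passes; ++i)
{
    effect->BeginPass(i);
    mesh->DrawSubset(0);      // draw whatever geometry you like inside the pass
    effect->EndPass();
}
effect->End();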

Issues with shaders aside, it looks to me like you simply aren't calculating your vertex normals correctly.

Quote:
Original post by gharen2
Issues with shaders aside, it looks to me like you simply aren't calculating your vertex normals correctly.


well, I am using D3DXComputeNormals(mesh, 0)... maybe it screws up due to how the mesh is structured?

Quote:
Original post by Koobazaur
well, I am using D3DXComputeNormals(mesh, 0)... maybe it screws up due to how the mesh is structured?


That's quite possible. Vertex normals are derived from the normals of the faces they're part of, and to get the normal of a face, you make assumptions about the order of the vertices that make it up. If the order is wrong, the normal is reversed, and hence you get what you're seeing: faces that should be dark are light, and vice versa.

Assuming it is a problem with the structure of the mesh, the first place I'd look is the exporter you're using to create the X file.
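To make the winding point concrete, here's a sketch (not from the thread) of how a face normal is built; D3DXComputeNormals does essentially this per face and then averages the results into the vertex normals:

#include <d3dx9.h>

// The face normal is the cross product of two edges, so swapping any two vertices
// of the triangle reverses it: FaceNormal(v0, v2, v1) == -FaceNormal(v0, v1, v2).
D3DXVECTOR3 FaceNormal(const D3DXVECTOR3& v0, const D3DXVECTOR3& v1, const D3DXVECTOR3& v2)
{
    D3DXVECTOR3 e1 = v1 - v0;
    D3DXVECTOR3 e2 = v2 - v0;
    D3DXVECTOR3 n;
    D3DXVec3Cross(&n, &e1, &e2);
    D3DXVec3Normalize(&n, &n);
    return n;
}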

Quote:
Original post by gharen2
Quote:
Original post by Koobazaur
well, I am using D3DXComputeNormals(mesh, 0)... maybe it screws up due to how the mesh is structured?


That's quite possible. Vertex normals are derived from the normals of the faces they're part of, and to get the normal of a face, you make assumptions about the order of the vertices that make it up. If the order is wrong, the normal is reversed, and hence you get what you're seeing: faces that should be dark are light, and vice versa.

Assuming it is a problem with the structure of the mesh, the first place I'd look is the exporter you're using to create the X file.


I am using DeleD. However, upon further examination, I don't think that is the problem. If a surface is displayed dark and I put the light closer to its vertex, it gets lighter, so the normal cannot be reversed. I think the problem stems from the fact that, for surfaces touching each other, the vertices are at different distances from the camera and, thus, one is lighter and the other darker. Put them together and you get a visible contrast.

Quote:
Original post by Koobazaur
the vertices are at different distances from the camera and, thus, one is lighter and the other darker.


Hmm, I'm not sure I understand. The distance of the vertex from the camera shouldn't affect how it's lit (unless that's a typo and you meant light, not camera).

But hey, if you fixed it, good going :)

Actually, it wasn't a typo... see, in my program, I set the light to the camera, so camera = light :P

But no, I haven't quite fixed it... can this problem be easily overcome when I move to shaders?

I'm no expert on shader based normal mapping, so I'll have to defer to someone more experienced.

I can tell you that it'll make things significantly more complicated though. You'll have to create (or obtain) normal maps for each texture, and that's no simple matter.

edit - I just had a thought: try setting the D3DRS_NORMALIZENORMALS render state to true (SetRenderState(D3DRS_NORMALIZENORMALS, TRUE);)
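The reason that can help: if your world matrix contains any scaling, the transformed normals are no longer unit length, and the N·L term comes out too bright or too dark. Something like this (device being whatever you call your IDirect3DDevice9):

device->SetRenderState(D3DRS_LIGHTING, TRUE);
device->SetRenderState(D3DRS_NORMALIZENORMALS, TRUE);   // renormalise normals after the world transform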

First of all, the normal map can be the same for every surface, namely a single flat normal map with the normal pointing straight out.
Second, collapse the vertices so they share their normals. If the normals point in different directions you won't get consistent lighting.
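If you don't want to merge the vertices themselves, averaging the normals of vertices that sit at the same position gives the same effect. A brute-force sketch (the epsilon and the parallel position/normal arrays are assumptions about your data):

#include <d3dx9.h>
#include <vector>

void AverageCoincidentNormals(const D3DXVECTOR3* positions, D3DXVECTOR3* normals,
                              int count, float eps = 0.001f)
{
    std::vector<D3DXVECTOR3> averaged(count);
    for (int i = 0; i < count; ++i)
    {
        D3DXVECTOR3 sum = normals[i];
        for (int j = 0; j < count; ++j)
        {
            if (j == i) continue;
            D3DXVECTOR3 d = positions[j] - positions[i];
            if (D3DXVec3LengthSq(&d) < eps * eps)   // same position -> treat as one vertex
                sum += normals[j];
        }
        D3DXVec3Normalize(&averaged[i], &sum);
    }
    for (int i = 0; i < count; ++i)                 // write back only after all sums are done
        normals[i] = averaged[i];
}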

It looks to me like this isn't a problem with how you're doing your lighting (although using shaders would look far better); it's a problem with the model.

Standard DX lighting is only calculated per vertex, with the lighting for each vertex determined by its distance to, and angle from, the light. Consider the terrible ASCII art below:


|-----------|
|           |
|           |
|     *     |
|           |
|           |
|-----------|


Where the star is the light and the box is your polygon. Now, in reality you'd expect the center of the polygon to be bright and the edges to be dark. However, since it's only lit per vertex, and each vertex is the same distance away from the light, the whole polygon ends up looking flat-shaded, since every vertex gets an identical lighting value.
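Per vertex, the fixed pipeline computes roughly the following for a point light, and that single value is then interpolated across the face. A sketch of the documented formula, ignoring ambient, specular and the light/material colours:

#include <d3dx9.h>

float VertexDiffuse(const D3DXVECTOR3& vertexPos, const D3DXVECTOR3& vertexNormal,
                    const D3DLIGHT9& light)
{
    D3DXVECTOR3 toLight = D3DXVECTOR3(light.Position) - vertexPos;
    float d = D3DXVec3Length(&toLight);
    if (d > light.Range)
        return 0.0f;                                   // outside the light's range
    D3DXVec3Normalize(&toLight, &toLight);

    float nDotL = D3DXVec3Dot(&vertexNormal, &toLight);
    if (nDotL < 0.0f)
        nDotL = 0.0f;                                  // facing away from the light

    float atten = 1.0f / (light.Attenuation0 +
                          light.Attenuation1 * d +
                          light.Attenuation2 * d * d);
    return nDotL * atten;                              // then scaled by the light/material colours
}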

Now, what you're coming up against here is what's called a T-intersection. This is bad because the point where the two polygons join doesn't have a vertex on both faces, so the lighting won't match up. What you have currently is this:


|-----|-----|
|\    | \   |
| \   |  \  |
|  \  |-----|
|   \ |
|    \|  ^ Notice that the left polygon doesn't have a vertex where this corner is
|-----|


So when DX is getting the distance from each vertex to each light, it's not taking the center-right of the first polygon into account, and thus the lighting won't match up. To fix this, you'll have to add a vertex to the first polygon, like this:


|-----|-----|
|\    | \   |
|  \  |  \  |
|    \|-----|
|   / |
| /   |  ^ Now both faces share the same vertex, and will share the same lighting values
|-----|


Or an even cleaner solution:


|-----|-----|
| \   |  \  |
|  \  |   \ |
|-----|-----|
|   / |
| /   |
|-----|
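Doing this in the level editor is usually easiest, but if you'd rather patch it up in code, the index-buffer surgery boils down to something like the sketch below. It's a rough illustration, not from any library: it assumes a 32-bit index buffer, that you've already appended a vertex at the junction point, and that you know which triangle's edge it sits on.

#include <d3d9.h>
#include <vector>

// Split triangle 'tri' along its edge (i0, i1) at the already-added vertex 'newIndex'.
// (i0, i1, i2) becomes (i0, newIndex, i2) plus (newIndex, i1, i2), preserving the winding.
void SplitTriangleAtEdgePoint(std::vector<DWORD>& indices, size_t tri,
                              DWORD i0, DWORD i1, DWORD newIndex)
{
    DWORD a = indices[tri * 3 + 0];
    DWORD b = indices[tri * 3 + 1];
    DWORD c = indices[tri * 3 + 2];

    // Rotate the triangle until the edge being split is (a, b); give up if it never is.
    for (int r = 0; r < 3 && !(a == i0 && b == i1); ++r)
    {
        DWORD t = a; a = b; b = c; c = t;
    }
    if (!(a == i0 && b == i1))
        return;

    // First half overwrites the original triangle, second half is appended.
    indices[tri * 3 + 0] = a;
    indices[tri * 3 + 1] = newIndex;
    indices[tri * 3 + 2] = c;
    indices.push_back(newIndex);
    indices.push_back(b);
    indices.push_back(c);
}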

