[D3D9/HLSL] Simple diffuse light shader...

Started by
25 comments, last by Buckeye 14 years, 8 months ago
Looks like a problem I had when I was using X files: I had incorrect adjacency data before asking D3DX to compute normals for me. You should either dig into how lib3ds gets the adjacency data it uses to compute those normals for you, or compute them yourself.

EDIT:

I had a look at your FX file again and saw some possible mistakes:

1. Try changing the code that transforms your normal to world space in the vertex shader as follows:

Out.Norm = normalize(mul(float4(Normal, 0.0f), matWorld).xyz);

2. Renormalize the normal upon entry into the pixel shader:

Norm = normalize(Norm);

3. Compute color as follows:

return ambient + (diffuse * max(dot(-Light, Norm), 0.0f));
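Put together, the three fixes above might look like this in the FX file. This is only a sketch: the names matWorldViewProj, matWorld, Light, ambient, and diffuse are assumptions about what your effect declares, not taken from your code.

```hlsl
struct VS_OUT
{
    float4 Pos  : POSITION;
    float3 Norm : TEXCOORD0;
};

VS_OUT VS(float3 Pos : POSITION, float3 Normal : NORMAL)
{
    VS_OUT Out;
    Out.Pos = mul(float4(Pos, 1.0f), matWorldViewProj);
    // w = 0 so the translation part of matWorld is ignored (fix 1)
    Out.Norm = normalize(mul(float4(Normal, 0.0f), matWorld).xyz);
    return Out;
}

float4 PS(float3 Norm : TEXCOORD0) : COLOR
{
    // Renormalize: interpolation does not preserve unit length (fix 2)
    Norm = normalize(Norm);
    // Clamp the diffuse term to zero for back-facing light (fix 3)
    return ambient + (diffuse * max(dot(-Light, Norm), 0.0f));
}
```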
Quote:how do I call D3DXComputeNormals when I don't have an LPD3DXBASEMESH?
You can't. Bad assumption on my part.

You'd have to duplicate the function based on how you store the model information. If you want smooth normals (normals which will "smooth" the rendering between adjacent faces), that's a huge pain. You need (or would have to generate) adjacency information for the model.

Computing "flat" normals for each vertex:
for each triangle:   // 3 vertices in CW order, vertex0 through vertex2
    vector1 = vector( vertex1 - vertex0 );
    vector2 = vector( vertex2 - vertex0 );
    normal  = crossproduct( vector2, vector1 );
    normalize( normal );
    vertex0.normal = vertex1.normal = vertex2.normal = normal;

If that doesn't work, try normal = crossproduct(vector1, vector2). I'm getting old and can't remember everything.

Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.

You don't forget how to play when you grow old; you grow old when you forget how to play.

Because I am not good at C++, here's the pseudo code to make a mesh... this is a brute force method. There may be another way to do this... but this guarantees the model's data is accurate:

1) Instantiate a mesh = Mesh.Box(). We'll call it oMesh.
2) Call ComputeNormals() on the oMesh object
3) DX defaults to Position and Normal; we need to add texture coordinates
4) Instantiate a second mesh with the params NumberFaces, NumberVertices equal to oMesh's property values. Declare the mesh with Position, Normal and Textured as the vertex format. We'll call this mesh object oFinal.
5) Lock the Vertex Buffers of both mesh objects
6) Loop through the arrays of data returned by locking the vertex buffers, setting each vertex of oFinal equal to the data of oMesh, then filling in your texture data
7) Once done looping, Unlock both vertex buffers
8) Lock the index buffer of oMesh to get the index array
9) Call oFinal's IndexBuffer's SetData method to set the index buffer's data
10) Dispose and release oMesh, return oFinal

Then, in your render loop, render oFinal normally (oMesh is already disposed at this point). You don't have to SetFVF or anything. This should work. Clear as mud?
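For what it's worth, in native D3D9 most of those ten steps are handled by ID3DXMesh::CloneMeshFVF, which copies the vertex and index data into a new vertex format for you. A rough sketch, assuming you already have a valid pDevice and a loaded pMesh (error checking omitted; this is not runnable on its own):

```cpp
// Clone into a format with position, normal and one set of texcoords,
// then let D3DX fill in the normals.
ID3DXMesh* pFinal = NULL;
DWORD fvf = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1;
pMesh->CloneMeshFVF(pMesh->GetOptions(), fvf, pDevice, &pFinal);
D3DXComputeNormals(pFinal, NULL); // adjacency may be NULL
pMesh->Release();
// render pFinal from here on
```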
Enoch Dagor, Lead Developer, Dark Sky Entertainment - Beyond Protocol
Okay, yeah, changing to face normals instead of vertex normals worked. I didn't have to calculate them; there's a lib3ds function for that, too. I'm actually happy with this. I guess the problem was with the way lib3ds gathers its adjacency data and does the vertex normal calculations, probably due to the difference between left- and right-handedness.

A next step would be to go back to vertex normals, which look better in most cases, and actually implement that awful conversion properly.

Thanks to all those who replied and helped me resolve this.
Quote:Original post by Buckeye
[snip...]
Also, you may need to "unwind" the vertices (which, from your data above, appear to be CCW rather than CW), either by changing the order of the vertices for each face, or by changing the indices for each triangle to render them CW (which DirectX expects). If you unwind the vertices, do that before you compute normals.


Normal winding order for DX is CCW; for GL it's CW.

All interpolated values, i.e. values you pass from the vertex to the pixel shader, should be renormalized. The interpolation doesn't guarantee that a vector stays normalized.

You can safely multiply a float4x4 by a float3; only the upper-left 3x3 part will be used for the transformation.

Next, what you want to do is proper per-pixel lighting by actually calculating the direction to the light in the pixel shader. Be aware that you'll have to output your vertex position twice, as you can't read from POSITION in the pixel shader.
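The point about outputting the position twice might look like this in HLSL. A sketch only: the matrix names, LightPos, ambient and diffuse are assumed names, not from the original effect.

```hlsl
struct VS_OUT
{
    float4 Pos      : POSITION;   // consumed by the rasterizer, not readable in PS
    float3 WorldPos : TEXCOORD0;  // second copy of the position, interpolated
    float3 Norm     : TEXCOORD1;
};

VS_OUT VS(float3 Pos : POSITION, float3 Normal : NORMAL)
{
    VS_OUT Out;
    Out.Pos      = mul(float4(Pos, 1.0f), matWorldViewProj);
    Out.WorldPos = mul(float4(Pos, 1.0f), matWorld).xyz;
    Out.Norm     = mul(float4(Normal, 0.0f), matWorld).xyz;
    return Out;
}

float4 PS(float3 WorldPos : TEXCOORD0, float3 Norm : TEXCOORD1) : COLOR
{
    float3 toLight = normalize(LightPos - WorldPos); // per-pixel light direction
    Norm = normalize(Norm);                          // renormalize after interpolation
    return ambient + (diffuse * max(dot(toLight, Norm), 0.0f));
}
```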

Worked on titles: CMR:DiRT2, DiRT 3, DiRT: Showdown, GRID 2, theHunter, theHunter: Primal, Mad Max, Watch Dogs: Legion

Okay, so values that I send from the vertex to the pixel shader through a TEXCOORDn binding are interpolated?

I remember using CG and there I had to exploit the COLOR binding because that gets interpolated between the vertices, I think.

EDIT: Just realized, of course TEXCOORDn values interpolate; after all, that's how texture coordinates work. Sorry.
Quote:Normal winding order for DX is CCW, for gl its CW.

Nope. For the default configuration, DX culls CCW; GL culls CW.

From the docs for GetRenderState:
Quote:D3DRS_CULLMODE Specifies how back-facing triangles are culled, if at all. This can be set to one member of the D3DCULL enumerated type. The default value is D3DCULL_CCW.


This topic is closed to new replies.
