Help with HLSL

7 comments, last by ongamex92 10 years, 2 months ago

This has been driving me mad for a while now...

Assuming the vertex data is created and fed correctly, shouldn't this HLSL code show my object in a uniform white color?


struct VS_INPUT
{
	float3 posL	: POSITION;
	float3 normalL	: NORMAL;
};

struct VS_OUTPUT
{
	float4 posH	: SV_POSITION;
	float3 posW	: POSITION;
	float3 normalW	: NORMAL;
};

//--------------------------------------------------------------------------------------
// Vertex Shader
//--------------------------------------------------------------------------------------

VS_OUTPUT VS(VS_INPUT vsIn)
{
	VS_OUTPUT psIn;

	float4x4 mWVP = mWorld * mView * mProjection;

	psIn.posH = mul(float4(vsIn.posL, 1.0f), mWVP);

	psIn.posW = mul(float4(vsIn.posL, 1.0f), mWorld);
	psIn.normalW = mul(float4(vsIn.normalL, 1.0f), mWorld);

	return psIn;
}


//--------------------------------------------------------------------------------------
// Pixel Shader
//--------------------------------------------------------------------------------------
float4 PS(VS_OUTPUT psIn) : SV_Target
{
	return float4(1.0f, 1.0f, 1.0f, 1.0f);
}

Instead I'm just seeing the clear color background.

Yes it should. So your vertex data or your matrices must be incorrect, putting your object outside the view frustum. Or,

- your vertices have the wrong winding order (turn culling off to see if this is the problem)

- you're trying to draw behind another object (or the depth buffer was cleared to the wrong value) and the depth comparison is failing (turn the depth buffer off to see if this is the problem).

Also, the w component of your normal should be 0, not 1.

(NOTE: my answer is incorrect... you need to use the mul intrinsic for matrix multiplication, as unbird points out below)

Not sure about the order of multiplication for your matrices, but that's one to double-check. Also, while we're on the subject, I would recommend calculating the world-view-projection matrix on the CPU side instead of redoing it for every vertex in the vertex shader...
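To make that last suggestion concrete, here's a small numpy sketch of my own (not the poster's code, and the matrix values are random placeholders). With the row-vector convention that mul(vector, matrix) implies, the combined matrix is built as W @ V @ P once on the CPU, and multiplying by it gives the same result as applying each matrix in turn per vertex:

```python
import numpy as np

# Placeholder world/view/projection matrices, just for illustration.
rng = np.random.default_rng(0)
W, V, P = (rng.standard_normal((4, 4)) for _ in range(3))

v = np.array([1.0, 2.0, 3.0, 1.0])   # a position, w = 1

WVP = W @ V @ P                      # computed once on the CPU
step_by_step = ((v @ W) @ V) @ P     # what the shader would redo per vertex

# Matrix multiplication is associative, so the two agree.
assert np.allclose(v @ WVP, step_by_step)
```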

Okay, here is the thing. I had a working example of a simple cube model with color shading. This problem started happening while I was trying to implement lights, and I have no clue what I did to make the cube stop showing up at all. I know I changed the vertex structure and the input layout, fixed the vertex data to match (basically replacing color with normals), and added some shader variables.

Okay, I figured it out while I was typing this response. I checked everything ten times; I just knew it would be something ridiculous, like it always turns out to be: the "eye" coordinates were inside the cube.

Interestingly enough, there is something odd about having that matrix multiplication in the shader code. It doesn't work (aside from being very inefficient). I tried changing the order combination, nothing. It works fine doing the calculation on the CPU and passing WVP, though.

Whatever it is, thanks for your help!

I could be wrong, and it could be as you intend it to be, but it looks like your normal calculation will be wrong; you might want to change it to:


float4(vsIn.normalL, 0.0f)

note: the 1.0f changed to a 0.0f

This stops the matrix multiplication from translating the vsIn.normalL value as it would a position value, and only rotates it, as it should for a direction value.
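For what it's worth, here is a numpy illustration of that point (my own sketch, with made-up translation values, using the row-vector convention of mul(float4, matrix), where the translation lives in the last row):

```python
import numpy as np

# A world matrix that only translates, for illustration.
T = np.identity(4)
T[3, :3] = [10.0, 20.0, 30.0]      # translation row

n = np.array([0.0, 0.0, 1.0])      # a normal pointing along +z

as_point     = np.append(n, 1.0) @ T   # w = 1: picks up the translation
as_direction = np.append(n, 0.0) @ T   # w = 0: translation is ignored

assert np.allclose(as_point[:3], [10.0, 20.0, 31.0])      # wrong for a normal
assert np.allclose(as_direction[:3], [0.0, 0.0, 1.0])     # still a unit +z direction
```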

I tried changing that; the results look the same. It probably depends on what you do with it. To be completely honest, I don't know if there is a difference between vec3 * mat and vec4 * mat as far as the first three components are concerned. (I suck at maths.)

If what you say is true, then... the more you know.

Interestingly enough there is something odd about having that matrix multiplication in the shader code. It doesn't work (aside from being very inefficient)


It doesn't work because * is a component-wise multiplication (aka modulate), not a matrix multiplication. For the latter you need to use mul(), just like with the vector-matrix multiply later.
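The same distinction can be seen in numpy terms (an analogy, not HLSL itself): `*` is a component-wise multiply, while `@` is a true matrix multiply, and the two generally give different results:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

elementwise = A * B   # like HLSL's * on two matrices (modulate)
matmul = A @ B        # like HLSL's mul()

assert np.allclose(elementwise, [[5.0, 12.0], [21.0, 32.0]])
assert np.allclose(matmul, [[19.0, 22.0], [43.0, 50.0]])
assert not np.allclose(elementwise, matmul)
```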

As for the side topic here: when using a 4x4 matrix, always use a four-component vector. Even if the compiler/language allows mixing sizes of 3 and 4, sticking to one size will help avoid confusion. You might want to do some reading on the construction of world matrices (scaling, rotation, and translation of the mesh) so that you understand why we use a 1.0/0.0 in the w component. This free course might be quite beneficial for you: https://www.udacity.com/course/cs291

Good Luck

Interestingly enough there is something odd about having that matrix multiplication in the shader code. It doesn't work (aside from being very inefficient)


Take a look at this: http://msdn.microsoft.com/en-us/library/windows/desktop/dd607354(v=vs.85).aspx

This topic is closed to new replies.
