Strange Performance Increase

Hi all,

I have a Phong lighting shader with one vertex/pixel shader pair.

In the vertex shader I was normalizing the normal vector. When I moved the normalization to the pixel shader, I got a strange FPS increase (57 to 69 on a scene containing approx. 3.1 million polygons).

Is that normal? Why or why not?

How many pixels got shaded? How does that compare to the 3.1 million polygons you described? (A full 1920x1080 screen has 2,073,600 pixels.)

It sounds like you were Vertex Shader bound and moving the operation to the Pixel Shader reduced the load. That's more or less reasonable.
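In case "moving the operation to the pixel shader" is unclear, here is a minimal sketch of the idea, with illustrative names rather than your actual shaders: the vertex shader passes the world-space normal through unnormalized, and the pixel shader normalizes it once per shaded fragment.

float4x4 mWVP;   // assumed world * view * projection matrix
float4x4 mW;     // assumed world matrix

void VS (in  float4 pos  : POSITION0,
         in  float3 nor  : NORMAL,
         out float4 oPos : POSITION,
         out float3 oNor : TEXCOORD0)
{
    oPos = mul (pos, mWVP);
    oNor = mul (nor, (float3x3)mW);   // no normalize() here
}

float4 PS (float3 iNor : TEXCOORD0) : COLOR
{
    float3 N = normalize (iNor);      // normalization happens once per pixel instead
    return float4 (N * 0.5 + 0.5, 1); // e.g. visualize the normal
}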

Here's my vertex shader, btw:

// Assumed globals (not shown in the original post): the usual transform matrices.
float4x4 mWVP; // world * view * projection
float4x4 mW;   // world
float4x4 mV;   // view

void VS_ ( in float4 pos   : POSITION0,
           in float2 texC  : TEXCOORD0,
           in float3 tan   : TANGENT,
           in float3 bin   : BINORMAL,
           in float3 nor   : NORMAL,
           out float4 oPos   : POSITION,
           out float3 oTan   : TANGENT,
           out float3 oBin   : BINORMAL,
           out float3 oNor   : NORMAL,
           out float3x3 TBN  : TEXCOORD1,
           out float2 oTex   : TEXCOORD0,
           out float4 oPosV  : NORMAL1,
           out float4 oPosW  : NORMAL3,
           out float4 oNorV  : NORMAL2)
{
    // Clip-space position.
    oPos = mul (pos, mWVP);

    // Rotate the tangent frame into world space; the (float3x3) cast keeps
    // the direction vectors away from the translation part of the world matrix.
    oNor = mul (nor, (float3x3)mW); //-> NORMALIZATION WAS HERE!
    oBin = mul (bin, (float3x3)mW);
    oTan = mul (tan, (float3x3)mW);
    TBN [0] = oTan;
    TBN [1] = oBin;
    TBN [2] = oNor;

    // View-space and world-space positions, plus the view-space normal.
    oPosV = mul (pos, mul (mW, mV));
    oPosW = mul (pos, mW);
    oNorV = float4 (mul (oNor, (float3x3)mV), 0);
    oTex  = texC;
}


I don't want to post the pixel shader here because it's too long and I don't want to take up your time. It computes 1 directional, 3 point, and 1 spot light (including attenuation) and adds them up.
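For reference, one per-pixel point-light term with distance attenuation typically looks something like the sketch below; the parameter names and attenuation constants here are only illustrative, not the actual shader.

float3 PointLightTerm (float3 P, float3 N, float3 V,
                       float3 lightPos, float3 lightColor,
                       float3 diffColor, float3 specColor, float specPower,
                       float3 attCoeff)   // (constant, linear, quadratic)
{
    float3 L = lightPos - P;
    float  d = length (L);
    L /= d;

    float  att  = 1.0 / (attCoeff.x + attCoeff.y * d + attCoeff.z * d * d);
    float  diff = saturate (dot (N, L));
    float3 R    = reflect (-L, N);
    float  spec = pow (saturate (dot (R, V)), specPower);

    return att * lightColor * (diffColor * diff + specColor * spec);
}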

Quote:
How many pixels got shaded? How does that compare to the 3.1 million polygons you described? (A full 1920x1080 screen has 2,073,600 pixels.)

It's running at 1024x768, 32 bpp. The scene contains a lot of meshes (3.1 million triangles in total).

You were probably also normalizing the normals of polygons that were being culled, so they didn't even appear in the final image...

If you were normalizing approx. 3.1 million normals in the vertex shader, then moving the normalization to the pixel shader reduced the number of normalizations: as clb said, a full 1920x1080 screen has 2,073,600 pixels, and you are probably using a lower resolution (for example, 1440x900 has 1,296,000 pixels, so that's 1,804,000 fewer normalizations...)
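To put rough numbers on it at the 1024x768 mentioned above (ignoring overdraw, and taking the number of vertex shader runs to be on the order of the 3.1 million triangles):

per-vertex normalizations ~ 3,100,000                       (on the order of the triangle count)
per-pixel normalizations  ~ 1024 x 768 = 786,432            (one per shaded pixel)
difference                ~ 3,100,000 - 786,432 = 2,313,568 fewer normalize() calls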

Also, if you don't need the normalized normal in the vertex shader, don't normalize it there. You need to normalize it in the pixel shader anyway, since the interpolated vector is most likely not unit length, even if the input normals are unit length.
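A quick worked example of why interpolation breaks unit length: take two unit normals at the ends of an edge and look at the interpolated value halfway between them.

n0 = (1, 0, 0)                                    // unit length
n1 = (0, 1, 0)                                    // unit length
lerp(n0, n1, 0.5) = (0.5, 0.5, 0)
length = sqrt(0.25 + 0.25) = sqrt(0.5) ≈ 0.707    // no longer unit length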
