HLSL Non-Normalized tex coords in pixel shader


Hi guys

I am having a rather odd pixel shader problem. I have attempted to implement a simple horizontal hatch brush pattern using the y coordinate within a 2D quad, alternating colors every 4 pixels.

It works in my C# test app, but not in my C++ project.

Here is my shader code:


struct PS_INPUT
{
	float4 p : SV_POSITION;
	float2 t : TEXCOORD0;	// quad-space position in pixels (not normalized)
	float2 tc : TEXCOORD1;	// normalized coordinates for the clip mask
};

float4 PS_FILL(PS_INPUT input) : SV_Target
{
	if (fclip != 0)
	{
		// .r avoids the implicit float4-to-float truncation warning, and
		// "clipMask" avoids shadowing the clip() intrinsic
		float clipMask = ClipTex.SampleLevel(Sample, input.tc, 0).r;
		if (clipMask == 0.0f)
			discard;
	}
	if (brushtype.x == 1)
		return color1;	// solid brush
	else
	{
		if (brushtype.y == 100)	// horizontal hatch style
		{
			// alternate colors in 4-pixel horizontal bands
			if (fmod(input.t.y, 8) <= 4.0f)	// Fault is here
				return color1;
			else
				return color2;
		}
		else
			return color2;
	}
}

When the quad is passed to the pixel shader, I put actual normalized texture coordinates in the tc member of PS_INPUT, which are used for sampling a mask texture for clip regions.

The t member, however, gets the quad's position in pixels. So at the top-left vertex of the quad, t is (0, 0), while at the bottom-right vertex it is (width, height).
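For reference, my geometry shader expands each quad roughly like this (a simplified sketch rather than my actual code; viewport, the GS_INPUT layout, and the tc mapping are stand-ins for what the real project does):

// Hypothetical viewport size in pixels, from a constant buffer
float2 viewport;

struct GS_INPUT
{
	float2 pos : POSITION;	// quad top-left corner, in pixels
	float2 size : SIZE;	// quad width and height, in pixels
};

[maxvertexcount(4)]
void GS_FILL(point GS_INPUT input[1], inout TriangleStream<PS_INPUT> stream)
{
	// Triangle-strip order: top-left, top-right, bottom-left, bottom-right
	const float2 corners[4] =
	{
		float2(0, 0), float2(1, 0),
		float2(0, 1), float2(1, 1)
	};

	for (int i = 0; i < 4; i++)
	{
		PS_INPUT v;
		float2 px = input[0].pos + corners[i] * input[0].size;
		// Pixel coordinates to clip space (y flipped)
		v.p = float4(px / viewport * float2(2, -2) + float2(-1, 1), 0, 1);
		v.t = corners[i] * input[0].size;	// (0,0) top-left .. (w,h) bottom-right
		v.tc = px / viewport;			// normalized coords for the clip mask
		stream.Append(v);
	}
}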

Below is a screenshot of the C# app's output. As you can see, the alternating color pattern is clearly working in the brown half of the attitude indicator display.

[Screenshot of the C# app's output: uGqXaEk.png]

Here is my C++ program's output:

[Screenshot of the C++ program's output: n211bpx.png]

Now I must stress that the two programs are using functionally identical code, as well as completely identical shader code.

I also ran the VS2017 graphics debugger on my program to check the values going into PS_INPUT::t in my geometry shader, and I saw the expected pixel dimensions. But when I check the pixel shader, it tells me that input.t.y's debug info has been optimized away, even though I explicitly disabled optimization and enabled shader debugging during shader compilation. So I tried storing input.t.y in a local variable, which consistently showed a value of 0.5 no matter which pixel I selected for debugging.

So for some reason DX11 is either normalizing my TEXCOORD inputs, or it is not linearly interpolating in the area between the vertex positions of the quad. Any ideas?
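A sanity check that sidesteps the shader debugger entirely is to write the suspect value straight to the output color, something like:

// Temporary diagnostic pixel shader: visualize input.t.y directly
// instead of trusting the debugger's variable inspection.
float4 PS_DEBUG(PS_INPUT input) : SV_Target
{
	return float4(frac(input.t.y / 8.0f).xxx, 1.0f);
}

If t interpolates correctly, this renders a gradient that repeats every 8 pixels vertically; a constant 0.5 renders flat grey.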


Could you post the rest of your shader code? What's "brushtype" and "fclip"?

All those 'if' statements are not pretty in my eyes; I would simply make different shaders for all the combinations (using #if etc.), along the lines of the sketch below.

Maybe the optimiser gets confused and leaves out the fmod() part because it's inside two conditions?

Does it work if you remove all the 'if's?
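Something like this, roughly (a sketch only; USE_CLIP, SOLID_BRUSH, and HATCH_HORIZONTAL are hypothetical macros you would define per variant, e.g. via the pDefines argument of D3DCompile):

// One compiled variant per brush/clip combination, selected at compile
// time with macros instead of constant-buffer branches:
float4 PS_FILL(PS_INPUT input) : SV_Target
{
#if USE_CLIP
	if (ClipTex.SampleLevel(Sample, input.tc, 0).r == 0.0f)
		discard;
#endif
#if SOLID_BRUSH
	return color1;
#elif HATCH_HORIZONTAL
	// 4-pixel horizontal bands
	return (fmod(input.t.y, 8) < 4.0f) ? color1 : color2;
#else
	return color2;
#endif
}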

.:vinterberg:.

On 5/15/2019 at 2:48 AM, MJP said:

Could you post the rest of your shader code? What's "brushtype" and "fclip"?

brushtype.x is a constant buffer flag for switching between solid and hatch brushes. brushtype.y is the hatch brush type (of which there can be over 100). fclip is just a flag telling the shader that a clip region is active, so it will sample the clip texture and discard any pixels outside of the clip mask.
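For context, the relevant declarations look roughly like this (a sketch; the names come from the shader above, but the exact layout and registers in my project may differ):

// Approximate resource declarations matching the pixel shader above
cbuffer BrushParams : register(b0)
{
	float2 brushtype;	// x: 1 = solid, otherwise hatch; y: hatch style id
	float fclip;		// nonzero = clip region active
	float pad0;
	float4 color1;
	float4 color2;
};

Texture2D ClipTex : register(t0);	// clip mask, sampled with input.tc
SamplerState Sample : register(s0);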


On 5/15/2019 at 11:47 AM, vinterberg said:

All those 'if' statements are not pretty in my eyes; I would simply make different shaders for all the combinations (using #if etc.).

Maybe the optimiser gets confused and leaves out the fmod() part because it's inside two conditions?

Does it work if you remove all the 'if's?

I'm not so sure about that. The fmod absolutely does get hit when I step through the instructions in the VS2017 shader debugger. But for some reason the texcoord input is either being normalized (I only ever see values of 0.5 if I put it in a local variable), or it is not being interpolated.
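For anyone hitting the same symptom, two hypothetical causes worth ruling out (neither is confirmed as the culprit in this thread):

// 1) An interpolation modifier on the PS input struct: nointerpolation
//    makes t constant across the whole primitive.
struct PS_INPUT_SUSPECT
{
	float4 p : SV_POSITION;
	nointerpolation float2 t : TEXCOORD0;
	float2 tc : TEXCOORD1;
};

// 2) Swapped semantic indices in the geometry shader output struct
//    (t : TEXCOORD1, tc : TEXCOORD0), so the pixel shader's TEXCOORD0
//    actually receives the normalized clip coordinates, which would
//    hover around 0.5 near the middle of the quad.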

This topic is closed to new replies.
