HLSL float vs C++ float

7 comments, last by JB3DG 9 years, 11 months ago

Hi guys

I'm curious: do HLSL floats have a different numerical range from C++ floats? As far as I recall, C++ floats can cover roughly 1.2E-38 to 3.4E38.

I am getting an odd result from a geometry shader here. Here are the vertex XYZ values after being multiplied by the world matrix on the C++ side:

62909.550781f, 43.891846f, 129223.367188f

and here is the code in the geometry shader:


	float4 vert = mul(sprite[0].pos, g_world);
	
	float4 center = float4(vert.xz*0.000043192f, 0, 1);

	//top right
	if(abs(center.x) < 1.0f && abs(center.y) < 1.0f)
	{
		v.p = center+float4(0.007702941f, 0.002502837f, 0, 0);	
		triStream.Append(v);
		//bottom right
		v.p = center+float4(0.004760679f, -0.006552513f, 0, 0);
		triStream.Append(v);
		//top
		v.p = center+float4(0, 0.008099352f, 0, 0);	
		triStream.Append(v);
		//bottom left
		v.p = center+float4(-0.004760679f, -0.006552513f, 0, 0);
		triStream.Append(v);
		//top left
		v.p = center+float4(-0.007702941f, 0.002502837f, 0, 0);
		triStream.Append(v);
		triStream.RestartStrip();
	}

If we take the Z value of 129223.367188 (which gets shifted into Y for the display) and multiply it by 0.000043192f, I should get roughly 5.5814, which would put the vertex well outside the viewport limit of +-1.0f. However, despite the if statement in the code above, which prevents a triangle from being emitted if the center is outside +-1.0f, the vertex still gets drawn at around 0.5f-0.6f on the display.

Any ideas as to why? Transposing the world matrix has no effect. This just makes no sense.


Transposing the world matrix has no effect.

First - if you're using row-major matrices on the CPU side, you need to transpose them for the GPU side, where HLSL packs matrices column-major by default (unless you're using an identity world matrix). There's no sense in debugging before you've set that up correctly.

Second - follow the data. Find a point in your code where the values are correct. Find the next point in your code where the values go wrong. The error must be occurring at that point in the code. Narrow it down to a single line of code where the inputs are correct but the result is wrong.

Assuming you're rendering a point list, guesses/suggestions:

If you can't debug the GS but can look at VS values, try multiplying the vertex by g_world in the VS and pass that to the GS, perhaps also swizzling and pre-multiplying in the VS.

Is your GS input sprite[1]?

Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.

You don't forget how to play when you grow old; you grow old when you forget how to play.

Yes, my GS input is sprite[1].

I tried writing my own mul function after comparing the world matrix on the C++ side with the HLSL one to verify which values go where. But it still gives the same results. I also tried moving the positioning to the vertex shader and leaving the geometry shader to only output the triangles. No luck.

FWIW here is my own MUL function.


float4 MUL(float3 v, float4x4 m)
{
	float x = (m[0][0]*v[0]) + (m[0][1]*v[1]) + (m[0][2]*v[2]);
	float y = (m[1][0]*v[0]) + (m[1][1]*v[1]) + (m[1][2]*v[2]);
	float z = (m[2][0]*v[0]) + (m[2][1]*v[1]) + (m[2][2]*v[2]);
	return float4(x, y, z, 1);
}

I can't use shader debuggers, as the program is running both DX11 and DX9 and they don't seem to work in such an environment. However, I did test each element of the world matrix and the vectors to make sure I wasn't making any incorrect assumptions before I created the above function. All tests returned the expected values. So the only conclusion I can come to is that something is happening to the values after I set the vertex buffer and issue the draw call. I have no way to determine what.


my own MUL function

Why? Don't reinvent the wheel for stuff like that unless you're intimately familiar with storage order. Even then, your routine isn't likely to save any cycles.

Where do you transform to screen space? I.e., mul(pos,view); mul(pos,projection)?

You've got too many possibilities for error if you can't debug shaders.

1. transpose matrices when you send them to the shaders.

2. use the intrinsic functions, e.g., mul()

E.g., in your VS, use output.Pos = mul(input.Pos, g_world);


something is happening to the values after I set the vertex buffer and give the draw call. I have no way to determine what.

Lack of shader debugging means you have to be precise but creative. The first step is to be precise at every step. Inventing your own subroutines, etc., is NOT the way to do it. You DO have a way to determine the problem(s); you just have to be precise and imaginative. Take one step at a time, and, in the future, don't write a hundred lines of code expecting it all to work.

WRT creativity, you can start (for instance) with a shader that just converts vertices to screen space, render that, and make sure that the VS and PS work. Then add a geometry shader, but have it do nothing but passthrough. Make small changes, and test each step.


It's a 2D display. I am ignoring the Y axis in the source verts and putting the Z axis into the Y to get a top-down view.

This line:

float4 center = float4(vert.xz*0.000043192f, 0, 1);

scales the values down.

The vertex values are in local space only; the world matrix repositions them. There is no view or projection matrix because, as I said earlier, it is a 2D display, so I am only scaling the values down to fit inside the +-1.0f viewport space, with 0 at the center of the display.

As I stated already, I have done tests where I transpose the world matrix before sending it to the shader, but this has no effect. Whether I use my own MUL function or the intrinsic mul, the result is always the same. I only created my own MUL to verify the results, based on tests I ran to determine where the various values sit in the matrix.

Bottom line: no matter what I do, the picture I get on the display doesn't correspond with pen-and-paper calculations based on the vertex*world output I posted.

(62909.550781f, 129223.367188f)*0.000043192f should give (2.7171893f, 5.5814156f), yet on screen the vertex shows up below and to the right of the display center when it should be completely offscreen. If you zero out the integer parts so the values sit between +-1.0f, the vertex should appear above and to the right of center, as far as I recall. For some strange reason, it is just totally defying logic.


As I stated already I have done tests...

... none of which helped you fix the problem. I'm suggesting you not guess what the problem may be, but take steps to determine what the problem is.

Do the vertices render correctly without the GS? I.e., if you do all the calcs in the VS and just render the center point?


Well, I will test things without the GS. However, I would be curious to know if there is any setting on the C++ side that could have this effect. If I think about it, the output on the screen does correlate with (2.7171893f-2.0f, (5.5814156f-5.0f)-1.0f), which gives (0.7171893f, -0.4185844f). It's almost as if a modulo is being performed on the vertex so that it always appears on screen instead of being culled. It's not quite a modulo (otherwise the Y value would appear above and to the right of center instead of below and right), but I am not sure how else to describe it. It's almost as if the viewport gets shifted up and to the right until the vertex appears.


will test things without the GS

IMHO, that's the right place to start. As you don't have shader debugging, consider: you have 100 lines of code that you can't debug directly. So don't. Debug 2 or 3 lines of code at a time.

I've been in your situation - no shader debugging available - just the gozintas and gozoutas. So I sat with the shader open in Visual Studio, edited a line or two and hit F5 (run). Repeat as needed. I've hit 4 or 5 runs a minute doing that. Sometimes it's best to start with hard coding some values in the VS. Does (0,0,0) show up center-screen? Etc.

Again, for your own sanity, quit guessing about the numbers. You're prejudicing yourself to miss the solution when an indication of the real problem appears. The chances that you've discovered a unique problem no-one else has ever seen before are slim-to-none. Until you can prove otherwise, assume the machine is performing as expected, that it's just doing what your code tells it to do, and find out where you've coded incorrectly.


Well, I finally got fed up with the whole thing, decided to grab hold of a little D3D10_1 app I had from my old days of learning DX, and edited it to use my shader. Then I grabbed the vertices and world matrices from the program I have been struggling with and saved them to a file. I load that file in the app and am making use of the VS2012 HLSL debugger. The only problem now is... I am using the ID3D10Effect system for my shaders, and I can't seem to find any details on setting up the HLSL debugger to recognize my HLSL source.

Anyone got a crash course on how to set the thing up? MSDN is of absolutely no help here.

This topic is closed to new replies.
