[SOLVED] Geometry disappears, ATI are rubbish

Started by reaper93
7 comments, last by reaper93 15 years, 3 months ago
I am having trouble getting an object to render when I use tex2Dlod in my vertex shader. A few things before I start: my graphics card supports Shader Model 3.0 and I am compiling for SM 3.0, I have my CullMode set to None, and FillMode set to Wireframe. I am rendering some geometry from a triangle strip, which works fine with the following code:

VS_OUTPUT WaterVS(VS_INPUT IN)
{
    VS_OUTPUT OUT = (VS_OUTPUT)0;

    // Transform to world space, then to clip space
    float3 WorldPosition = mul( float4(IN.Position, 1.0f), WorldMatrix ).xyz;
    OUT.Position = mul( float4(WorldPosition, 1.0f), ViewProjMatrix );

    return OUT;
}



However, I then introduced a vertex texture to read in data and offset the y position of each vertex on the GPU. I have done this before in another application and it worked fine, but I can't quite figure out what's going wrong here: when I add the following code, the geometry just disappears... all I am doing is offsetting y by a small amount between 0 and 1 depending on the heightmap value. I have simplified the shader down to just reading in the texture coords now.

VS_OUTPUT WaterVS(VS_INPUT IN)
{
    VS_OUTPUT OUT = (VS_OUTPUT)0;

    // Offset the y coord with the heightmap value sampled from the vertex texture
    IN.Position.y = tex2Dlod(TexS, float4(IN.TexCoord0, 0.0f, 0.0f)).r;

    float3 WorldPosition = mul( float4(IN.Position, 1.0f), WorldMatrix ).xyz;
    OUT.Position = mul( float4(WorldPosition, 1.0f), ViewProjMatrix );

    return OUT;
}

technique Water
{
    pass P0
    {
        CullMode     = None;
        FillMode     = Wireframe;
        VertexShader = compile vs_3_0 WaterVS();
        PixelShader  = compile ps_3_0 WaterPS();
    }
}
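For completeness, the heightmap also has to be bound to a vertex texture sampler stage. The effect framework handles this when you set the texture on the effect; if you were doing it by hand with raw shaders, it would look roughly like the sketch below. The device and heightMapTex names are mine, and note that SM 3.0-era hardware generally only supports point filtering for vertex textures.

#include <d3d9.h>

// Sketch only: bind the heightmap to vertex sampler 0 and force point
// filtering. 'device' and 'heightMapTex' are assumed to already exist.
device->SetTexture(D3DVERTEXTEXTURESAMPLER0, heightMapTex);
device->SetSamplerState(D3DVERTEXTEXTURESAMPLER0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
device->SetSamplerState(D3DVERTEXTEXTURESAMPLER0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
device->SetSamplerState(D3DVERTEXTEXTURESAMPLER0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);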



Now here's the strange part: when I run it through PIX, it shows the geometry rendering in the viewport, and when I debug the shader, sensible values are passed on. Like I said above, if I remove the tex2Dlod call and replace it with a static value, e.g. IN.Position.y = 1.0f, it renders fine. So PIX shows the geometry in the viewport, but in the app I see nothing. After doing some troubleshooting I can't think of any obvious places to look to solve the issue; any ideas/help appreciated.
Wow, I have fixed it. It turns out ATI do not fully conform to the SM 3.0 spec... I switched to D3DCREATE_SOFTWARE_VERTEXPROCESSING when creating my device and hey presto, it works, with terrible frame rates... sigh
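For reference, the change is just the behavior flag passed to CreateDevice; a minimal sketch, assuming the usual d3d9, d3dpp, and hWnd variables already exist:

#include <d3d9.h>

// Was D3DCREATE_HARDWARE_VERTEXPROCESSING; with software vertex processing
// the vertex shader (and its tex2Dlod) runs on the CPU, hence the frame rate.
IDirect3DDevice9* device = NULL;
HRESULT hr = d3d9->CreateDevice(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    hWnd,
    D3DCREATE_SOFTWARE_VERTEXPROCESSING,
    &d3dpp,
    &device);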

Card is a Radeon X1950XTX 512MB, for anybody interested... looks like I will be ordering myself an Nvidia instead ^_^
Yeah, the ATI X1000-series didn't support vertex texturing. Technically it was still conforming to the SM 3.0 spec, since the spec doesn't state which texture formats a device has to support for vertex shading, so ATI just didn't support any of them (you're supposed to query a format with CheckDeviceFormat, specifying D3DUSAGE_QUERY_VERTEXTEXTURE, to make sure you can use it for vertex texturing).
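In code, that check looks something like the sketch below (d3d9 is your IDirect3D9 interface; the display-mode and heightmap formats here are just examples):

#include <d3d9.h>

// Ask the runtime whether a format can be sampled from a vertex shader
// before relying on tex2Dlod. On the X1000-series this fails for everything.
HRESULT hr = d3d9->CheckDeviceFormat(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    D3DFMT_X8R8G8B8,               // current display-mode format
    D3DUSAGE_QUERY_VERTEXTEXTURE,  // usable as a vertex texture?
    D3DRTYPE_TEXTURE,
    D3DFMT_R32F);                  // candidate heightmap format

if (FAILED(hr))
{
    // This device/format combination can't be used for vertex texturing.
}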

The Nvidia 6- and 7-series did support vertex texturing for a few floating-point formats, but it was pretty horrifically slow.
Ok cheers MJP, do you have any idea how VT performs on the newer GPUs? Also, do all Nvidia cards that advertise SM 3.0 compliance support VT?
Quote: Original post by reaper93
Ok cheers MJP, do you have any idea how VT performs on the newer GPUs? Also, do all Nvidia cards that advertise SM 3.0 compliance support VT?


All of the DX10-capable GPUs from ATI and Nvidia perform much, much better with vertex texturing, since they no longer have dedicated vertex and pixel shading units (they're unified shaders, so the vertex shaders have access to the same texturing functionality as the pixel shaders).

AFAIK all of the Nvidia SM 3.0 GPUs supported vertex texturing. Some of the lower-end models might not have, but I'm not sure about that.

Anyway, ATI cards are rubbish really. I will never buy another one. Nvidia is much better.
I'm tired of blocks and artifacts.
Quote: Original post by MJP

All of the DX10-capable GPUs from ATI and Nvidia perform much, much better with vertex texturing, since they no longer have dedicated vertex and pixel shading units (they're unified shaders, so the vertex shaders have access to the same texturing functionality as the pixel shaders).

AFAIK all of the Nvidia SM 3.0 GPUs supported vertex texturing. Some of the lower-end models might not have, but I'm not sure about that.


Ok thanks MJP, I will probably go get a DX10 GPU; you can pick up a decent one for around 100 these days. I was a bit wary of upgrading considering the X1950XTX is already a fast card (at least for the games I play).
Quote: Original post by XVincentX
Anyway, ATI cards are rubbish really. I will never buy another one. Nvidia is much better.
I'm tired of blocks and artifacts.


Then you are doing it wrong; until the G80 series, ATI were well known for having the better visual quality of the two manufacturers. With the G80, and after, NV have pulled level.

The only 'blip' on ATI's record is the much-debated way they handled VTF in their pre-unified cards. There are those, like myself, who think they did the right thing by not including it because it would have been slow and expensive, and those who think they did the wrong thing (and somehow 'cheated', despite the spec being poorly worded) and will bang on about it forever.

This, however, is all in the past, and right now, imo, ATI/AMD have the best graphics card solutions out there when it comes to both technical engineering AND price:performance. I look forward to the next round of things with DX11 to see how this shakes out.
I just replaced my X1950XTX with an HD4870, mainly because the main games I play use the Source engine, which generally performs better on ATI than on Nvidia. Either way, it's SM 4.0, so vertex texturing should be fine... will have a dive into DX10 now too :)

