clip() intrinsic on NVidia graphics cards


I'm currently developing my own graphics engine and decided to try it out on other graphics cards. I'm developing it on an AMD graphics card, where it works perfectly fine, but on my friend's NVidia card it went completely crazy.
Since I'm using a deferred renderer, I decided to just clip materials where the transparency is below 0.5f. I decided not to implement actual transparent materials, since deferred renderers don't really support them (though I'll probably come up with a solution later in development).
But the clip() intrinsic clips pixels away completely at random on his graphics card. It even clips entirely different pixels each frame.

Here's what it looks like:
[img]http://unlimitedengine.us.to/nvidia_clipping_issues.png[/img]

And here's the code (reduced to what's important):
[source lang="cpp"]PSOut PSMain(PSIn Input)
{
PSOut Output;

float4 albedo = AlbedoTexture.Sample(Linear, input.Texcoord);
clip(albedo.a - 0.5f);

Output.Albedo = albedo;

return Output;
}[/source]

I'll probably solve it by using alpha blending with just the values 0 and 1. But I'm interested in why this might be happening and whether there's still a way to use the clip() intrinsic.
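In case it's useful, this is roughly the fallback I have in mind, assuming the usual D3D11 device/context pair (untested; with alpha forced to either 0 or 1, standard alpha blending behaves like a cut-out):

[source lang="cpp"]// Sketch of the alpha-blending fallback: standard SrcAlpha/InvSrcAlpha
// blending, relying on the material alpha only ever being 0 or 1.
D3D11_BLEND_DESC blendDesc = {};
blendDesc.RenderTarget[0].BlendEnable           = TRUE;
blendDesc.RenderTarget[0].SrcBlend              = D3D11_BLEND_SRC_ALPHA;
blendDesc.RenderTarget[0].DestBlend             = D3D11_BLEND_INV_SRC_ALPHA;
blendDesc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
blendDesc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
blendDesc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ZERO;
blendDesc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
blendDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* alphaBlendState = nullptr;
device->CreateBlendState(&blendDesc, &alphaBlendState);
context->OMSetBlendState(alphaBlendState, nullptr, 0xFFFFFFFF);[/source]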

I've seen similar artifacts in a case where the GPU was overheated. If this is not the case, I would suspect faulty GPU memory or a firmware bug.

Also, do verify (with PIX) that the alpha channel of the texture is actually filled correctly.
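If PIX isn't at hand, a quick-and-dirty alternative is to temporarily write the alpha channel out as greyscale, so a broken alpha channel is obvious on screen. Something like this, reusing the names from the shader above:

[source lang="cpp"]// Hypothetical debug version of the pixel shader: visualize alpha.
PSOut PSMain(PSIn Input)
{
    PSOut Output;

    float4 albedo = AlbedoTexture.Sample(Linear, Input.Texcoord);

    // Alpha of 1 shows up white, alpha of 0 shows up black.
    Output.Albedo = float4(albedo.aaa, 1.0f);

    return Output;
}[/source]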

[quote name='Nik02' timestamp='1340736439' post='4953095']
...and check depth/stencil state while at it.
[/quote]
QFE'd/upvoted. Could be a good old-fashioned driver bug, too. If it works fine on the reference rasterizer, you've broken the driver somehow. Congrats! ;)
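For anyone who hasn't used it: the reference rasterizer is just a different driver type at device creation (this sketch assumes D3D11 and requires the DirectX SDK to be installed). It's painfully slow, but if the artifacts vanish there, the hardware driver becomes the prime suspect:

[source lang="cpp"]ID3D11Device* device = nullptr;
ID3D11DeviceContext* context = nullptr;

// Create the device on the software reference rasterizer instead of
// the hardware driver.
HRESULT hr = D3D11CreateDevice(
    nullptr,                    // default adapter
    D3D_DRIVER_TYPE_REFERENCE,  // reference rasterizer
    nullptr,                    // no software module
    0,                          // no creation flags
    nullptr, 0,                 // default feature levels
    D3D11_SDK_VERSION,
    &device,
    nullptr,                    // chosen feature level not needed here
    &context);[/source]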

Yeah, the artifacts look like semi-random memory, which can simply be the contents of an uninitialized depth buffer (or corrupted memory in the worst case).

There may be some differences in how AMD and NVidia drivers validate depth buffers. For example, one might allow different dimensions for the depth buffer and render target, while the other does not. This particular trait is, however, well defined in the D3D spec and it [i]should[/i] work the same way on both.
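For what it's worth, this is roughly what a safe depth buffer setup looks like in D3D11. backBufferDesc here is just a stand-in for whatever D3D11_TEXTURE2D_DESC describes your render target:

[source lang="cpp"]// The depth buffer should match the render target exactly in size and
// sample count; backBufferDesc is a hypothetical description of the
// render target texture.
D3D11_TEXTURE2D_DESC depthDesc = {};
depthDesc.Width      = backBufferDesc.Width;
depthDesc.Height     = backBufferDesc.Height;
depthDesc.MipLevels  = 1;
depthDesc.ArraySize  = 1;
depthDesc.Format     = DXGI_FORMAT_D24_UNORM_S8_UINT;
depthDesc.SampleDesc = backBufferDesc.SampleDesc;
depthDesc.Usage      = D3D11_USAGE_DEFAULT;
depthDesc.BindFlags  = D3D11_BIND_DEPTH_STENCIL;

ID3D11Texture2D* depthTexture = nullptr;
device->CreateTexture2D(&depthDesc, nullptr, &depthTexture);

ID3D11DepthStencilView* depthStencilView = nullptr;
device->CreateDepthStencilView(depthTexture, nullptr, &depthStencilView);

// Clear the depth buffer every frame so it never holds uninitialized data.
context->ClearDepthStencilView(depthStencilView,
    D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL, 1.0f, 0);[/source]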

