

clip() intrinsic on NVidia graphics cards



#1 CryZe   Members   -  Reputation: 768


Posted 26 June 2012 - 02:49 AM

I'm currently developing my own graphics engine and decided to try it out on other graphics cards. I develop it on an AMD graphics card, where it works perfectly fine, but on my friend's NVidia card it goes completely crazy.
Since I'm using a deferred renderer, I decided to simply clip materials wherever the transparency is below 0.5f. I decided not to implement actual transparent materials, since deferred renderers don't really support them (though I'll probably come up with a solution later in development).
But the clip() intrinsic clips pixels away seemingly at random on his graphics card. It even clips entirely different pixels each frame.

Here's what it looks like:
[screenshot: pixels clipped away at random across the scene]

And here's the code (reduced to what's important):
[source lang="cpp"]
PSOut PSMain(PSIn Input)
{
    PSOut Output;

    // Sample the albedo and discard this pixel if its alpha is below the 0.5 cutoff.
    float4 albedo = AlbedoTexture.Sample(Linear, Input.Texcoord);
    clip(albedo.a - 0.5f);

    Output.Albedo = albedo;
    return Output;
}
[/source]

I'll probably solve it by using alpha blending with just the values 0 and 1. But I'm still interested in why this might be happening and whether there's a way to keep using the clip() intrinsic.
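For reference, this is roughly the blend setup I have in mind for that workaround. It's only a sketch, assuming D3D11; "device" and "context" stand in for my engine's own objects:

[source lang="cpp"]
// Rough sketch: an alpha blend state where alpha is expected to be either
// 0 or 1, so a G-buffer pixel is either fully kept or fully overwritten.
D3D11_BLEND_DESC desc = {};
desc.RenderTarget[0].BlendEnable           = TRUE;
desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_SRC_ALPHA;
desc.RenderTarget[0].DestBlend             = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ZERO;
desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* blendState = nullptr;
if (SUCCEEDED(device->CreateBlendState(&desc, &blendState)))
{
    const float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
    context->OMSetBlendState(blendState, blendFactor, 0xFFFFFFFF);
}
[/source]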

Edited by CryZe, 26 June 2012 - 02:57 AM.



#2 Nik02   Crossbones+   -  Reputation: 2914


Posted 26 June 2012 - 11:54 AM

I've seen similar artifacts in a case where the GPU was overheated. If this is not the case, I would suspect faulty GPU memory or a firmware bug.

Also, do verify (with PIX) that the alpha channel of the texture is actually filled correctly.
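If you don't have PIX at hand, you can also copy the texture to a staging resource and read it back on the CPU. Very rough sketch, assuming D3D11 and an 8-bit RGBA format; "device", "context" and "albedoTexture" are your own objects:

[source lang="cpp"]
// Rough sketch: read the albedo texture back on the CPU to inspect its alpha channel.
D3D11_TEXTURE2D_DESC desc;
albedoTexture->GetDesc(&desc);
desc.Usage          = D3D11_USAGE_STAGING;
desc.BindFlags      = 0;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.MiscFlags      = 0;

ID3D11Texture2D* staging = nullptr;
if (SUCCEEDED(device->CreateTexture2D(&desc, nullptr, &staging)))
{
    context->CopyResource(staging, albedoTexture);

    D3D11_MAPPED_SUBRESOURCE mapped;
    if (SUCCEEDED(context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped)))
    {
        // Alpha is the 4th byte of each texel in an R8G8B8A8 texture.
        const unsigned char* texels = static_cast<const unsigned char*>(mapped.pData);
        unsigned char firstAlpha = texels[3];
        context->Unmap(staging, 0);
    }
    staging->Release();
}
[/source]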

Niko Suni


#3 Nik02   Crossbones+   -  Reputation: 2914


Posted 26 June 2012 - 12:47 PM

...and check depth/stencil state while at it.
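For example, bind an explicit, known-good state instead of relying on whatever happens to be set. A sketch only, assuming D3D11:

[source lang="cpp"]
// Rough sketch: create and bind an explicit depth/stencil state so the
// pipeline never runs with leftover state from elsewhere.
D3D11_DEPTH_STENCIL_DESC dsDesc = {};
dsDesc.DepthEnable                  = TRUE;
dsDesc.DepthWriteMask               = D3D11_DEPTH_WRITE_MASK_ALL;
dsDesc.DepthFunc                    = D3D11_COMPARISON_LESS_EQUAL;
dsDesc.StencilEnable                = FALSE;
dsDesc.FrontFace.StencilFunc        = D3D11_COMPARISON_ALWAYS;
dsDesc.FrontFace.StencilPassOp      = D3D11_STENCIL_OP_KEEP;
dsDesc.FrontFace.StencilFailOp      = D3D11_STENCIL_OP_KEEP;
dsDesc.FrontFace.StencilDepthFailOp = D3D11_STENCIL_OP_KEEP;
dsDesc.BackFace                     = dsDesc.FrontFace;

ID3D11DepthStencilState* dsState = nullptr;
if (SUCCEEDED(device->CreateDepthStencilState(&dsDesc, &dsState)))
{
    context->OMSetDepthStencilState(dsState, 0);
}
[/source]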

Niko Suni


#4 InvalidPointer   Members   -  Reputation: 1443


Posted 26 June 2012 - 05:08 PM

Nik02, on 26 June 2012 - 12:47 PM, said:
...and check depth/stencil state while at it.

QFE'd/upvoted. Could be a good old-fashioned driver bug, too. If it works fine on the reference rasterizer, you've broken the driver somehow. Congrats! ;)
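Swapping the driver type at device creation is enough to test that. A sketch, assuming D3D11; REF is painfully slow but spec-exact:

[source lang="cpp"]
#include <d3d11.h>

// Rough sketch: create the device on the reference rasterizer (plus the debug
// layer) to see whether the artifacts come from the app or the driver.
ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D_FEATURE_LEVEL    level;

HRESULT hr = D3D11CreateDevice(
    nullptr,                        // default adapter
    D3D_DRIVER_TYPE_REFERENCE,      // REF instead of D3D_DRIVER_TYPE_HARDWARE
    nullptr,                        // no software rasterizer module
    D3D11_CREATE_DEVICE_DEBUG,      // debug layer also reports invalid state
    nullptr, 0,                     // default feature levels
    D3D11_SDK_VERSION,
    &device, &level, &context);
[/source]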
clb: At the end of 2012, the positions of Jupiter, Saturn, Mercury, and Deimos are aligned so as to cause a denormalized flush-to-zero bug when computing Earth's gravitational force, slinging it into the sun.

#5 Nik02   Crossbones+   -  Reputation: 2914


Posted 27 June 2012 - 01:03 AM

Yea, the artifacts look like semi-random memory, which can simply be the contents of an uninitialized depth buffer (or corrupted memory in the worst case).

There may be some differences between AMD and NVidia drivers in how strictly they validate the depth buffer. For example, one might allow the depth buffer and render target to have different dimensions while the other does not. That particular behavior is well defined in the D3D spec, however, and should work the same way on both.
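In any case, make sure the depth/stencil view is cleared at the start of every frame. Roughly like this (a sketch, assuming D3D11; "depthStencilView" is your own view):

[source lang="cpp"]
// Rough sketch: clear depth (and stencil) every frame so the depth test
// never reads uninitialized memory.
context->ClearDepthStencilView(
    depthStencilView,
    D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL,
    1.0f,   // clear depth to the far plane for a LESS/LESS_EQUAL test
    0);     // stencil clear value
[/source]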

Niko Suni




