texkill and GeForce
Hi everybody!
I'm doing alpha test on R16G16F via texkill:
texld r0, t0, s0 ; load colormap
sub r1, r0.a, c0.r ; c0.r contains alphaRef value mapped into [0..1] range
texkill r1 ; kill pixel if its alpha is below alphaRef
...
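For reference, the same test written in HLSL (a sketch; the sampler/constant bindings are assumed to match the assembly above, and clip() compiles down to texkill under ps_2_0):

```hlsl
// HLSL equivalent (sketch) of the assembly above, target ps_2_0
sampler colorMap : register(s0);
float   alphaRef : register(c0);  // assumed bound to c0.r as above

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    float4 c = tex2D(colorMap, uv);
    clip(c.a - alphaRef);  // kill the pixel if its alpha is below alphaRef
    return c;
}
```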
On all ATI cards this works just fine, but on the GeForce FX I get a strange, ugly picture: some triangles are alpha-tested but their texcoords look distorted, while others are completely opaque.
Any help will be appreciated.
Thank you!
According to the DX docs, when sampling a D3DFMT_G16R16F texture with texld, the B and A components are filled with a default value of 1.0.
I'm not sure that loading an alpha value from a texture that has no alpha channel is a good idea.
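In fact, if the sampled texture really had no alpha channel, the kill could never trigger at all, since the default alpha of 1.0 makes the subtraction non-negative (a sketch of the arithmetic, assuming alphaRef stays in [0..1]):

```
texld   r0, t0, s0     ; r0.a = 1.0 (default for a format with no alpha)
sub     r1, r0.a, c0.r ; r1 = 1.0 - alphaRef, which is >= 0 for alphaRef in [0..1]
texkill r1             ; never kills: no component is ever negative
```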
At a minimum, you have to check the spec to see what the correct behavior is supposed to be.
Even so, there are often driver bugs lurking in unusual uses of the API.
Hopefully that is just a typo.
Sorry guys, I meant that R16G16F is the render target; the source texture has an alpha channel, of course =)
I've found that texkill behaviour on the FX is very strange: with pre-transformed (RHW) vertices everything is OK, but with untransformed vertices I get distorted texcoords.
Same thing on the REF =(
I've already tried about 10 Detonator driver versions =(
BTW, DX8.1 ps.1.4 works fine
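For anyone who hits the same problem, the ps.1.4 fallback looks roughly like this (a sketch; I'm assuming texkill in the second phase can read an arithmetic result computed before the phase marker, which is what ps.1.4 allows):

```
ps.1.4
texld   r0, t0         ; sample the colormap
sub     r1, r0.a, c0.r ; alpha - alphaRef
phase
texkill r1             ; kill the pixel if any component of r1 is negative
```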