How can I use a texture as an int array in a pixel shader?


I want a 256×256 int array in my pixel shader, so I stored the data in a texture. But when I read it back with tex2D(), the values are wrong. I guess it's a sampling problem. How can I fix this?
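A common cause of wrong values here is sampling between texel centers (or with filtering enabled), so the hardware blends neighboring texels instead of returning the one you stored. Assuming a 256×256 texture and point sampling, the UV coordinates that hit the center of texel (x, y) exactly can be computed as below (plain Python standing in for the shader arithmetic; the half-texel offset is the key idea):

```python
def texel_center_uv(x, y, width=256, height=256):
    # Address the *center* of texel (x, y), not its corner.
    # With point sampling, this guarantees the fetch returns the
    # exact value stored in that texel instead of a neighbor or
    # a filtered blend.
    u = (x + 0.5) / width
    v = (y + 0.5) / height
    return u, v

# Texel (0, 0) of a 256x256 texture is centered at 0.5/256 in each axis.
print(texel_center_uv(0, 0))  # -> (0.001953125, 0.001953125)
```

In the shader you would also set the sampler's min/mag filters to point (nearest) so no interpolation occurs.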

Which shader model and API are you using?

Prior to D3D10 there was no support for integer texture formats or an integer instruction set, so you'll get a floating-point representation of your number, typically normalized to the [0..1] range.

hth
Jack
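To make the [0..1] mapping above concrete: with an 8-bit UNORM texture channel, a stored byte b is seen by the shader as the float b / 255, and the original integer can be recovered by scaling back and rounding. A minimal sketch of that round trip (plain Python standing in for shader-side arithmetic):

```python
def sample_unorm8(b):
    # An 8-bit UNORM texel with stored byte b is presented to the
    # shader as the normalized float b / 255.
    return b / 255.0

def recover_int(f):
    # Recover the stored integer by scaling back to [0..255] and
    # rounding; the round() guards against floating-point error.
    return int(round(f * 255.0))

# Byte 200 is sampled as ~0.78431 and rounds back to exactly 200.
assert recover_int(sample_unorm8(200)) == 200
```

The same scale-and-round pattern works in a pre-D3D10 pixel shader (multiply the sampled value by 255 and round), provided filtering is disabled so the sampled float really corresponds to one stored texel.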
