How can I use a texture as an int array in a pixel shader?

I want a 256*256 int array in a pixel shader, but when I use a texture, the data I read with tex2D() is wrong. I guess it's the sampling. How can I do this?
Which shader model and API are you using?

Prior to D3D10 there was no support for integer values or an integer instruction set, so you'll get the floating-point representation of your number, typically mapped to the [0..1] range.
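If you're stuck on D3D9, one common workaround is to store the integers in an 8-bit texture, sample it with point filtering, and rescale back to integer range in the shader. Here's a minimal sketch, assuming an 8-bit single-channel texture (e.g. D3DFMT_L8) where the texel index (0..255 per axis) arrives via TEXCOORD0; the sampler name and register are placeholders:

// Assumes the 256x256 integers live in an 8-bit single-channel texture.
// The sampler must use POINT filtering (no mips) so adjacent texels
// never get blended together.
sampler2D IntTex : register(s0);

float4 main(float2 texel : TEXCOORD0) : COLOR0
{
    // Address the texel centre: (index + 0.5) / 256 keeps the sampler
    // from landing exactly on a texel boundary.
    float2 uv = (texel + 0.5f) / 256.0f;

    // tex2D() hands back the value as a float in [0..1]; scale by 255
    // and round to undo the normalisation.
    float raw = tex2D(IntTex, uv).r;
    int value = (int)floor(raw * 255.0f + 0.5f);

    // ... use 'value' as your integer here ...
    return float4(value / 255.0f, 0.0f, 0.0f, 1.0f);
}

On the application side, set D3DSAMP_MINFILTER and D3DSAMP_MAGFILTER to D3DTEXF_POINT for that sampler stage; with bilinear filtering the hardware averages neighbouring texels and the decoded integers come out wrong, which is most likely what you're seeing. Note that an 8-bit channel only holds 0..255, so larger integers would need to be packed across multiple channels.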

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

I'm using D3D9.

