How can I use a texture as an int array in a pixel shader?
I want a 256*256 int array in my pixel shader, but when I use a texture, the data I read with tex2D() is wrong. I guess that's the sampling. So what should I do?
Which shader model and API are you using?
Prior to D3D10 there was no support for integer values or an integer instruction set, so you'll get the floating-point representation of your number, typically mapped to the [0..1] range.
hth
Jack
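To expand on the answer above: if you're on the pre-D3D10 path with an 8-bit unorm texture (e.g. D3DFMT_L8), tex2D() returns byte / 255.0, and bilinear filtering will blend neighbouring texels unless you use point sampling and address exact texel centers. The sketch below (in Python, just to show the arithmetic; the format choice and 0..255 range are assumptions) demonstrates the round-trip and the texel-center UV computation:

```python
# Sketch, assuming an 8-bit unorm texture (ints 0..255) read with
# point sampling. tex2D() hands the shader byte / 255.0; rounding
# back recovers the original int exactly.

SIZE = 256  # 256x256 texture, as in the question

def texel_center_uv(x, y, size=SIZE):
    """UV that hits texel (x, y) dead-center, so point sampling
    fetches exactly that texel with no filtering blend."""
    return ((x + 0.5) / size, (y + 0.5) / size)

def encode(value):
    """CPU side: what tex2D() will return for a stored byte 0..255."""
    return value / 255.0

def decode(sampled):
    """Shader side: round(sampled * 255) recovers the original int."""
    return int(round(sampled * 255.0))

# Round-trip check: every value 0..255 survives exactly.
assert all(decode(encode(v)) == v for v in range(256))
```

For values larger than 255 you would need to pack them across multiple channels (e.g. two bytes of an A8R8G8B8 texture) and recombine them in the shader the same way.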