How can I use a texture as an int array in a pixel shader?

Which shader model and API are you using?

Prior to D3D10 there was no support for integer texture formats or an integer instruction set, so you'll get a floating-point representation of your data, typically normalized to the [0..1] range.
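On D3D10 / Shader Model 4.0 and later you can declare the resource with an integer element type and fetch raw texels with `Load`, which does no filtering or normalization. A minimal sketch, assuming the texture was created with an integer format such as `DXGI_FORMAT_R32_SINT` and bound to slot `t0` (the register and format are assumptions here):

```hlsl
// Assumes SM 4.0+ and a texture created with DXGI_FORMAT_R32_SINT.
Texture2D<int> IntTex : register(t0);

float4 main(float4 pos : SV_Position) : SV_Target
{
    // Load takes integer texel coordinates (x, y) plus a mip level,
    // and returns the raw integer value -- no sampler involved.
    int value = IntTex.Load(int3(int2(pos.xy), 0));

    // Example use: branch on the fetched integer.
    return (value > 0) ? float4(1, 1, 1, 1) : float4(0, 0, 0, 1);
}
```

Note that you index with `Load` rather than `Sample`, since filtering is undefined for integer formats.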
