How to store a 2×32-bit int2 in textures?

Started by
3 comments, last by ET3D 16 years, 12 months ago
Although D3DFMT_G32R32F can store two 32-bit values, they are in float format. How do I write a shader that handles a structure like struct int2 { int x; int y; }? Thanks!
You can't directly. You could theoretically use D3DFMT_A16B16G16R16 to store four shorts and perform calculations that treat them as two ints. Note that D3D9 shaders use floating point and can't handle 32-bit integers anyway, which raises the question of why you need such support. If you truly need it, you might prefer going the DX10 way.
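The 16-bit packing described above can be sketched on the CPU side like this. It's a minimal illustration (the helper names `splitInt32`/`joinInt32` are hypothetical, not D3D API calls), showing how one 32-bit integer maps onto two 16-bit channels of an A16B16G16R16 texel:

```cpp
#include <cstdint>

// Split a 32-bit integer into two 16-bit halves, as they would be
// stored in two channels of a D3DFMT_A16B16G16R16 texel.
static void splitInt32(uint32_t value, uint16_t& lo, uint16_t& hi) {
    lo = static_cast<uint16_t>(value & 0xFFFFu); // low 16 bits
    hi = static_cast<uint16_t>(value >> 16);     // high 16 bits
}

// Recombine the two halves back into the original 32-bit integer.
static uint32_t joinInt32(uint16_t lo, uint16_t hi) {
    return static_cast<uint32_t>(lo) | (static_cast<uint32_t>(hi) << 16);
}
```

An int2 would then occupy all four channels of one texel: x in (R,G), y in (B,A). The shader-side arithmetic to emulate 32-bit operations on these halves is the hard part, since D3D9 shader math is floating point.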
Graphics cards are floating-point processors. Why do you need to use ints, and not floats?
Thanks. I use int32 because I'm doing a GPGPU thing, and the values elsewhere are usually 4-byte integers.
My question is: does DX10 support int32?
I installed the DirectX SDK (April 2007) on Windows XP SP2, and on the SDK's "D3DFORMAT" page I can't find anything like an int32 format. What do you mean by "going the DX10 way"? Thanks!

Quote:Original post by ET3D
You can't directly. You could theoretically use D3DFMT_A16B16G16R16 to store four shorts and perform calculations that treat them as two ints. Note that D3D9 shaders use floating point and can't handle 32-bit integers anyway, which raises the question of why you need such support. If you truly need it, you might prefer going the DX10 way.


I meant using Direct3D 10 instead of D3D9. It includes support for integer math in shaders, among other things that help with GPGPU work. Note that it's still geared mainly towards floating point, so if you're looking mainly to work with integer math, the GPU might not be the best fit. Note also that Direct3D 10 requires Vista.
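For context on that integer support: D3D10-level HLSL provides the asint()/asfloat() intrinsics, which reinterpret the raw bits between int and float without conversion. The CPU-side equivalent can be sketched as below (a hedged illustration using memcpy-based type punning; the function names are my own, not part of any D3D API):

```cpp
#include <cstdint>
#include <cstring>

// Reinterpret the bit pattern of a 32-bit int as a float,
// analogous to HLSL's asfloat() intrinsic in Direct3D 10.
static float asFloatBits(int32_t bits) {
    float f;
    std::memcpy(&f, &bits, sizeof f); // well-defined type punning
    return f;
}

// Reinterpret the bit pattern of a float as a 32-bit int,
// analogous to HLSL's asint() intrinsic.
static int32_t asIntBits(float f) {
    int32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    return bits;
}
```

This kind of round trip is lossless, which is what makes storing integer payloads in float resources workable once the shader can reinterpret bits natively; under D3D9 there is no such intrinsic, which is why the 16-bit packing workaround comes up.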

This topic is closed to new replies.
