Everything I have done on the GPU doesn't need those extra bits, and I think GPUs could do more things, and would have support for more, if they were based on unsigned integers.
For example, a texture that is an array could have a load that works the same way an indexed read() does in C++.
Or blending could support the bitwise operations: AND, OR, XOR.
The best consequence is that it would save a lot of GPU memory.
GPUs are based on floating-point operations since that is what most 3D graphics operations require. It is possible to squeeze an almost insane number of shader units onto a small chip simply because each shader unit is very basic. Support for integer operations would have to be added on top of the floating-point support (it cannot replace it), making each unit larger and more expensive and thus reducing the number of units you can fit in a single GPU. (The GeForce GTX 690 has 2x1536 shader units; that number would not be achievable if each unit were significantly bigger than it is.)
Edited by SimonForsman, 05 March 2013 - 03:41 AM.