I have a problem I haven't quite managed to come up with a solution for. I'd like to store a primitive ID value in my G-buffer for implementing SRAA. I want to have as many unique IDs available as possible to avoid "collisions" where two touching triangles end up with the same primitive ID due to overflow, so I really want to utilize all 65536 values in the 16-bit floating-point texture channel. Given an integer value in the range 0–65535, is there any way to map each value to a unique 16-bit floating-point value? The stored IDs are only used for equality comparisons, and those comparisons can be done in the packed float format, so there's no need to convert the value back to an integer when reading the ID.
Although I found the intBitsToFloat() function in GLSL, I'm not sure it will behave as I expect when the result is stored in a 16-bit floating-point render target. Additionally, this function is only available from GLSL 3.30, which is unavailable on Mac.
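To show the kind of problem I'm worried about, here is a quick CPU-side sanity check (Python with NumPy, just a sketch of the bit-level behaviour I'd expect from an intBitsToFloat-style reinterpretation into half precision, not the actual GPU path):

```python
import numpy as np

# Reinterpret every possible 16-bit ID directly as a half-float bit
# pattern (the half-precision analogue of intBitsToFloat).
ids = np.arange(65536, dtype=np.uint16)
halfs = ids.view(np.float16)

# Every bit pattern is distinct, but float *equality* is not reliable:
# any ID whose exponent bits are all ones and whose mantissa is non-zero
# decodes to NaN, and NaN never compares equal to itself.
nan_ids = int(np.count_nonzero(np.isnan(halfs)))
print(nan_ids)  # 2046 IDs would never match themselves

# +0.0 and -0.0 are a genuine collision: distinct bits, equal values.
print(bool(halfs[0x0000] == halfs[0x8000]))  # True
```

So even if the raw reinterpretation survived the render target, 2046 NaN bit patterns and the signed-zero pair would already be unusable for equality tests.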
I don't mind an approximate solution as long as as much precision as possible is retained for the ID. It's especially important that consecutive IDs do not map to the same 16-bit floating-point value (for example, 1 and 2 both mapping to some value X), since triangles with similar IDs will most likely be close to each other on screen, and I need to be able to differentiate between them.
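To make the precision concern concrete: simply converting the integer ID to a half-float *value* (rather than reinterpreting its bits) breaks down past 2048, because half precision has an 11-bit significand and can no longer represent every integer. A quick NumPy sketch of exactly this collision (CPU-side, purely illustrative):

```python
import numpy as np

# Half precision represents all integers exactly only up to 2048.
# Beyond that, consecutive IDs round to the same representable value,
# which is exactly the kind of collision I need to avoid.
assert np.float16(1) != np.float16(2)        # small IDs stay distinct
assert np.float16(2048) == np.float16(2049)  # 2049 rounds down to 2048
print(float(np.float16(2049)))               # 2048.0
```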