Hi! I saw something puzzling today: someone using 16-bit unsigned ints as texture coordinates, with 0x3C00 as `1.0f` and, of course, 0x0000 as `0.0f`.
These are then fed into the GPU as if they were 16-bit floats!
However, I'm completely puzzled as to how and why this would even work. Any ideas?
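For what it's worth, the pattern I observed can be checked directly: 0x3C00 happens to be the IEEE 754 binary16 (half-precision) bit pattern for 1.0, and 0x0000 is 0.0. A quick sketch in Python (using the `struct` module's half-precision `'e'` format) reproduces what the GPU would presumably see when it reinterprets those 16-bit integers as half floats:

```python
import struct

def u16_as_half(bits: int) -> float:
    """Reinterpret a 16-bit unsigned integer as an IEEE 754 half float."""
    # Pack the integer as a little-endian uint16, then unpack the same
    # two bytes as a little-endian half-precision float ('e' format).
    return struct.unpack('<e', struct.pack('<H', bits))[0]

print(u16_as_half(0x3C00))  # -> 1.0
print(u16_as_half(0x0000))  # -> 0.0
```

So the values match the ones I saw, which suggests the "integers" were really half-float bit patterns all along, but I'd still like to understand the reasoning behind storing them that way.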