128-bit pixels? You're saying they have 32 bits per channel plus 32 bits of alpha?
32-bit color depth means 8 bits per channel (RGB) plus 8 bits of alpha: 32 bits per pixel.
128 bits per pixel, according to my maths, means about 30 MB per frame on a 1080p display. That means hardware like that could push 4K resolution (at 32 bits per pixel) without breaking a sweat, and I don't think it works like that just now.
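If anyone wants to check the arithmetic, here's a quick back-of-the-envelope sketch (plain math, no graphics API involved; the function name is just mine):

```python
# Uncompressed framebuffer size: width * height * bits-per-pixel, in bytes.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

fhd_128bpp = framebuffer_bytes(1920, 1080, 128)  # 1080p at 128 bpp
uhd_32bpp = framebuffer_bytes(3840, 2160, 32)    # 4K at 32 bpp

print(fhd_128bpp / 2**20)           # 31.640625 -> ~30 MiB per frame
print(fhd_128bpp == uhd_32bpp)      # True: exact same memory per frame
```

Funny enough, 1080p at 128 bpp and 4K at 32 bpp come out to exactly the same number of bytes per frame, which is the point: a card that can do one has the raw framebuffer bandwidth for the other.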
And yes, 32 bpp is what consoles and PCs have been using for years now, both doing HDR on the GPU.
EDIT: Ohhh, Google-fu tells me this: each color channel is stored in a float (i.e., 32 bits) for HDR rendering. Though I have also heard of the FP16 format being used for HDR. So that means 8 bits per channel + 8 bits of alpha for output, and possibly 32 bits per channel during computation.
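To put the three formats side by side (format names here are the usual GPU-style shorthand, not constants from any particular API):

```python
# Bytes per pixel for common 4-channel (RGBA) formats.
formats = {
    "RGBA8 (8-bit uint/channel, typical output)": 4 * 8 // 8,    # 4 bytes  = 32 bpp
    "RGBA16F (FP16/channel, common HDR target)":  4 * 16 // 8,   # 8 bytes  = 64 bpp
    "RGBA32F (FP32/channel, the 128 bpp case)":   4 * 32 // 8,   # 16 bytes = 128 bpp
}
for name, size in formats.items():
    print(f"{name}: {size} bytes/pixel")
```

So the 128-bit figure only makes sense as an internal rendering format (FP32 per channel), not as what actually gets scanned out to the display.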