Depth buffer

Started by
20 comments, last by Mikvix 22 years ago
quote:
128 bits? LOL
I knew that some SGI boards supported 64 bits, which is monstrous.
128 bits is just... monstrously monstrous

I know, but they have a programmable framebuffer, meaning you can assign as many bits as you like to each channel, as long as you don't run out of framebuffer memory. So you could even get a 256-bit depth buffer, but I don't know if there would be much room left for your r,g,b image. On the other hand, with 1GB of framebuffer memory, there is some room to play around. I *love* those machines. Unfortunately I only have them at work; I would like to have such a thing at home...

quote:
and AFAIK the GeForce family supports a 32-bit depth buffer (if you don't use the stencil buffer).

Hmm, I think it will still return a 24-bit depth buffer + 8 unused bits; AFAIK it's hardwired on the board. I'm sure the GF2 does that, and I'm pretty sure the GF3 does as well. But I'm not sure about the GF4 - it could be that you can get a real 32-bit buffer there.
quote:
Hmm, I think it will still return a 24-bit depth buffer + 8 unused bits

Wow, I've just tried it and you're right! (I tested it on a GF3.)
I can't believe it. So many people told me not to use the stencil buffer in order to get better depth precision on GeForces. Darn 'em.
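To put numbers on what those extra bits actually buy: with a standard perspective projection, a fixed-point depth buffer stores roughly d(z) = far*(z - near) / (z*(far - near)), so the smallest eye-space z difference it can resolve grows quadratically with distance. Here is a back-of-the-envelope sketch (the function name and the near/far/distance values are my own illustration, not from the thread):

```python
def depth_step(z, near, far, bits):
    """Smallest eye-space z difference a fixed-point depth buffer can
    resolve at eye depth z, assuming a standard perspective projection
    mapping [near, far] onto the buffer's [0, 1] range."""
    # Window depth is d(z) = far*(z - near) / (z*(far - near)).
    # Inverting one quantization step of size 1/(2^bits - 1) gives
    # an eye-space step of z^2*(far - near) / (far*near) per buffer step.
    return z * z * (far - near) / (far * near * (2 ** bits - 1))

# Example: near = 0.1, far = 1000, object 100 units from the camera
for bits in (16, 24, 32):
    print(f"{bits}-bit: ~{depth_step(100, 0.1, 1000, bits):.2e} units")
```

With these (hypothetical) planes, 16 bits resolves only ~1.5 units at that distance, 24 bits gets you to millimeter scale, and a true 32-bit buffer would be finer still - which is why losing the top 8 bits to a hardwired 24+8 layout matters less than it sounds, but is still worth knowing about.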

This topic is closed to new replies.
