HDR faking in OpenGL
Hi,
Regarding an OpenGL implementation of what is generally considered "fake" HDR (in which 8-bit render targets get downsampled, Gaussian blurred, filtered back up, and then blended with the back buffer).
All the papers I can find say to render the scene once to the back buffer, copy it to a smaller render target (what OpenGL call, if any, will do this?), then apply the usual process. If there is no OpenGL equivalent to the D3D StretchRect(), I will use a full-size render target and composite that with the results of Gaussian-filtering a downsampled version.
Just draw it with a screen-aligned quad at the size you require; that's the simplest way to down/up-sample the data I can think of...
So render to a target, then downsample, rather than copying from the back buffer with some implicit downsampling?
That's not HDR, or even "fake" HDR. I assume you mean that horrible glow effect everyone seems to be using nowadays.