
antialiasing, hdr and a geforce 8800

ava    122
Hello,

We have a GeForce 8800 here and a game with HDR. You probably see my question coming :-) On the GeForce 7800 we could only have AA without HDR, and much of the praise for the 8800 is that HDR together with AA has become possible. But what do I have to change so this actually works? I set my graphics card settings to use AA (4x, 8x or 16x) and this works fine, but when I enable HDR the AA is gone again. So, what do I have to change/set/alter to make this work?

Thanks,
Alex Vanden Abeele

Kambiz    758
A texture used as a render target does not support anti-aliasing; you have to render everything to the frame buffer and then copy it to a texture for post-processing.

ava    122
Indeed, but the problem is that the frame buffer cannot hold high-range values. I suppose that's the reason you have to post-process high dynamic range values in the first place; otherwise the frame buffer could just display the high-range values directly.
Am I wrong here?

Guest Anonymous Poster
In DX9 at least, you will need to create an MSAA HDR rendertarget and StretchRect to a rendertarget texture.

Kambiz    758
OK. What about rendering everything twice, both into the frame buffer and into a texture? Only users with a better graphics card will have FSAA and HDR enabled at the same time anyway.
If you have only a few bright objects, you could render just those to the texture instead of the entire scene.

ava    122
Quote:
Original post by Anonymous Poster
In DX9 at least, you will need to create an MSAA HDR rendertarget and StretchRect to a rendertarget texture.


If I understand this correctly, I render to a high-range texture and StretchRect it to the frame buffer. But how can I create an MSAA HDR texture?

gjoel    144
As the anonymous poster said, you need to create a multisampled render target. For OpenGL this is described in the EXT_framebuffer_multisample spec:
http://oss.sgi.com/projects/ogl-sample/registry/EXT/framebuffer_multisample.txt

- and yes, you do need to render to textures for HDR (16-bit) rendering.

ava    122
My render target has the format D3DFMT_A16B16G16R16F; I'm using DX9.

What is the format for an MSAA 64-bit render target?

Thanks for all the answers!

Alex

jollyjeffers    1570
I'm in a bit of a hurry now so can't write a full reply [wink] Have you downloaded and looked through Nvidia's SDK? It's usually the best resource for these sorts of questions - they use their SDK to give developers the resources to make the absolute most of their hardware, HDR+AA being an obvious candidate [smile]

hth
Jack

Guest Anonymous Poster
MSAA isn't specified by the format.

Look at the documentation for CreateRenderTarget; you can specify the multisample type and quality there. You should also call CheckDeviceMultiSampleType to check for hardware support.

Part of your confusion may be that you're currently rendering to a render-target *texture* (RTT), i.e. one created via CreateTexture. You can't specify MSAA for an RTT in DX9; instead you must create a render target surface and StretchRect the result to a separate surface (an RTT or the original backbuffer) after rendering.
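To make that concrete, a D3D9 sketch of the flow might look like this (a sketch only, not a drop-in implementation: `d3d`, `device`, `width` and `height` are assumed to come from your existing setup, and all error checking is omitted):

```cpp
// 1. Check that the HDR format supports the desired multisample type:
DWORD qualityLevels = 0;
d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                D3DFMT_A16B16G16R16F, TRUE /* windowed */,
                                D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);

// 2. Create a multisampled HDR render target *surface* (not a texture):
IDirect3DSurface9* msaaRT = NULL;
device->CreateRenderTarget(width, height, D3DFMT_A16B16G16R16F,
                           D3DMULTISAMPLE_4_SAMPLES, qualityLevels - 1,
                           FALSE /* not lockable */, &msaaRT, NULL);

// 3. Create a plain (non-multisampled) render-target texture to resolve into:
IDirect3DTexture9* resolveTex = NULL;
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT, &resolveTex, NULL);

// 4. After rendering the scene into msaaRT, resolve the samples:
IDirect3DSurface9* resolveSurf = NULL;
resolveTex->GetSurfaceLevel(0, &resolveSurf);
device->StretchRect(msaaRT, NULL, resolveSurf, NULL, D3DTEXF_NONE);
```

The StretchRect in step 4 performs the multisample resolve; resolveTex can then be bound as a texture for the tone-mapping pass.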

LeGreg    754
It's not that much more complicated.

Here's your pseudo D3D code (all targets listed here have the same dimensions!):

// Init

CreateDevice(backbuffer and frontbuffer both called targetA, multisample NONE, absolutely no automatic depth stencil!);

CreateDepthStencilSurface(targetZ, D24X8/D24S8, multisample 2x/4x/8x, quality level 0-7); // <- no D16 please

CreateRenderTarget(targetB, A16B16G16R16F, multisample 2x/4x/8x, quality level 0-7 (they must match the above!)); // <- this is not a texture!

CreateTexture(textureC, RENDERTARGET usage, POOL_DEFAULT, A16B16G16R16F); // <- never multisampled; important, see below

// first frame

BeginScene();

SetRenderTarget(targetB);
SetDepthStencilSurface(targetZ);
Clear(color + depth);
RenderScene(); // <- HDR scene as you do usually

StretchRect(from targetB to textureC); // <- no sub-rectangle, matching dimensions please

SetRenderTarget(targetA);
SetDepthStencilSurface(NULL); // <- important

Clear(color); // optional since you're overwriting it just below

DrawFullScreenQuad(with textureC); // tone-mapping pass with a dedicated shader. This gets more complicated if you have bloom or other effects, obviously.

RenderInterface(); // <- don't tonemap or multisample your interface.

EndScene();

Present();

// [...] now we draw the same frame again and again and again.

// After we're done we release everything.

That's it. Not that complicated, is it?

LeGreg
PS: there are other topics, like which tonemapping formula to use, why there is more aliasing in the HDR version, and the like. But let's keep it simple for now..

LeGreg    754
And for your pleasure, the HL2 version:

CreateDevice(with AA, with automatic depth stencil);
// frame
RenderScene(); // <- do automatic scaling at the end of the shader (color = color * tonemappingconstant).
DrawSomeBloom();
Present();

This doesn't work if you have to accumulate passes at high quality (limited multipass lighting), and you lose the HDR texture for the bloom (cheap bloom instead), etc. But it works for their game at least.

LeGreg

delta user    164
@gjoel

So is it possible in OpenGL to use FSAA & HDR together on a GeForce 7xxx (or 6xxx) series card?

Or is that extension only supported by the 8xxx?

Enrico    316
Quote:
Original post by delta user
@gjoel

So is it possible in OpenGL to use FSAA & HDR together on a GeForce 7xxx (or 6xxx) series card?

Or is that extension only supported by the 8xxx?

My GeForce 7 supports GL_EXT_framebuffer_multisample, but I can't use an fp16 format together with it. Maybe some high-precision integer format can be used, but I haven't tried that. It must be possible; there is a screensaver by Nvidia featuring HDR and antialiasing...
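For reference, the EXT_framebuffer_multisample path looks roughly like this (a sketch under the assumption that EXT_framebuffer_object and EXT_framebuffer_blit are also available; `width`/`height` and the resolve FBO setup are assumed, error and completeness checks omitted):

```cpp
GLuint msaaFBO, colorRB, depthRB;

// Multisampled storage lives in renderbuffers (textures cannot be multisampled here):
glGenRenderbuffersEXT(1, &colorRB);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, colorRB);
// Note: GL_RGBA16F_ARB with multisampling is what fails on GeForce 6/7-class
// hardware; a high-precision integer internal format may work there instead.
glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, 4, GL_RGBA16F_ARB,
                                    width, height);

glGenRenderbuffersEXT(1, &depthRB);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthRB);
glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, 4, GL_DEPTH_COMPONENT24,
                                    width, height);

glGenFramebuffersEXT(1, &msaaFBO);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, msaaFBO);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                             GL_RENDERBUFFER_EXT, colorRB);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                             GL_RENDERBUFFER_EXT, depthRB);

// ... render the scene into msaaFBO, then resolve into a texture-backed FBO
// (resolveFBO, assumed created elsewhere) via the blit extension:
glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, msaaFBO);
glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, resolveFBO);
glBlitFramebufferEXT(0, 0, width, height, 0, 0, width, height,
                     GL_COLOR_BUFFER_BIT, GL_NEAREST);
```

The glBlitFramebufferEXT call is the OpenGL equivalent of D3D9's StretchRect resolve.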

LeGreg    754
Quote:
Original post by wolf
LeGreg: which tone mapping operator do you recommend?


I doubt there is a definitive answer to that. It depends on your scene, on the cost of tonemapping you're willing to pay, etc.

The one frequently used in published games is a simple color = color * scalingconstant. That avoids the frame being too dark or too bright, but it only allows limited dynamic range within a single frame (unless the constant is recomputed for each pixel of the image), and it allows luminance saturation. It has a linear response (minus sRGB) on the relevant part of the tonemapping curve, contrary to other types (that means color tones are preserved, and filtering works as advertised as long as no pixel is overbright).

The Microsoft SDK sample advertises another one:
color = color * scalingconstant / (1 + color * scalingconstant)
This comes from a research paper and is derived from a photographic-style tonemapping: "better" dynamic range, no luminance saturation (unless you explicitly allow it by scaling on top of the formula above), and a non-linear response (it's a rational curve), so colors will look desaturated and filtering won't work as well as advertised.

Gah... it's probably too big a topic to be discussed in this thread (simple question -> simple answer :) ).

LeGreg

