Malder1

How can I disable globally enabled antialiasing for my 3D application?


We're creating a 2D-over-Direct3D application, and when the user has globally enabled antialiasing in the video card's control panel (ATI or NVIDIA) we get problems: blurred pictures (with NVIDIA's 2xQ mode, for example), and inconsistent edges on quads depending on whether they are drawn directly to the screen (edges blurred) or through a render target for post filters (edges not blurred). A globally disabled VSync likewise makes animation jerkier, and there are many other problems. In short, I can't control many important parameters, and the user may never guess that the cause of his problem is a global video card setting he changed some time ago.

Of course we request no antialiasing when initializing Direct3D, but ATI and NVIDIA, even in the upcoming Windows Vista RC1, let the user set global 3D settings that apply to all 3D applications. Microsoft asked NVIDIA and ATI to make a special exception for their Aero Glass composition engine and for Media Center, but we also need full control over 3D quality and parameters, just as we have full control over quality in GDI graphics.

I have written to ATI and NVIDIA many times. Only ATI replied, unofficially, that they understand the problem but don't promise to change anything, at least in the near future.

I know only one workaround, and only for NVIDIA cards: NVIDIA allows creating an individual profile for a specific EXE file, and I can create one by adding a record to the HKLM section of the registry. However, on Windows Vista I may not be able to write to HKLM at all because of User Account Protection.
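For reference, this is roughly what our device setup looks like (a minimal Direct3D 9 sketch; the window handle and most presentation values stand in for our real ones):

// Minimal D3D9 device creation that explicitly requests no multisampling and VSync on.
D3DPRESENT_PARAMETERS pp = {0};
pp.Windowed             = TRUE;
pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // use the current desktop format
pp.MultiSampleType      = D3DMULTISAMPLE_NONE;     // no antialiasing requested
pp.MultiSampleQuality   = 0;
pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // VSync requested
pp.hDeviceWindow        = hWnd;                    // hWnd: our application window

IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
IDirect3DDevice9* device = NULL;
d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                  D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
// A "force antialiasing" or "force VSync off" profile in the driver control
// panel can still override all of this behind our back.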

If the user is overriding settings, I would guess that this is what he/she wants, and they *should* be (but aren't necessarily) aware of the problems it causes. A workaround could be to check whether antialiasing is actually enabled after creating the device and then warn the user about it.
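Something along these lines might work for the check (a sketch only, assuming a D3D9 device; as noted later in the thread, a driver that forces AA doesn't necessarily report it through these structures, so the result may not be reliable):

// After CreateDevice succeeds, inspect the back buffer that was actually created.
IDirect3DSurface9* backBuffer = NULL;
if (SUCCEEDED(device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer)))
{
    D3DSURFACE_DESC desc;
    backBuffer->GetDesc(&desc);
    if (desc.MultiSampleType != D3DMULTISAMPLE_NONE)
    {
        // AA is active even though the application didn't request it -> warn the user.
        MessageBox(NULL,
                   TEXT("Antialiasing is forced on in your video card's control panel. ")
                   TEXT("Please set it to 'application controlled' for correct output."),
                   TEXT("Warning"), MB_OK | MB_ICONWARNING);
    }
    backBuffer->Release();
}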

Unfortunately, neither of those works.

I can't detect that the drivers have overridden my settings.

And the user can't be expected to guess that our 2D application actually uses Direct3D. Yes, he understands it when he starts a typical 3D game, but not with a 2D multimedia application that looks as if it were written with GDI or DirectDraw. And describing this problem in a readme file is, in my opinion, the worst possible solution.

Quote:
Original post by Malder1
I can't detect that the drivers have overridden my settings.


I think at least nVidia stores this in the registry? I can't remember the path, but if you look into the "coolbits" hack, you'll find the path there.
And on Linux, nVidia has an environment variable that overrides the FSAA setting.
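If it really is in the registry, reading it back would look something like this (just a sketch; the subkey and value name below are placeholders, since I don't remember the real path and you would have to dig it out of the coolbits information):

#include <windows.h>

// Sketch: read a DWORD driver setting from HKLM.
// NOTE: the subkey and value name are placeholders, not the real locations.
DWORD ReadDriverAASetting()
{
    HKEY key;
    DWORD value = 0, size = sizeof(value);
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                      "SOFTWARE\\NVIDIA Corporation\\<placeholder subkey>",
                      0, KEY_READ, &key) == ERROR_SUCCESS)
    {
        RegQueryValueExA(key, "<placeholder value name>", NULL, NULL,
                         (LPBYTE)&value, &size);
        RegCloseKey(key);
    }
    return value; // 0 if the key or value wasn't found
}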

ch.

Enabling MRTs should disable FSAA. IIRC, when you disable MRTs again, FSAA isn't turned back on. This is hardly the correct way to do it, but it might help in some situations. Test it carefully to make sure it even works :-/

As for VSync, you could either detect high refresh rates and notify the user, or rewrite your code to handle running at a variable framerate. That would be a good idea anyway, since framerates can differ depending on the monitor (60, 80, or 100 Hz, for example).
Lastly, a very high framerate can sometimes become a problem in itself, in which case you'd need to slow your application's rendering down by drawing only after a certain amount of time has passed since the last frame; a sketch follows below.
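A minimal frame limiter could look something like this (a sketch only; the running flag and RenderFrame are hypothetical names, and the 16 ms target is just an example):

#include <windows.h>

// Render only when at least frameIntervalMs has passed since the last frame.
// 16 ms caps rendering at roughly 60 fps.
void MainLoop(volatile bool& running, void (*RenderFrame)())
{
    const DWORD frameIntervalMs = 16;
    DWORD lastFrameTime = GetTickCount();

    while (running)
    {
        // ... pump window messages, advance animation by the real elapsed time ...
        DWORD now = GetTickCount();
        if (now - lastFrameTime >= frameIntervalMs)
        {
            lastFrameTime = now;
            RenderFrame();   // draw and Present the next frame
        }
        else
        {
            Sleep(1);        // yield the CPU instead of busy-waiting
        }
    }
}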

Hope this helps.

I'm a bit surprised right now. I tested creating a device with my nvidia driver set to override AA, and after creating the device the PresentationParameters struct didn't report the correct settings, which could cause errors on its own. A really hackish solution: you could detect whether the device is using antialiasing by rendering something simple, reading back the rendered pixels, and comparing them to the expected output, warning the user if AA is detected. DX-compatible hardware is supposed to behave in a well-defined way, so it should be safe to assume how the rendering will turn out.

Edit:
For instance, rendering a flat-shaded black triangle that splits a 2x2-pixel white texture in two diagonally should result in only black and white pixels; any gray along the diagonal edge indicates antialiasing.
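A rough sketch of the readback part (assuming a D3D9 device and a 32-bit back buffer format; error handling is mostly omitted):

// After drawing the black/white test pattern, copy the render target into a
// system-memory surface and look for any pixel that is neither pure black nor
// pure white; such intermediate values indicate antialiased edges.
IDirect3DSurface9 *rt = NULL, *sysmem = NULL;
device->GetRenderTarget(0, &rt);

D3DSURFACE_DESC desc;
rt->GetDesc(&desc);
device->CreateOffscreenPlainSurface(desc.Width, desc.Height, desc.Format,
                                    D3DPOOL_SYSTEMMEM, &sysmem, NULL);
device->GetRenderTargetData(rt, sysmem);

bool aaDetected = false;
D3DLOCKED_RECT lr;
if (SUCCEEDED(sysmem->LockRect(&lr, NULL, D3DLOCK_READONLY)))
{
    for (UINT y = 0; y < desc.Height && !aaDetected; ++y)
    {
        const DWORD* row = (const DWORD*)((const BYTE*)lr.pBits + y * lr.Pitch);
        for (UINT x = 0; x < desc.Width; ++x)
        {
            DWORD rgb = row[x] & 0x00FFFFFF;   // ignore the alpha/X byte
            if (rgb != 0x00000000 && rgb != 0x00FFFFFF)
            {
                aaDetected = true;             // found a blended (gray) pixel
                break;
            }
        }
    }
    sysmem->UnlockRect();
}
sysmem->Release();
rt->Release();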

[Edited by - e-u-l-o-g-y on September 8, 2006 7:17:46 AM]

