
How can an application forcibly disable antialiasing?

Malder1
We're developing a 2D-style application using Direct3D. Everything works and the textures are very sharp. But when the user forces certain kinds of full-scene antialiasing (supersampling) - Quincunx, 4xS, 8x, 16x (on NVIDIA video cards) - it blurs the entire texture! Is there a way to disable antialiasing from within my application? I've tried d3dpp.MultiSampleType := D3DMULTISAMPLE_NONE, but it doesn't help.
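Here is roughly what our device creation does (a simplified C++/D3D9 sketch; g_pD3D, hWnd and the windowed-mode settings are placeholders for our real code, error handling omitted):

// Sketch: we explicitly request no multisampling on the back buffer,
// but a "force AA" setting in the driver control panel still overrides this.
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceNoAA(IDirect3D9* g_pD3D, HWND hWnd)
{
    D3DPRESENT_PARAMETERS d3dpp = {0};
    d3dpp.Windowed           = TRUE;
    d3dpp.SwapEffect         = D3DSWAPEFFECT_DISCARD;
    d3dpp.BackBufferFormat   = D3DFMT_UNKNOWN;
    d3dpp.MultiSampleType    = D3DMULTISAMPLE_NONE; // ignored when AA is forced
    d3dpp.MultiSampleQuality = 0;

    IDirect3DDevice9* pDevice = NULL;
    g_pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                         D3DCREATE_HARDWARE_VERTEXPROCESSING,
                         &d3dpp, &pDevice);
    return pDevice;
}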

jollyjeffers
Sadly there isn't an easy way to do this. If the user (via the NVIDIA/ATI control panel) chooses to "force AA" then that's what's going to happen - the driver has the final say in the matter [headshake]

It is, theoretically, possible to "hack" the drivers and force/check various settings, but that's rarely documented and a pretty grey area (it varies between driver revisions, and besides, should you really override what the user wants?).

Your best bet is probably just to make a note that "This game is not compatible with ___ forms of anti-aliasing; we recommend you disable them whilst playing".

hth
Jack

Malder1
Thanks for your response. I'm really surprised, because Microsoft now recommends using Direct3D for 2D applications. But with what "quality"?

jollyjeffers
Quote:
Original post by Malder1
Microsoft now recommends using Direct3D for 2D applications. But with what "quality"?

Not quite sure I get you here... you're wondering why they recommend using D3D when you get quality issues like the ones you're describing?

It's a debatable topic, but the drivers that allow a user to force particular settings should make it quite clear that they are doing just that - forcing an application to behave in a particular way. In forcing something they have to accept that it might not work.

If the user decides they want to go changing things like that, and it breaks, I (as a software developer) am inclined to say - "you took the responsibility, it's your problem now". But the customer is never likely to see it that way - it's my code that's broken, not their settings [headshake]

Jack

DrunkenHyena
Quote:
Original post by Malder1
Thanks for your response. I'm really surprised, because Microsoft now recommends using Direct3D for 2D applications. But with what "quality"?


Just to be clear, this isn't an issue with D3D; it's an issue with the driver. Those driver control panels let users do far worse things than just forcing multisampling.

Make sure your product comes with a FAQ and make sure this is covered prominently.

SimmerD
The easiest way to disable AA under D3D is to render the entire scene to a texture instead of the back buffer.

This can have other negative consequences, like losing Z cull or Z compression, but that shouldn't matter for a 2D game.
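Something like this (a C++ sketch; the 512x512 size, the format and the g_pDevice name are just for illustration, error handling omitted):

// Sketch: create a non-multisampled render-target texture and draw the
// scene into it instead of the (possibly AA-forced) back buffer.
IDirect3DTexture9* pRT = NULL;
g_pDevice->CreateTexture(512, 512, 1, D3DUSAGE_RENDERTARGET,
                         D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, &pRT, NULL);

IDirect3DSurface9* pRTSurface = NULL;
pRT->GetSurfaceLevel(0, &pRTSurface);

g_pDevice->SetRenderTarget(0, pRTSurface); // AA is not applied to this target
g_pDevice->BeginScene();
// ... draw the 2D scene here as usual ...
g_pDevice->EndScene();
// ... then copy pRT to the back buffer, e.g. with a point-sampled quad ...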

Malder1
Thanks for the information!

Really, this problem doesn't matter in a 3D application, where the user decides on the quality and parameters of a game. But we're developing a 2D multimedia application, and the user won't even imagine that it works via 3D. I'm afraid he won't guess that he needs to temporarily turn off AA.

SimmerD,
Thanks for the nice idea!
But once we have the finished texture, how do we put it on the screen? If I add an additional BeginScene/EndScene with DrawPrimitive, won't this final texture be rendered to the screen with antialiasing?

matches81
IIRC anti-aliasing has nothing to do with the textures. If you render your scene to a texture, set the texture filter to D3DTEXF_POINT (no guarantee on that, but it should be something like that), and render a quad that fits your screen, only the edges of the quad should be anti-aliased, not the texture itself, so that should do the trick.
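In code that would be something like this (stage 0; g_pDevice and pRT as in the sketch above):

// Sketch: sample the scene texture with point filtering so the final
// quad draw adds no filtering blur of its own.
g_pDevice->SetTexture(0, pRT);
g_pDevice->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
g_pDevice->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
// ... then draw a screen-sized quad textured with pRT ...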

Malder1
matches81,
It doesn't help. I've tried displaying a 512x512 texture in a 512x512 window, with each texel mapped exactly to a screen pixel.
But when I enable *some modes* of full-screen antialiasing (for example, 8x, 16x, Quincunx) it blurs the *entire texture*, not only the edges.
However, there's no problem with 2x and 4x AA, which blur the edges and nothing more.
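For reference, this is how we draw the quad (a simplified sketch, including the usual D3D9 half-pixel offset for exact texel-to-pixel mapping):

// Sketch: pre-transformed 512x512 quad with the -0.5 offset that D3D9
// needs to map texels exactly onto pixels.
struct Vertex { float x, y, z, rhw, u, v; };
const float W = 512.0f, H = 512.0f;
Vertex quad[4] = {
    { -0.5f,    -0.5f,    0.0f, 1.0f, 0.0f, 0.0f },
    { W - 0.5f, -0.5f,    0.0f, 1.0f, 1.0f, 0.0f },
    { -0.5f,    H - 0.5f, 0.0f, 1.0f, 0.0f, 1.0f },
    { W - 0.5f, H - 0.5f, 0.0f, 1.0f, 1.0f, 1.0f },
};
g_pDevice->SetFVF(D3DFVF_XYZRHW | D3DFVF_TEX1);
g_pDevice->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(Vertex));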

Syranide
And are you sure it doesn't have anything to do with mipmapping? (It's a common mistake to forget to turn off mipmapping for 2D.)
It sounds strange, but blur from AA sounds even stranger.
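i.e. something like (stage 0 assumed):

// Sketch: disable mip filtering so minification never picks a blurrier mip level.
g_pDevice->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);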

Malder1
I don't know how the 8x, 16x, 4xS, 6xS and Quincunx antialiasing modes work on NVIDIA video cards, but all of the modes I listed blur both the edges and the entire texture.

SimmerD
The 4xS, 8xS and 16x modes under D3D are something I came up with a few years ago in the GF4 timeframe. They do a combination of super- and multisampling, so yes, they can blur things at times.

Make sure that you are doing point sampling on the quad, as mentioned before, to reduce any blurring you might get.

If the offscreen surface you render to is a texture created with D3DUSAGE_RENDERTARGET, then it shouldn't be antialiased, regardless of the control panel setting. The back buffer may be, but if you draw a quad with point sampling, or do a StretchRect() with point sampling, I don't see why it would be blurry...
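For example (a sketch continuing from the render-target code above; point-filtered StretchRect support can vary by device):

// Sketch: copy the finished render target to the back buffer with an
// explicit point filter, so the final copy adds no blur.
IDirect3DSurface9* pBackBuffer = NULL;
g_pDevice->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &pBackBuffer);
g_pDevice->SetRenderTarget(0, pBackBuffer);
g_pDevice->StretchRect(pRTSurface, NULL, pBackBuffer, NULL, D3DTEXF_POINT);
g_pDevice->Present(NULL, NULL, NULL, NULL);
pBackBuffer->Release();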

Malder1
SimmerD,
Thanks, but how do I quickly get this texture with my image onto the screen?
If I use Direct3D again to do it, it will blur the image at that step.

Of course, I could read the pixels of the texture back and draw them manually via BitBlt, but that would not be a fast method.

reltham
SimmerD: so you are saying that if I render my scene to a render target, I lose the quick Z cull and Z compression features of recent GPUs? Even if I make an RT that is the same size as the back buffer and I don't change the Z/stencil surface when I set my RT (so it uses the primary Z/stencil created when the device is created/reset)?

That's unexpected.

sirob
If you're making a new render target without changing the Z-buffer, and the original back buffer was multisampled, the back buffer would be bigger than your render target. That would mean your color buffer and Z/stencil buffer don't match in size, which wouldn't be a good idea, I think.

Malder1
So there is no way to prevent image blurring when antialiasing is enabled by the user?

Or does anybody know how to ask Microsoft about this problem? I'm really still surprised. They suggested using 3D for 2D as far back as DirectX 8.0, and even after three years they haven't solved the antialiasing problem in DirectX 9.
I can accept AA in 3D mode, but in 2D our users won't even guess that the blurring is caused by their settings for 3D mode! We'll receive a lot of emails to our technical support.

jollyjeffers
Quote:
Original post by Malder1
So there is no way to prevent image blurring when antialiasing is enabled by the user?

Not really - the crucial part is that it's the **user** forcing you to use AA [smile]

Quote:
Original post by Malder1
Or does anybody know how to ask Microsoft about this problem?

I'm not really sure what they could say about it if you did ask (use the newsgroups and/or the DIRECTXDEV mailing list if you want to try). As DrunkenHyena said earlier in the thread - it's not really their problem.

Quote:
Original post by Malder1
They suggested using 3D for 2D as far back as DirectX 8.0, and even after three years they haven't solved the antialiasing problem in DirectX 9.

In light of the previous comment(s), they don't really have a problem to solve!

Quote:
Original post by Malder1
We'll receive a lot of emails to our technical support.

Do you have a setup readme, FAQ, or known-issues section in your package? This sort of thing belongs in there.

If you're really worried about it, you might either try what SimmerD said, or you might be able to run some tests at load time to analyze the quality of the image (and consequently warn the user that it's "wrong")...

hth
Jack

Malder1
Thanks for all the responses! I really appreciate your help!
We'd like to create a high-quality multimedia application without these kinds of technical difficulties.

I only want to give my personal opinion. If the old DirectDraw didn't have such problems with quality, the new DirectX Graphics should provide exactly the same capability (especially since Microsoft says it replaces the old DirectDraw!). We really need the faster Direct3D for complex graphical effects with high image quality.

Earlier, DirectDraw had no options that could break image quality. I believe that 2D rendered via 3D should have the same high image accuracy and attention to overall quality. If Microsoft knows about these issues, why don't they want to solve them? I see a very simple solution - just add a flag to Direct3D initialization saying that we're doing 2D over 3D, and in that mode disable full-scene AA (except edge AA, optionally).

Promit
Quote:
Original post by SimmerD
The 4xS, 8xS and 16x modes under D3D are something I came up with a few years ago in the GF4 timeframe. They do a combination of super- and multisampling, so yes, they can blur things at times.


Sim, this is a bit OT I know, but were you responsible for these modes? Also, can you briefly describe what makes the xS and Quincunx modes different from the standard antialiasing modes? Lastly, is there any way you can enumerate and request these modes in an application?

SimmerD
Yes, here is the patent:

Clicky

There is no way to programmatically enable these modes. Unfortunately, they do have a bigger perf hit than normal multisampling, and we found that many games were just picking the largest AA mode available and running with it, which would kill the framerate when running 8xS on, say, a GeForce FX 5200.

So we decided they would be a control-panel-only feature.

Edited by Coder: Clickified patent link

[Edited by - Coder on July 1, 2005 9:57:49 PM]

SimmerD
Oh, the Quincunx modes do a Gaussian blur on the image after the back buffer is rendered. This does blur the image, but it also cleans up aliasing a bit.

The xS modes do a combination of multi- and supersampling to the same back and Z buffers. They do NOT inherently blur the image, but if you are doing a linear filter with StretchRect() or a quad, it can happen.

I just confirmed this with my own game engine, running 8xS vs. no AA, and minified text, for instance, does look a bit blurrier.
