How can an application forcibly disable antialiasing?

Started by
21 comments, last by SimmerD 18 years, 9 months ago
We're developing a 2D-style application using Direct3D. Everything works and the textures are very sharp. But when the user forces certain kinds of full-scene antialiasing (supersampling) - Quincunx, 4xS, 8x, 16x (on NVIDIA video cards) - it blurs the entire texture! Is there a way to disable all antialiasing from within my application? I've tried d3dpp -> MultiSampleType := D3DMULTISAMPLE_NONE but it doesn't help.
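(For reference, in C++ the setting being tried corresponds to something like the sketch below; this is a plain illustration of the present-parameters field, not a fix, since the driver control panel can still force AA on top of whatever the application requests.)

```cpp
// Sketch: requesting no multisampling in the D3D9 present parameters.
// A driver-level "force AA" setting can still override this.
D3DPRESENT_PARAMETERS d3dpp = {};
d3dpp.Windowed           = TRUE;
d3dpp.SwapEffect         = D3DSWAPEFFECT_DISCARD;
d3dpp.BackBufferFormat   = D3DFMT_UNKNOWN;
d3dpp.MultiSampleType    = D3DMULTISAMPLE_NONE;  // no multisampling requested
d3dpp.MultiSampleQuality = 0;
```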
Sadly there isn't an easy way to do this. If the user (via the NVIDIA/ATI control panel) chooses to "force AA", then that's what's going to happen - the driver has the final say in the matter [headshake]

It is, theoretically, possible to "hack" the drivers and force/check various settings, but that's rarely documented and a pretty grey area (it varies across driver revisions, and should you really override what the user wants?).

Your best bet is probably just to include a note that "This game is not compatible with ___ forms of anti-aliasing; we recommend you disable them whilst playing".

hth
Jack

<hr align="left" width="25%" />
Jack Hoxley <small>[</small><small> Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]</small>

Thanks for your response. I was really surprised, because Microsoft now recommends using Direct3D for 2D applications. But at what "quality"?
Quote:Original post by Malder1
Microsoft now recommends using Direct3D for 2D applications. But at what "quality"?

Not quite sure I get you here... are you wondering why they recommend using D3D when you get quality issues like the ones you're describing?

It's a debatable topic, but drivers that allow a user to force particular settings should make it quite clear that they are doing just that - forcing an application to behave in a particular way. In forcing something, the user has to accept that it might not work.

If the user decides they want to go changing things like that and it breaks, I (as a software developer) am inclined to say: "you took the responsibility, it's your problem now". But the customer is never likely to see it that way - it's my code that's broken, not their settings [headshake]

Jack


Quote:Original post by Malder1
Thanks for your response. I was really surprised, because Microsoft now recommends using Direct3D for 2D applications. But at what "quality"?


Just to be clear this isn't an issue with D3D, it's an issue with the driver. Those driver control panels let users do far worse things than just force multisampling.

Make sure your product comes with a FAQ and make sure this is covered prominently.
Stay Casual,
Ken
Drunken Hyena
The easiest way to disable AA under D3D is to render the entire scene to a texture instead of the back buffer.

This can have other negative consequences, like the loss of Z-cull or Z compression, but that shouldn't matter for a 2D game.
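A minimal sketch of that idea, assuming a D3D9 device pointer (`device`) and a known back-buffer size; the names `sceneTex`, `sceneSurf`, `width`, and `height` are placeholders, not from the original posts:

```cpp
// Sketch: render the 2D scene into a non-multisampled render-target
// texture instead of the back buffer, so driver-forced AA never
// resamples the scene itself.
IDirect3DTexture9* sceneTex   = NULL;
IDirect3DSurface9* sceneSurf  = NULL;
IDirect3DSurface9* backBuffer = NULL;

device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &sceneTex, NULL);
sceneTex->GetSurfaceLevel(0, &sceneSurf);

device->GetRenderTarget(0, &backBuffer);  // remember the real back buffer
device->SetRenderTarget(0, sceneSurf);    // draw the whole 2D scene here
// ... BeginScene / draw everything / EndScene ...
device->SetRenderTarget(0, backBuffer);   // then present the texture
// ... draw a full-screen quad textured with sceneTex ...

sceneSurf->Release();
backBuffer->Release();
```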
Thanks for the information!

Really, this problem doesn't matter much in a 3D application, where the user decides on the quality settings of the game. But we're developing a 2D multimedia application, and the user won't even imagine that it runs in a 3D mode. I'm afraid he won't guess that he needs to temporarily turn off AA.

SimmerD,
Thanks for the nice idea!
But once we have the finished texture, how do we put it on the screen? If I add an additional BeginScene/EndScene with DrawPrimitive, won't this final texture be rendered to the screen with antialiasing?
IIRC anti-aliasing has nothing to do with the textures themselves. If you render your scene to a texture, set the texture filter to D3DTEXF_POINT (no guarantee on the exact name, but it should be something like that), and render a quad that fits your screen, only the edges of the quad should be anti-aliased, not the texture itself, so this should do the trick.
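A sketch of that final copy, assuming pre-transformed (XYZRHW) vertices; the helper name `buildScreenQuad` is made up for illustration. It includes the classic D3D9 half-texel offset (shift by -0.5) so each texel maps exactly onto one screen pixel:

```cpp
#include <cassert>

// Pre-transformed screen-space vertex (matches D3DFVF_XYZRHW | D3DFVF_TEX1).
struct ScreenVertex { float x, y, z, rhw, u, v; };

// Build a full-screen triangle-strip quad with the D3D9 half-texel offset:
// in D3D9 screen space pixel centers sit at integer coordinates, so the
// quad is shifted by -0.5 for an exact 1:1 texel-to-pixel mapping.
void buildScreenQuad(float width, float height, ScreenVertex out[4]) {
    const float l = -0.5f,         t = -0.5f;
    const float r = width - 0.5f,  b = height - 0.5f;
    out[0] = { l, t, 0.0f, 1.0f, 0.0f, 0.0f };
    out[1] = { r, t, 0.0f, 1.0f, 1.0f, 0.0f };
    out[2] = { l, b, 0.0f, 1.0f, 0.0f, 1.0f };
    out[3] = { r, b, 0.0f, 1.0f, 1.0f, 1.0f };
}

// The draw itself would then look like this (D3D9 calls, sketch only):
//   device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
//   device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
//   device->SetTexture(0, sceneTex);
//   device->SetFVF(D3DFVF_XYZRHW | D3DFVF_TEX1);
//   device->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(ScreenVertex));
```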
matches81,
It doesn't help. I've tried displaying a 512x512 texture in a 512x512 window, with each texel mapped exactly to a screen pixel.
But when I enable *some modes* of full-screen antialiasing (for example, 8x, 16x, Quincunx), it blurs the *entire texture*, not only the edges.
However, there is no problem with 2x and 4x AA, which blur the edges and nothing more.
And are you sure it doesn't have anything to do with mipmapping? (It's a common mistake to forget to turn off mipmapping for 2D.)
Sounds strange, but blur from AA sounds even stranger.
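The mipmapping check suggested above could be sketched like this in D3D9 (variable names are placeholders); creating the texture with a single level and disabling the mip filter rules mipmaps out as the source of the blur:

```cpp
// Sketch: rule out mipmap blur for a 2D quad (D3D9).
// Create the texture with exactly one mip level...
IDirect3DTexture9* tex = NULL;
device->CreateTexture(512, 512, 1 /* one level, no mip chain */, 0,
                      D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, &tex, NULL);
// ...and make sure no mip filtering is applied when sampling it.
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);
```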


This topic is closed to new replies.
