Malder1

How can an application forcibly disable antialiasing?

This topic is 4766 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.


Recommended Posts

We're developing a 2D-like application using Direct3D. Everything is fine and the textures are very sharp. But when the user enables certain kinds of full-scene antialiasing (supersampling) - Quincunx, 4xS, 8x, 16x (on NVIDIA video cards) - it blurs the entire texture! Is there a way to disable any antialiasing in my application? I've tried d3dpp -> MultiSampleType := D3DMULTISAMPLE_NONE but it doesn't help.
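For reference, requesting a non-multisampled back buffer looks roughly like this in C++/D3D9 (a minimal sketch; `d3d` and `hWnd` are assumed to exist, and - as the replies below explain - the driver control panel can still override this):

```cpp
// Sketch: ask for a back buffer with no multisampling (D3D9).
// A "force AA" setting in the driver control panel can still win.
D3DPRESENT_PARAMETERS d3dpp = {};
d3dpp.Windowed           = TRUE;
d3dpp.SwapEffect         = D3DSWAPEFFECT_DISCARD;
d3dpp.BackBufferFormat   = D3DFMT_UNKNOWN;
d3dpp.MultiSampleType    = D3DMULTISAMPLE_NONE;  // what the poster tried
d3dpp.MultiSampleQuality = 0;

IDirect3DDevice9* device = nullptr;
d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                  D3DCREATE_HARDWARE_VERTEXPROCESSING,
                  &d3dpp, &device);
```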

Sadly there isn't an easy way to do this. If the user (via the NVIDIA/ATI control panel) chooses to "force AA", then that's what's going to happen - the driver has the final say in the matter [headshake]

It is, theoretically, possible to "hack" the drivers and force or check various settings, but that's rarely documented and a pretty grey area (it varies between driver revisions, and should you really override what the user wants?).

Your best bet is probably just to make a note that "This game is not compatible with ___ forms of anti-aliasing; we recommend you disable them whilst playing".

hth
Jack

Thanks for your response. I was really surprised, because Microsoft now recommends using Direct3D for 2D applications. But at what "quality"?

Quote:
Original post by Malder1
Microsoft now recommends using Direct3D for 2D applications. But at what "quality"?

Not quite sure I get you here... are you wondering why they recommend using D3D when you get quality issues like the ones you're describing?

It's a debatable topic, but the drivers that allow a user to force particular settings should make it quite clear that they are doing just that - forcing an application to behave in a particular way. In forcing something they have to accept that it might not work.

If the user decides they want to go changing things like that, and it breaks, then I (as a software developer) am inclined to say "you took the responsibility, it's your problem now". But the customer is never likely to see it that way - it's my code that's broken, not their settings [headshake]

Jack

Quote:
Original post by Malder1
Thanks for your response. I was really surprised, because Microsoft now recommends using Direct3D for 2D applications. But at what "quality"?


Just to be clear, this isn't an issue with D3D; it's an issue with the driver. Those driver control panels let users do far worse things than just force multisampling.

Make sure your product comes with a FAQ and make sure this is covered prominently.

The easiest way to disable AA under D3D is to render the entire scene to a texture instead of the back buffer.

This can have other negative consequences, like the loss of Z-cull or Z-compression, but that shouldn't matter for a 2D game.
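SimmerD's suggestion, sketched in C++/D3D9 (assuming an existing `device` plus `width`/`height`; error checking omitted). A texture render target is never multisampled, so forced AA cannot resolve into it:

```cpp
// Create a render-target texture the size of the window.
IDirect3DTexture9* rtTex = nullptr;
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_X8R8G8B8, D3DPOOL_DEFAULT, &rtTex, nullptr);

IDirect3DSurface9* rtSurf = nullptr;
rtTex->GetSurfaceLevel(0, &rtSurf);

// Remember the real back buffer, then redirect rendering to the texture.
IDirect3DSurface9* backBuffer = nullptr;
device->GetRenderTarget(0, &backBuffer);
device->SetRenderTarget(0, rtSurf);

// ... draw the entire 2D scene here ...

// Restore the back buffer and draw rtTex as a full-screen quad.
device->SetRenderTarget(0, backBuffer);
```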

Thanks for the information!

Really, this problem doesn't matter in a 3D application, where the user decides on the quality and parameters of a game. But we're developing a 2D multimedia application, and the user won't even imagine that it works via a 3D mode. I'm afraid he won't guess that it's necessary to temporarily turn off AA.

SimmerD,
Thanks for the nice idea!
But once we have the rendered texture, how do we put it on the screen? If I add an additional BeginScene/EndScene with a DrawPrimitive, won't this final texture be rendered to the screen with antialiasing?

IIRC anti-aliasing has nothing to do with the textures. If you render your scene to a texture, set the texture filter to D3DTEXF_POINT (no guarantee on that, but it should be something like that), and render a quad that fits your screen, only the edges of the quad should be anti-aliased, not the texture itself, so this should do the trick.
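A sketch of the sampler setup matches81 describes (D3DTEXF_POINT is indeed the right constant). Assuming `device`, `rtTex`, `width`, and `height` from earlier in the thread; note that for an exact 1:1 texel-to-pixel mapping with pretransformed vertices, D3D9 also needs the well-known half-texel offset:

```cpp
// Point sampling: copy texels as-is, no filtering, no mip lookups.
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);

// Full-screen quad with pretransformed coordinates (D3DFVF_XYZRHW).
// The -0.5f shift aligns texel centers with pixel centers in D3D9.
struct Vertex { float x, y, z, rhw, u, v; };
const float w = (float)width, h = (float)height;
const Vertex quad[4] = {
    { -0.5f,    -0.5f,    0.0f, 1.0f, 0.0f, 0.0f },
    { w - 0.5f, -0.5f,    0.0f, 1.0f, 1.0f, 0.0f },
    { -0.5f,    h - 0.5f, 0.0f, 1.0f, 0.0f, 1.0f },
    { w - 0.5f, h - 0.5f, 0.0f, 1.0f, 1.0f, 1.0f },
};
device->SetFVF(D3DFVF_XYZRHW | D3DFVF_TEX1);
device->SetTexture(0, rtTex);
device->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(Vertex));
```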

matches81,
It doesn't help. I've tried displaying a 512x512 texture in a 512x512 window, with each texel mapped exactly to a screen pixel.
But when I enable *some modes* of full-screen antialiasing (for example, 8x, 16x, Quincunx) it blurs the *whole texture*, not only the edges.
However, there's no problem with 2x or 4x AA, which blur the edges and nothing more.

And are you sure it doesn't have anything to do with mipmapping? (It's a common mistake to forget to turn off mipmapping for 2D.)
Sounds strange, but blur from AA sounds even stranger.
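To rule mipmapping out, two things are usually enough in D3D9 (a sketch; `device` and `tex` assumed):

```cpp
// Never sample from smaller mip levels.
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);

// And/or create the texture with exactly one mip level,
// so no mip chain exists in the first place.
IDirect3DTexture9* tex = nullptr;
device->CreateTexture(512, 512, /*Levels=*/1, 0,
                      D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, &tex, nullptr);
```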
