
anti-aliasing problem

I've got a DX9 app that I thought was working perfectly on a wide range of cards. However, if someone has nvidia hardware and uses that terrible "override app and force anti-aliasing on" feature in their control panel, the app gets a black screen. It doesn't seem to be one specific nvidia card either; I've reproduced it on an FX5500, a GF2 and a GF3. The app does appear to be running and generates no errors, but the screen is black (the back buffer clear colour). Strangely, if I force anti-aliasing on via the desktop control panel on any ATI card, everything works fine. So I have what appears to be a perfectly functional game on all hardware except nvidia cards when the user forces anti-aliasing on. Has anyone ever seen anything like this before?

I have seen strange behaviour with forced AA on an old DX7 application, but so far not with a DX9 app. Do you have any operations that access the back buffer directly?

Or do you use MRT at all? Since AA and MRT aren't supported at the same time, the draw calls might not be getting through at all.

In case you have the full code for this app, you should try a few things:
1. Change the clear colour and check whether the screen shows that colour or stays black.
2. Force AA on inside the app and check what happens (see the sketch after this list).
3. I assume you have already run it with the debug runtime and checked the output.
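
For point 2, forcing multisampling from inside the app is just a change to the present parameters at device-creation time. A minimal sketch (illustrative names; m_pD3D and m_hWnd are assumed members like the ones in the setup code posted later in this thread):

// Request 2x multisampling at device creation. Multisampling
// requires the DISCARD swap effect, and the sample count should
// really be validated against the device caps first.
D3DPRESENT_PARAMETERS pp;
memset( &pp, 0x00, sizeof(pp) );
pp.BackBufferWidth  = width;
pp.BackBufferHeight = height;
pp.BackBufferFormat = D3DFMT_R5G6B5;
pp.hDeviceWindow    = m_hWnd;
pp.Windowed         = TRUE;
pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;       // required for multisampling
pp.MultiSampleType  = D3DMULTISAMPLE_2_SAMPLES;    // the forced-AA test case

IDirect3DDevice9* pDevice = NULL;
HRESULT hr = m_pD3D->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, m_hWnd,
                                   D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                   &pp, &pDevice );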

1. done - it showed the colour I'd expect.
2. done - same behaviour as when it is forced on in the control panel, no game errors reported.
3. yes, I get a few of the following warnings:

"Ignoring redundant SetTextureStageState"
"Ignoring redundant SetSamplerState"
"Ignoring redundant SetRenderState"

Which I assumed were unimportant, and one error:

"Unsupported independent write mask on device"

Not really sure what is causing that one or what it means.
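
(Those redundant-state warnings are usually harmless. For reference, a hypothetical wrapper like the sketch below is the usual way to silence them, by filtering out duplicate calls before they reach the device:)

#include <map>
#include <d3d9.h>

// Hypothetical render-state cache - not from the original app -
// that skips calls which would not change the current state.
class StateCache
{
    IDirect3DDevice9* m_pDevice;
    std::map<D3DRENDERSTATETYPE, DWORD> m_states;
public:
    explicit StateCache( IDirect3DDevice9* pDevice ) : m_pDevice( pDevice ) {}

    HRESULT Set( D3DRENDERSTATETYPE state, DWORD value )
    {
        std::map<D3DRENDERSTATETYPE, DWORD>::iterator it = m_states.find( state );
        if( it != m_states.end() && it->second == value )
            return D3D_OK;                     // unchanged - skip the device call
        m_states[state] = value;
        return m_pDevice->SetRenderState( state, value );
    }
};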

That is strange, as independent write masks are only used together with MRTs or multi-element textures.

Do you have any D3DRS_COLORWRITEENABLE, D3DRS_COLORWRITEENABLE1, D3DRS_COLORWRITEENABLE2 or D3DRS_COLORWRITEENABLE3 render state settings in your code?

Any SetRenderTarget calls? Maybe you have used a wrong index there.

Anyway, you can try the "Break on Error" feature in the debug runtime.
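
(A quick way to inspect those states at runtime is GetRenderState - an illustrative sketch; note that Get* state calls don't return real data on a device created with D3DCREATE_PUREDEVICE, so this assumes a non-pure debug device:)

// Dump the four colour-write masks to the debug output.
D3DRENDERSTATETYPE masks[4] = { D3DRS_COLORWRITEENABLE,  D3DRS_COLORWRITEENABLE1,
                                D3DRS_COLORWRITEENABLE2, D3DRS_COLORWRITEENABLE3 };
for( int i = 0; i < 4; ++i )
{
    DWORD value = 0;
    m_pDevice->GetRenderState( masks[i], &value );
    char buffer[64];
    sprintf( buffer, "COLORWRITEENABLE%d = 0x%08lX\n", i, value );
    OutputDebugStringA( buffer );
}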

Quote:
Do you have any D3DRS_COLORWRITEENABLE, D3DRS_COLORWRITEENABLE1, D3DRS_COLORWRITEENABLE2 or D3DRS_COLORWRITEENABLE3 render state settings in your code?


There is only the one reference to D3DRS_COLORWRITEENABLE, which sets it to the default 0x0f value (all four channels enabled).

Quote:
Any SetRenderTargets? Maybe you have used a wrong index there.


I do have a SetRenderTarget call, but that only kicks in part way through the game when the player does something specific. Initially everything just renders to the default target, and the anti-aliasing black-screen bug is there right from the start.
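
(For reference, the usual save-and-restore pattern for a temporary render target, always on index 0 unless MRTs are genuinely in use - pOffscreen is a hypothetical render-target surface created elsewhere:)

// Remember the current target, draw to the offscreen one, restore.
IDirect3DSurface9* pOldRT = NULL;
m_pDevice->GetRenderTarget( 0, &pOldRT );
m_pDevice->SetRenderTarget( 0, pOffscreen );
// ... draw into the offscreen surface ...
m_pDevice->SetRenderTarget( 0, pOldRT );
pOldRT->Release();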

Quote:
Anyway you can try to set the “Break on Error” feature.


I've tried using the "break on error" feature, which normally works fine, but in the case of the write-mask error I don't get a full call stack: it stops inside ntdll.dll and I can only walk back as far as d3d9d.dll.


I also tried downloading the latest DX SDK, as I was still on the xmas release, but it made no difference.

Quote:
Original post by Poo Bear
I've tried using the "break on error" feature, which normally works ok, but in the case of the write mask error I don't get a full call stack, it stops inside ntdll.dll and I can only back up as far as d3d9d.dll.


Have you tried it the other way round - stepping over the D3D calls until the error message appears?

Ah right, there was some other code trying to set default values on D3DRS_COLORWRITEENABLE, but it was setting the mask to 0x00 when it should have been 0x0f. Silly mistake.
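
(Spelled out with the named channel flags, the corrected default looks like this - Set() being the app's own wrapper around SetRenderState():)

// 0x0f = all four channels enabled.
Set( D3DRS_COLORWRITEENABLE, D3DCOLORWRITEENABLE_RED   |
                             D3DCOLORWRITEENABLE_GREEN |
                             D3DCOLORWRITEENABLE_BLUE  |
                             D3DCOLORWRITEENABLE_ALPHA );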


Anyway...


I now have no DX-reported errors whatsoever (which is nice). I still have the same anti-aliasing bug though - curses :)

I am officially out of ideas. The last thing you can try is the refrast. If it works there, you should contact nVidia's developer support, because there is a high chance you have found a driver bug.

This is the device setup code:

D3DPRESENT_PARAMETERS m_newRVars;
memset( &m_newRVars, 0x00, sizeof(m_newRVars) );
m_newRVars.BackBufferWidth  = width;
m_newRVars.BackBufferHeight = height;
m_newRVars.BackBufferFormat = D3DFMT_R5G6B5;
m_newRVars.hDeviceWindow    = m_hWnd;
m_newRVars.Windowed         = TRUE;
m_newRVars.SwapEffect       = D3DSWAPEFFECT_DISCARD;
HRESULT hr = m_pD3D->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, m_hWnd,
                                   D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_PUREDEVICE,
                                   &m_newRVars, &m_pDevice );

Now that works ok on the ref rast, but if I add this line to get anti-aliasing on:

m_newRVars.MultiSampleType = D3DMULTISAMPLE_2_SAMPLES;

then it seems to crash inside CreateDevice.

The reference rasterizer only supports 4x and 9x multisampling (in addition to the non-maskable modes).
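
Rather than hard-coding a sample count, it can be validated up front. A sketch using the standard caps check (names follow the setup code above):

// Ask D3D whether 2x multisampling is available for this device
// type and back-buffer format before using it in the present
// parameters; works for HAL and REF alike.
DWORD qualityLevels = 0;
if( SUCCEEDED( m_pD3D->CheckDeviceMultiSampleType( D3DADAPTER_DEFAULT, D3DDEVTYPE_REF,
                                                   D3DFMT_R5G6B5, TRUE,
                                                   D3DMULTISAMPLE_2_SAMPLES,
                                                   &qualityLevels ) ) )
{
    m_newRVars.MultiSampleType = D3DMULTISAMPLE_2_SAMPLES;
}
// else: fall back to D3DMULTISAMPLE_NONE or try another sample count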

Ah, thank you. OK, the ref rast now works with anti-aliasing activated, and the same problem is reproduced: no errors, but a black screen. So it isn't an nvidia driver bug, it's something in my code that just happens to work OK on non-nvidia hardware. :(

Try using a 32-bit format instead of D3DFMT_R5G6B5. Dunno if that'll do anything.

As I don't know your render loop, I can't say whether it can be split up, but you should try dividing it into parts to test which of them causes the problem.

Do you have a function that sets all render states to default values after device creation?

Give that man a cigar!

I removed my setDefaultRenderStates code; everything went a bit crazy, as you would imagine, but with anti-aliasing on it did actually draw something. It took a while to sift through all 102 default states, but I found the culprit:

Set( D3DRS_MULTISAMPLEMASK, 0x00 );

This was just a daft error; the default state should be as below:

Set( D3DRS_MULTISAMPLEMASK, 0xFFFFFFFF );


With the mask set to 0x00, writes to every sample are disabled, so nothing ever reaches the back buffer - which explains the clear-colour black screen. Strange that it only seems to cause a problem on the ref rast and on nvidia boards, but at least it's fixed.


Thanks everyone :)

Quote:
Original post by Poo Bear
Strange that it only seems to cause a problem on the ref rast and on nvidia boards, but at least it's fixed.
Well, that would indicate that ATI are the bad ones here - the RefRast is your benchmark for what is (and isn't) correct [wink]

The ATI drivers are either being nice, guessing that you didn't want to set that render state the way you did and changing it, or possibly just ignoring the render state entirely.

From my recent experience, the ATI drivers do seem to be a lot more tolerant of erroneous parameters and render states than nvidia hardware.

Cheers,
Jack

Quote:
Original post by Poo Bear
Strange that it only seems to cause a problem on the ref rast and on nvidia boards, but at least it's fixed.


Sounds like ATI interprets the 0 mask in the wrong way.

PS: nVidia's driver is not entirely clean either, because if you don't force AA inside the app it should ignore the multisample mask.

Quote:
Original post by Demirug
PS: nVidia's driver is not entirely clean either, because if you don't force AA inside the app it should ignore the multisample mask.


Quote:
From the DirectX SDK
This render state has no effect when rendering to a single sample buffer.


But when the user forces anti-aliasing, the rendering effectively does happen to multiple sample buffers. This is a matter of interpretation of the specification.
