anti-aliasing problem
I am officially out of ideas. The last thing you can try is the refrast. If it works, you should contact developer support at nVidia, because there is a high chance that you have found a driver bug.
This is the device setup code:
D3DPRESENT_PARAMETERS m_newRVars;
memset( &m_newRVars, 0x00, sizeof(m_newRVars) );
m_newRVars.BackBufferWidth = width;
m_newRVars.BackBufferHeight = height;
m_newRVars.BackBufferFormat = D3DFMT_R5G6B5;
m_newRVars.hDeviceWindow = m_hWnd;
m_newRVars.Windowed = TRUE;
m_newRVars.SwapEffect = D3DSWAPEFFECT_DISCARD;
m_pD3D->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, m_hWnd, D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_PUREDEVICE, &m_newRVars, &m_pDevice );
Now that works OK on the ref rast, but if I add this line to turn anti-aliasing on:
m_newRVars.MultiSampleType = D3DMULTISAMPLE_2_SAMPLES;
then it seems to crash inside CreateDevice.
The reference rasterizer only supports 4x and 9x multisampling (in addition to the non-masked modes).
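Rather than memorising which sample counts each device type supports, you can ask the runtime before creating the device. A minimal sketch, reusing the member names from the code above (`m_pD3D`, `m_newRVars` and friends are the thread's own; error handling is omitted):

```cpp
// Probe which multisample types this adapter/device/format combination
// supports before putting one into the present parameters. A type for
// which CheckDeviceMultiSampleType fails will make CreateDevice fail too.
DWORD qualityLevels = 0;
for ( int samples = D3DMULTISAMPLE_2_SAMPLES;
      samples <= D3DMULTISAMPLE_16_SAMPLES; ++samples )
{
    HRESULT hr = m_pD3D->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_REF,                    // same device type you will create
        D3DFMT_R5G6B5,                     // same back buffer format
        TRUE,                              // windowed, as in the setup code
        (D3DMULTISAMPLETYPE)samples,
        &qualityLevels );
    if ( SUCCEEDED( hr ) )
    {
        // 'samples' is safe to put in m_newRVars.MultiSampleType;
        // on the ref rast only 4 and 9 should succeed here.
    }
}
```

Picking the first (or highest) sample count that succeeds, instead of hard-coding `D3DMULTISAMPLE_2_SAMPLES`, avoids this class of CreateDevice failure on every device type.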
Ah, thank you. OK, the ref rast now works with anti-aliasing activated, and the same problem is reproduced: no errors, but a black screen. So it isn't an nVidia driver bug; it's something in my code that just happens to work OK on non-nVidia hardware. :(
As I don't know your render loop, I can't say whether it is feasible, but you should try dividing it into parts to test which of them causes this problem.
Do you have a function that sets all renderstates to a default value after device creation?
Give that man a cigar!
I removed my setDefaultRenderStates code; everything went a bit crazy, as you would imagine, but with anti-aliasing on it did actually draw something. It took a while to sift through all 102 default states, but I found the culprit:
Set( D3DRS_MULTISAMPLEMASK, 0x00 );
This was just a daft error; the default state should be as below:
Set( D3DRS_MULTISAMPLEMASK, 0xFFFFFFFF );
Strange that it only seems to cause a problem in the ref rast and on nVidia boards, but at least it's fixed.
Thanks everyone :)
Quote: Original post by Poo Bear
Strange that it only seems to cause a problem in the ref rast and on nVidia boards, but at least it's fixed.
Well, that would indicate that ATI were the bad ones - the RefRast is your benchmark for what is (and isn't) correct [wink]
The ATI drivers are either being nice, guessing that you didn't mean to set that render state the way you did and changing it, or possibly just ignoring that render state entirely.
From my recent experience, the ATI drivers do seem to be a lot more tolerant of erroneous parameters and render states than nVidia hardware.
Cheers,
Jack
Quote:Original post by Poo Bear
Give that man a cigar!
I removed my setDefaultRenderStates code; everything went a bit crazy, as you would imagine, but with anti-aliasing on it did actually draw something. It took a while to sift through all 102 default states, but I found the culprit:
Set( D3DRS_MULTISAMPLEMASK, 0x00 );
This was just a daft error; the default state should be as below:
Set( D3DRS_MULTISAMPLEMASK, 0xFFFFFFFF );
Strange that it only seems to cause a problem in the ref rast and on nVidia boards, but at least it's fixed.
Thanks everyone :)
Sounds like ATI interprets the 0 mask in the wrong way.
PS: nVidia's driver is not entirely clean either, because if you don't force AA inside the app, it should ignore the multisample mask.