[SOLVED] [DX9] Texture NVIDIA Problem (ATI is "fine")

4 comments, last by Zoner 13 years, 3 months ago
Hi, I have a problem that I don't know how to solve and maybe you could give me a hand.


I have a deferred shading system that works nicely with my ATI card, but when I tried to run it on other computers, everything was fine on the ones with ATI cards, while on the NVIDIA ones all I got was a black screen. :S

I was confident that by developing my engine on an ATI card, this kind of problem would not appear... Silly me! :)

Using the D3D debug runtime I saw that the main problem was that I had just one depth-stencil buffer for all my render targets. After creating one for each RT, the black screen disappeared and all my meshes appeared! But, and here comes the actual problem, the resulting image was not exactly what I expected.

In my deferred system I draw the result of the final rendering into another RT and later use its texture to draw a full-screen quad.

I use the same resolution for all the render targets I use and also for the window.

Here is the issue: when I run it on an ATI card everything works fine and clear, except that I get a diagonal across the image that breaks it a little. I think I could live with that.
But when I run it on an NVIDIA card, the image is "scaled". It's as if the image were at another resolution and the system scaled it to fit my full-screen quad, deforming it a little, but enough to make some pixels twice as big as they should be, leaving artifacts across the scene.

Testing different situations, I've seen that if I use a 1:1 resolution like 800x800 for all my RTs and for the window size, everything appears as clear as it should! ATI no longer breaks the image with the diagonal, and NVIDIA's image is crystal clear.

D3D does not show any warning or error, and I don't know how to approach the problem.

I've looked around but it was fruitless. Is there any post compiling all the differences between ATI and NVIDIA cards to keep in mind when programming for both? Searching, all I ever find is that ATI is stricter (not in this case), the usual ps_3_0/vs_3_0 pairing error, or the DrawPrimitive() call differences!

Any ideas, guys?
Thanks!
Hi there. It may just be a power-of-two thing with one or all of your textures; go and check whether all the images you use are power-of-2 sized.
It is probably one of these problems:

* as ankhd suggested, non-power-of-2 sized textures.
* render targets with different bit depths (e.g. ARGB32 for the diffuse map, R32F for the depth map, ARGB16 for the material map, ARGB8 for another): all the bit depths of simultaneously bound RTs must be the same. A caps sketch for checking both follows.
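A way to confirm both suspects on the target hardware is a device caps query; a minimal sketch, assuming d3ddev is the IDirect3DDevice9* used elsewhere in this thread:

D3DCAPS9 caps;
d3ddev->GetDeviceCaps( &caps );

// Power-of-2: if POW2 is set alone, non-pow2 textures are unsupported; if
// NONPOW2CONDITIONAL is also set, they work with restrictions only (clamp
// addressing, no mipmaps, no wrapping).
bool pow2Only = ( caps.TextureCaps & D3DPTEXTURECAPS_POW2 ) &&
               !( caps.TextureCaps & D3DPTEXTURECAPS_NONPOW2CONDITIONAL );

// MRT bit depths: without this cap, all simultaneously bound render
// targets must share the same bit depth.
bool mixedRTDepthsOK =
    ( caps.PrimitiveMiscCaps & D3DPMISCCAPS_MRTINDEPENDENTBITDEPTHS ) != 0;
DWORD maxRTs = caps.NumSimultaneousRTs; // how many RTs can be bound at once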

hth.
-R
There's no "hard", and "the impossible" takes just a little time.
Here is an image of the diagonal artifact on the ATI card:


[Image: problem1b.th.jpg, the diagonal artifact on the ATI card]



Now I'm sure that's not OK at all.
It's interesting that the font I use for debugging (ID3DXFont) is also deformed.

I render the scene into a render target of 1280x720 and then draw the resulting texture on a full-screen quad. The window is at the same resolution (1280x720).
I'm sure it's not a render target issue because I've saved the resulting texture to disk and it's fine. The deformation must occur when I draw it on the full-screen quad, or maybe later.
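For reference, the disk dump can be done with a single D3DX call (a sketch; the file name is arbitrary and R2T.getTexture() is the accessor used in the drawing code below):

D3DXSaveTextureToFile( "rt_dump.png", D3DXIFF_PNG, R2T.getTexture(), NULL );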

Here is my full screen quad code:



void drawFullScreenQuad( LPDIRECT3DTEXTURE9 texture )
{
    // Turn off the fixed-function lighting for the textured quad
    d3ddev->SetRenderState( D3DRS_LIGHTING, FALSE );

    d3ddev->SetTexture( 0, texture );

    // Identity world/view, simple -1..1 orthographic projection
    D3DXMATRIX m;
    D3DXMatrixIdentity( &m );
    d3ddev->SetTransform( D3DTS_WORLD, &m );
    d3ddev->SetTransform( D3DTS_VIEW, &m );
    D3DXMatrixOrthoOffCenterRH( &m, -1, 1, -1, 1, -1, 1 );
    d3ddev->SetTransform( D3DTS_PROJECTION, &m );

    // Disable ztest & zwrite
    d3ddev->SetRenderState( D3DRS_ZENABLE, FALSE );
    d3ddev->SetRenderState( D3DRS_ZWRITEENABLE, FALSE );

    // --------------------------------
    // Note: with this projection, y = -1 is the bottom of the screen,
    // so "top" below actually holds the bottom edge (and vice versa).
    float left   = -1.0f;
    float right  =  1.0f;
    float top    = -1.0f;
    float bottom =  1.0f;

    // Quad as a 4-vertex triangle strip
    TEXTUREVERTEX vtxs[ 4 ];
    vtxs[0].x = left;  vtxs[0].y = top;    vtxs[0].u = 0.0f; vtxs[0].v = 1.0f;
    vtxs[1].x = left;  vtxs[1].y = bottom; vtxs[1].u = 0.0f; vtxs[1].v = 0.0f;
    vtxs[2].x = right; vtxs[2].y = top;    vtxs[2].u = 1.0f; vtxs[2].v = 1.0f;
    vtxs[3].x = right; vtxs[3].y = bottom; vtxs[3].u = 1.0f; vtxs[3].v = 0.0f;
    vtxs[0].z = vtxs[1].z = vtxs[2].z = vtxs[3].z = 0.0f;

    d3ddev->SetFVF( TEXTUREVERTEX::getFVF( ) );
    d3ddev->DrawPrimitiveUP( D3DPT_TRIANGLESTRIP, 2, vtxs, sizeof( TEXTUREVERTEX ) );

    // Restore z state, unbind the texture
    d3ddev->SetRenderState( D3DRS_ZENABLE, TRUE );
    d3ddev->SetRenderState( D3DRS_ZWRITEENABLE, TRUE );

    d3ddev->SetTexture( 0, NULL );

    // Turn the fixed-function lighting back on
    d3ddev->SetRenderState( D3DRS_LIGHTING, TRUE );
}
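A side note on quads like this, separate from the power-of-two issue that turns out to be the culprit below: D3D9 maps texels to pixels with a half-pixel offset, so an unadjusted -1..1 quad samples every texel slightly off-center and softens the image. A minimal sketch of the usual correction, applied to vtxs before the DrawPrimitiveUP call; width and height are assumed to hold the backbuffer dimensions (those names are mine, not from the code above):

// One pixel spans 2/width units in clip space, so half a pixel is 1/width.
float halfPixelX = 1.0f / (float)width;
float halfPixelY = 1.0f / (float)height;
for( int i = 0; i < 4; ++i )
{
    vtxs[i].x -= halfPixelX; // shift half a pixel left
    vtxs[i].y += halfPixelY; // shift half a pixel up
}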




And this is my drawing code:




//Render all geometry
CScenarioManager::getInstance()->render();

d3ddev->Clear( 0, NULL, D3DCLEAR_TARGET|D3DCLEAR_ZBUFFER, D3DCOLOR_XRGB( 0, 0, 0), 1.0f, 0 );

// begins the 3D scene
HR(d3ddev->BeginScene());

// Render texture
drawFullScreenQuad(R2T.getTexture());

// This part is the text that is also deformed
textDebbugInfo->render();

// ends the 3D scene
HR(d3ddev->EndScene());

// displays the created frame
d3ddev->Present(NULL, NULL, NULL, NULL);




I have no image of the NVIDIA results at the moment, but I'll add one tomorrow. The diagonal artifact does not appear there, but the resulting image looks like something scaled up in Photoshop.

Even if I use a 1:1 resolution like 800x800 for the window size and the render target sizes, the results are the same:

[Image: problem2m.th.jpg, the same artifact at 800x800]

Any ideas of what's going on here?

Hi there. It may just be a power-of-two thing with one or all of your textures; go and check whether all the images you use are power-of-2 sized.


:O Whoa!

That's it! :D

How can the textures being power-of-2 or not affect things like this!? I know that po2 textures are faster and more optimal to use than the rest, but why do other resolutions generate such a problem? I would never have figured it out.

THANKS A LOT to both of you. :)

Besides being happy to know what happened, I now have a derived problem.
If I'm forced to use power-of-2 textures, I should use 512x512, 1024x1024, 2048x2048.
(EDIT: I've seen that I can also use combinations like 2048x256.)

512 or 1024 is too small, in terms of quality, to store a resulting image that has to be shown in a 1280x720 window. :(
So 2048 is the better choice, but it slows the frame rate down significantly.
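Since non-square combinations work (per the EDIT above), the target only needs to be rounded up per axis: 1280x720 fits in 2048x1024, which costs half the memory of a square 2048x2048. A sketch of creating such a target (variable names are mine, not from the thread):

// Round each dimension up to the next power of two separately.
UINT rtWidth  = 2048; // next pow2 >= 1280
UINT rtHeight = 1024; // next pow2 >= 720

LPDIRECT3DTEXTURE9 rtTexture = NULL;
HRESULT hr = d3ddev->CreateTexture( rtWidth, rtHeight, 1, D3DUSAGE_RENDERTARGET,
                                    D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rtTexture, NULL );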

Also, using any resolution smaller than the window resolution presents this issue:

[Image: problem3.th.jpg, the quad not filling the window]


The window is not fully filled by the textured quad for render target resolutions smaller than the window resolution.
Shouldn't the whole quad be filled, since I'm using 0.0 to 1.0 UV coords?
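For what it's worth, with 0.0 to 1.0 UVs the quad is filled with the whole render target, including the unwritten border, which is exactly the empty band in the screenshot. If the 1280x720 scene is rendered into the top-left of a 2048x1024 power-of-two target, the UVs have to be scaled down to the sub-rectangle that was actually written. A sketch against the vtxs array from the quad code above (uMax and vMax are my names):

// Sample only the 1280x720 region actually rendered into the 2048x1024 RT.
float uMax = 1280.0f / 2048.0f; // fraction of the RT width in use
float vMax =  720.0f / 1024.0f; // fraction of the RT height in use

vtxs[0].u = 0.0f; vtxs[0].v = vMax; // bottom-left of the screen
vtxs[1].u = 0.0f; vtxs[1].v = 0.0f; // top-left
vtxs[2].u = uMax; vtxs[2].v = vMax; // bottom-right
vtxs[3].u = uMax; vtxs[3].v = 0.0f; // top-right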

Here is an image of the diagonal artifact on the ATI card:


[Image: problem1b.th.jpg, the diagonal artifact on the ATI card]



Any ideas of what's going on here?


Another possibility (even though you have a fix above):

This diagonal pattern is a hallmark of a shader referring to one of its texture samplers that aliases the same memory you are rendering into. You should make a copy of the render-target texture and use that copy as the texture here; it should fix your problem. This is also known as 'doing a resolve', which can mean a few things. Primarily, a resolve is the act of moving a copy of a render target into a texture, which on the Xbox 360 is required since render targets are not addressable as textures (EDRAM and all that); a resolve can also mean 'take this MSAA buffer and make it a normal non-AA texture now'.

Fetching from and rendering to the same surface can work in special cases, but what happens when it breaks is hardware-specific. Primarily, if the fetch is not a perfect 1:1 match of the write back out, it manifests as the diagonal tear you have in your screenshot (on NVIDIA hardware, IIRC).
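For reference, a minimal way to do that copy in D3D9 (a sketch; rtTexture and copyTexture are assumed names for the live target and for a second texture created with the same size, format, and D3DUSAGE_RENDERTARGET):

// Copy the render target into a second texture, then sample the copy so
// the shader never reads from the surface it is writing to.
LPDIRECT3DSURFACE9 src = NULL;
LPDIRECT3DSURFACE9 dst = NULL;
rtTexture->GetSurfaceLevel( 0, &src );
copyTexture->GetSurfaceLevel( 0, &dst );

d3ddev->StretchRect( src, NULL, dst, NULL, D3DTEXF_NONE );

src->Release( );
dst->Release( );

drawFullScreenQuad( copyTexture ); // sample the copy, not the live target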
http://www.gearboxsoftware.com/

This topic is closed to new replies.
