Texture gets corrupted after rendering (filtering?)

Fald    122
Hello, I've got a problem with the rendering of my GUI textures.

This is what it should look like: http://217.208.149.163/errortest.dds
This is what it looks like (notice the two top corners): http://217.208.149.163/ingame.jpg

The texture is 256x256 pixels, saved as DXT5. I've tried a lot of things to get the image to appear as it should: turning off filtering, setting filtering to POINT, subtracting 0.5f from the vertex coordinates, adding 0.5f, removing/adding the texel size (and half the texel size) to the UV coordinates... just about everything I could think of.

Here are the vertex and UV coordinates:
	// First triangle of the 256x256 quad (x, y, z, u, v).
	AttributeData.push_back(TEXTURE->LoadTexture("errortest.dds"));
	VertexData.push_back(CV(0.0f, 0.0f, 0.0f, 0.0f, 0.0f));
	VertexData.push_back(CV(256.0f, 0.0f, 0.0f, 1.0f, 0.0f));
	VertexData.push_back(CV(0.0f, -256.0f, 0.0f, 0.0f, 1.0f));

	// Second triangle of the quad.
	AttributeData.push_back(TEXTURE->LoadTexture("errortest.dds"));
	VertexData.push_back(CV(256.0f, 0.0f, 0.0f, 1.0f, 0.0f));
	VertexData.push_back(CV(256.0f, -256.0f, 0.0f, 1.0f, 1.0f));
	VertexData.push_back(CV(0.0f, -256.0f, 0.0f, 0.0f, 1.0f));
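(For reference, the CV constructor above presumably wraps a position-plus-UV vertex; the declaration below is an assumed sketch, not the actual type from the project.)

	// Assumed layout of the CV vertex type: untransformed position plus
	// one set of texture coordinates, with a matching FVF.
	struct CV
	{
		CV(float fX, float fY, float fZ, float fU, float fV)
			: x(fX), y(fY), z(fZ), u(fU), v(fV) {}
		float x, y, z;   // untransformed position
		float u, v;      // texture coordinates
	};
	const DWORD CV_FVF = D3DFVF_XYZ | D3DFVF_TEX1;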
Those vertices are then put into an LPD3DXMESH. Note: the coordinates are untransformed; I render them with these matrices:
	// Projection matrix for orthographic view
	D3DXMatrixOrthoLH(&dxOrtMatrix, fScreenWidth, fScreenHeight, -10.0f, 10.0f);

	// View matrix for orthographic view
	D3DXMatrixLookAtLH(&dxOrtCamera, &D3DXVECTOR3(0.0f, 0.0f, -5.0f),
	                   &D3DXVECTOR3(0.0f, 0.0f, 0.0f),
	                   &D3DXVECTOR3(0.0f, 1.0f, 0.0f));
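(For context, these matrices would typically be applied with SetTransform before the mesh is drawn; a minimal sketch, assuming the underlying IDirect3DDevice9 is reachable as pDevice — that name is not from the original code.)

	// Apply the orthographic matrices for the GUI pass.
	D3DXMATRIX dxIdentity;
	D3DXMatrixIdentity(&dxIdentity);
	pDevice->SetTransform(D3DTS_WORLD, &dxIdentity);        // quad vertices are already in screen-sized units
	pDevice->SetTransform(D3DTS_VIEW, &dxOrtCamera);
	pDevice->SetTransform(D3DTS_PROJECTION, &dxOrtMatrix);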
This is the code I use for rendering (note: I removed the texture filter code since it didn't do any good):
	// Turn off lighting
	STATE->SetRenderState(D3DRS_LIGHTING, FALSE);

	// Set up alpha blending
	STATE->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
	STATE->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
	STATE->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

	// Render the GUI.

	if(pdxMesh != NULL)
	{
		for( DWORD i = 0; i != dwNumSubsets; i++ )
		{
			TEXTURE->SetTexture(iSubset[i]);
			pdxMesh->DrawSubset(iSubset[i]);
		} 
	}

	// Turn lighting back on
	STATE->SetRenderState(D3DRS_LIGHTING, TRUE);
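(For reference, the removed filter code would usually amount to something like the following on a D3D9 device; this is only a sketch of typical point-filtering and clamping states, assuming the device is reachable as pDevice, and not the poster's original code.)

	// Point sampling, no mip filtering, and clamped addressing for the GUI stage.
	pDevice->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
	pDevice->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
	pDevice->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);
	pDevice->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_CLAMP);
	pDevice->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_CLAMP);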
Any ideas?

EDIT: Fixed source tags.

[edited by - Fald on January 23, 2004 11:13:59 AM]

Fidelio66    164
Isn't it device->SetRenderState and device->SetTexture?

Maybe STATE and TEXTURE are #defines for your device, but this way you are obscuring what is really happening.

Fald    122
Oh, sorry... those are my state and texture managers.

They are 100% working.

The state manager checks for redundant render states. The texture manager loads files and sends them to the card when/if they are needed; it also makes sure I don't have two copies of the same texture in memory.
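(For reference, the duplicate check such a texture manager performs usually boils down to a per-filename cache; a minimal sketch under that assumption — the class and member names below are illustrative, not the poster's actual manager.)

	// Illustrative texture cache: one IDirect3DTexture9* per file name,
	// created only on the first request and reused afterwards.
	#include <d3dx9.h>
	#include <map>
	#include <string>

	class TextureCache
	{
	public:
		explicit TextureCache(LPDIRECT3DDEVICE9 pDevice) : m_pDevice(pDevice) {}

		LPDIRECT3DTEXTURE9 LoadTexture(const std::string& sFile)
		{
			// Reuse the cached copy if this file was loaded before.
			std::map<std::string, LPDIRECT3DTEXTURE9>::iterator it = m_Textures.find(sFile);
			if (it != m_Textures.end())
				return it->second;

			LPDIRECT3DTEXTURE9 pTexture = NULL;
			if (FAILED(D3DXCreateTextureFromFile(m_pDevice, sFile.c_str(), &pTexture)))
				return NULL;

			m_Textures[sFile] = pTexture;
			return pTexture;
		}

	private:
		LPDIRECT3DDEVICE9 m_pDevice;
		std::map<std::string, LPDIRECT3DTEXTURE9> m_Textures;
	};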

[edited by - Fald on January 23, 2004 10:48:07 AM]

Fald    122
It looks like I've managed to solve my problem... but the solution confuses me a lot, so I'd like an explanation.

It seems to be a problem with my choice of texture format...

Using A8R8G8B8 fixed it, and A4R4G4B4 works too, so I did some more testing.

All of the DXT* formats produce artifacts. Does anyone know why?

don    431
It's a lossy format. DXT5 uses interpolated alpha values from a look-up table. You might try DXT2, as it uses 4-bit alpha that isn't interpolated. Since you say that A4R4G4B4 works, DXT2 will probably work OK. The colors will still come from a table of interpolated values (lossy), but you don't seem to be having a problem with the color data in the images you've posted.

The formats are described in detail in the DX SDK docs. Look for the section titled "Compressed Texture Resources".
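(If the artifacts come from the DXT encoding itself, one option is to keep the .dds on disk but expand it to an uncompressed format at load time; a minimal sketch using D3DXCreateTextureFromFileEx, assuming a D3D9 device pointer named pDevice — this is not code from the thread.)

	// Decode the DXT5 .dds to uncompressed 32-bit ARGB when it is loaded.
	LPDIRECT3DTEXTURE9 pTexture = NULL;
	HRESULT hr = D3DXCreateTextureFromFileEx(
		pDevice, "errortest.dds",
		D3DX_DEFAULT, D3DX_DEFAULT,   // width / height taken from the file
		D3DX_DEFAULT,                 // full mip chain
		0,                            // usage
		D3DFMT_A8R8G8B8,              // force uncompressed ARGB instead of DXT5
		D3DPOOL_MANAGED,
		D3DX_DEFAULT, D3DX_DEFAULT,   // filter / mip filter
		0,                            // no color key
		NULL, NULL,
		&pTexture);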
