Direct3D9 & Voodoo Banshee - Transparency

Started by Tszafran, 4 comments, last by Tszafran 21 years ago
Hi, I am working on a 2D engine project with another person. It works fine on my system (creates the window and displays the FPS). On his system it also creates the window and displays the FPS, but the black behind the letters isn't transparent like it is on mine. He has a Voodoo Banshee AGP; does anyone know how to solve this? I am using Direct3D9 and D3DXSprite. This is the function I use to load the texture.
  

	D3DXCreateTextureFromFileEx(m_D3dDevice,
	                            Filename,
	                            256,                    // width
	                            256,                    // height
	                            D3DX_DEFAULT,           // mip levels (full chain)
	                            D3DUSAGE_RENDERTARGET,  // usage
	                            D3DFMT_UNKNOWN,         // format: let D3DX choose
	                            D3DPOOL_DEFAULT,        // memory pool
	                            D3DX_DEFAULT,           // filter
	                            D3DX_DEFAULT,           // mip filter
	                            ColorKey,               // color key
	                            NULL,                   // source image info (unused)
	                            NULL,                   // palette (unused)
	                            &Sprite->m_Texture);    // out: created texture

  
NOTE: ColorKey is a D3DCOLOR and is set using D3DCOLOR_XRGB(0,0,0). Thanks, any help would be greatly appreciated.
MSN Messenger: Thomas_Szafran@hotmail.com
Two possibilities.

As far as I know, a color key of 0 means you don't want a color key. Use 0xFF000000, or D3DCOLOR_ARGB(255,0,0,0), to use black as the key.
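
For illustration, a minimal sketch of the corrected key (nothing here beyond the macros from d3d9types.h and the values named above):

	// 0xFF000000 and D3DCOLOR_ARGB(255,0,0,0) are the same value: opaque black.
	// A key of plain 0 has a zero alpha byte, which D3DX treats as "no color key".
	D3DCOLOR ColorKey = D3DCOLOR_ARGB(255, 0, 0, 0);   // == 0xFF000000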

You aren't specifying a format. MS doesn't really go into detail on how they choose which format to load the data in. If the file is 24-bit it probably uses 32-bit X8R8G8B8. If 16-bit, it probably uses R5G6B5 or X1R5G5B5. If the file has alpha it probably uses A8R8G8B8, A1R5G5B5, or A4R4G4B4. Does it force alpha to exist when using a color key? Probably, but I guess we can't really say for sure. What if your card doesn't support 16-bit with alpha (only R5G6B5), and it's a 16-bit image? Does it promote the image to 32 bits to get alpha? Who knows.

You should test which formats are available on a given card and load into the most appropriate one, as in the sketch below. Then again, it's probably not this at all, and it's the 0 color key, which makes it surprising that it works on your machine.
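
A hedged sketch of that kind of probe, assuming the engine keeps the IDirect3D9 object and the current display-mode format around (pD3D and DisplayFmt are hypothetical names, not taken from the original post):

	// Probe the driver for a texture format with alpha before loading,
	// falling back through progressively smaller alpha formats.
	D3DFORMAT PickAlphaFormat(IDirect3D9 *pD3D, D3DFORMAT DisplayFmt)
	{
	    const D3DFORMAT Candidates[] = { D3DFMT_A8R8G8B8, D3DFMT_A4R4G4B4, D3DFMT_A1R5G5B5 };
	    for (int i = 0; i < 3; ++i)
	    {
	        if (SUCCEEDED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
	                                              DisplayFmt, 0, D3DRTYPE_TEXTURE,
	                                              Candidates[i])))
	            return Candidates[i];
	    }
	    return D3DFMT_UNKNOWN; // no alpha-capable texture format found
	}

The result could then be passed as the Format argument to D3DXCreateTextureFromFileEx instead of D3DFMT_UNKNOWN.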
Just checked... D3DCOLOR_XRGB does in fact produce a color of 255,0,0,0 (the alpha byte is forced to 255).

Check the surface description of the texture you get back to see what format it's using, and ensure it does have alpha support.
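
A minimal sketch of that check, reusing Sprite->m_Texture from the original post:

	// Ask the texture what format D3DX actually gave it.
	D3DSURFACE_DESC desc;
	Sprite->m_Texture->GetLevelDesc(0, &desc);
	// desc.Format needs an alpha channel (e.g. D3DFMT_A8R8G8B8) for the
	// color key to produce transparent pixels; an X8R8G8B8 or R5G6B5
	// format has no alpha bits for the key to write into.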

Also, texture stage arguments are picky on some cards. Maybe you're trying to use the texture as arg2, and lots of early cards only support it as arg1.
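
A sketch of the conservative stage-0 setup this is alluding to, using the standard IDirect3DDevice9::SetTextureStageState calls (m_D3dDevice is the device from the loading code above):

	// Keep the texture in argument 1 and the diffuse color in argument 2;
	// some early cards of the Banshee era only accept the texture as arg1.
	m_D3dDevice->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
	m_D3dDevice->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);
	m_D3dDevice->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
	m_D3dDevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
	m_D3dDevice->SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_DIFFUSE);
	m_D3dDevice->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_MODULATE);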
The file is a 256x256, 32-bit bitmap file.

Also, you mentioned something about texture stage arguments being picky on some cards? Can you explain that a little more? I don't really understand.

Dunno if this helps, but it's running in windowed mode, so the backbuffer format should be set to the current display mode, correct? Would that make the textures the same format?
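
(For what it's worth, a hedged sketch of how the two formats could be compared, using only stock IDirect3DDevice9 calls on the m_D3dDevice from the loading code:)

	// Query the backbuffer's format; in windowed mode it generally matches
	// the desktop mode. The texture's format is chosen separately by D3DX,
	// so the two are not guaranteed to be the same.
	IDirect3DSurface9 *pBack = NULL;
	if (SUCCEEDED(m_D3dDevice->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &pBack)))
	{
	    D3DSURFACE_DESC bbDesc;
	    pBack->GetDesc(&bbDesc);
	    pBack->Release();
	    // Compare bbDesc.Format with the texture's GetLevelDesc() format.
	}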

My draw code.


	/* Draw sprite. */
	m_SpriteRenderer->Draw(Sprite->m_Texture,
	                       &Sprite->m_Rect[Frame],
	                       &Sprite->m_Scaling,
	                       &Sprite->m_RotCenter,
	                       Sprite->m_Rotation,
	                       &Sprite->m_Translation,
	                       Sprite->m_ModulateColor);


NOTE: ModulateColor is a D3DCOLOR and is set using D3DCOLOR_XRGB(255,255,255).

[edited by - Tszafran on March 30, 2003 3:59:42 PM]
MSN Messenger: Thomas_Szafran@hotmail.com
Bump. I am sorry I had to do that (the bump), but this is very important; I might not be able to work on a team with my partner if we can't sort this out.

Any ideas?
MSN Messenger: Thomas_Szafran@hotmail.com
The best solution would be to have your friend get a GeForce 2 or better card. A Voodoo Banshee is just a tad bit dated. One can be picked up for about $30. Take a lookie here.
Manufacturing metaphors in my melancholy mind.
