Direct3D in 2D Texture Problem

Started by
4 comments, last by Normie 22 years, 9 months ago
Ok, I was following the 2D rendering in D3D article on this site. After a little tweaking of MSVC (and eventually getting Visual Assist... gotta love them tooltips!), I got the whole thing to work, all the way to the end of the article. There's just one problem: for the texture, I used a monochrome bitmap with a white background and the name Josh in cursive in black (lack of inspiration?). The bitmap renders on the panel with a red background and black lettering. I'm running in 16-bit, so there shouldn't be a palette problem... any suggestions?

[EDIT] Oh yeah, here's the snippet of code where the "sprite" and texture are loaded; this is the only place I can see a problem potentially cropping up (it's basically the PostInit() from the article):

void PostInit(float fWindowWidth, float fWindowHeight)
{
    D3DXMATRIX Ortho2D;
    D3DXMATRIX Identity;
    PANELVERTEX* pVertices = NULL;

    D3DXMatrixOrthoLH(&Ortho2D, fWindowWidth, fWindowHeight, 0, 1);
    D3DXMatrixIdentity(&Identity);
    g_lpD3DDevice->SetTransform(D3DTS_PROJECTION, &Ortho2D);
    g_lpD3DDevice->SetTransform(D3DTS_WORLD, &Identity);
    g_lpD3DDevice->SetTransform(D3DTS_VIEW, &Identity);

    float PanelWidth = .5, PanelHeight = .5;

    g_lpD3DDevice->CreateVertexBuffer(4 * sizeof(PANELVERTEX), D3DUSAGE_WRITEONLY,
        D3DFVF_PANELVERTEX, D3DPOOL_MANAGED, &g_lpD3DVertexBuffer);
    g_lpD3DVertexBuffer->Lock(NULL, 4 * sizeof(PANELVERTEX), (byte**)&pVertices,
        D3DLOCK_NOSYSLOCK | D3DLOCK_DISCARD);

    pVertices[0].color = pVertices[1].color = pVertices[2].color = pVertices[3].color = D3DCOLOR_XRGB(255, 0, 0);

    pVertices[0].x = pVertices[3].x = -PanelWidth / 2.0f;
    pVertices[1].x = pVertices[2].x = PanelWidth / 2.0f;
    pVertices[0].y = pVertices[1].y = PanelHeight / 2.0f;
    pVertices[2].y = pVertices[3].y = -PanelHeight / 2.0f;
    pVertices[0].z = pVertices[1].z = pVertices[2].z = pVertices[3].z = 1.0f;

    pVertices[1].u = pVertices[2].u = 1.0f;
    pVertices[0].u = pVertices[3].u = 0.0f;
    pVertices[0].v = pVertices[1].v = 0.0f;
    pVertices[2].v = pVertices[3].v = 1.0f;

    g_lpD3DVertexBuffer->Unlock();

    g_lpD3DDevice->SetRenderState(D3DRS_LIGHTING, false);

    D3DXCreateTextureFromFileEx(g_lpD3DDevice, "josh.bmp", NULL, NULL, NULL, NULL,
        D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, D3DX_DEFAULT, D3DX_DEFAULT,
        NULL, NULL, NULL, &g_lpD3DTexture);

    g_lpD3DDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    g_lpD3DDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    g_lpD3DDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
    g_lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_MODULATE);
}

-Normie

"But time flows like a river... and history repeats."
...
So... what about beaver dams?

Edited by - normie on July 24, 2001 2:18:02 PM
Edited by - normie on July 24, 2001 2:20:29 PM
I am a devout follower of the "Lazy Programmer's Doctrime"(tm)... and I'm damned proud of it, too! ----- "I came, I saw, I started making games." ... If you'll excuse me, I must resume my search for my long lost lobotomy stitches.
You're setting the color of each vertex to 255,0,0 (red), so when the quad is rendered the texture is modulated with a red tint. Try changing

pVertices[0].color = pVertices[1].color = pVertices[2].color = pVertices[3].color = D3DCOLOR_XRGB(255, 0, 0);

to

pVertices[0].color = pVertices[1].color = pVertices[2].color = pVertices[3].color = D3DCOLOR_XRGB(255, 255, 255);

good luck...

p.s. 255,255,255, if you didn't already know, is white... so the texture will be modulated with a white tint, which leaves it looking normal.

hey, that's great, thanks for the help... one question, though: why doesn't it just take the white background from the bmp when applying the texture? Is there a way to do that? Or, perhaps, is something wrong with the bmp? (Just so you know, I made it with plain old Windows Paint... 64x64, nothing changed except the black lettering.)

-Normie

"But time flows like a river... and history repeats."
...
So...what about beaver dams?
Ok, you need to set up a colorkey for the texture (well, technically it's not a colorkey, it's an alpha channel, but it's called a colorkey when you create the texture, oops - back to the point)... anyway, when you create the texture you can specify a color value to be made transparent. Try this:

D3DXCreateTextureFromFileEx(g_lpD3DDevice,
"josh.bmp",
NULL,
NULL,
NULL,
NULL,
D3DFMT_A8R8G8B8,
D3DPOOL_MANAGED,
D3DX_DEFAULT,
D3DX_DEFAULT,
0xFFFFFFFF, //colorkey value
NULL,
NULL,
&g_lpD3DTexture);

umm, short explanation: the colorkey value is in the form 0xAARRGGBB (aa = alpha, rr = red, gg = green, bb = blue). Any pixel in the source image matching that color is replaced with transparent black when the texture is loaded; the alpha byte should normally be FF so that it matches fully opaque pixels. So 0xFFFFFFFF makes all white areas of the image fully clear, and 0xFF00FF00 would make all pure green areas fully clear. (Note that keyed pixels always become fully transparent - a key like 0x80FFFFFF only changes which pixels match, it doesn't give you semi-transparency.)

anyway, good luck...
oh, one other thing - in my code (which I based on the same tutorial on this site, btw) I've commented out the line:

lpD3DDevice->SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_BLENDTEXTUREALPHA);

hehe, but I can't remember if it had anything to do with the transparency or not (I'm not very good with D3D, far better with OGL)... anyway, if it doesn't work with that line in, try commenting it out.

again, good luck...
well, damn, that makes a lot of sense...
A few more tests and experiments and I can work on programming some wrappers of my own... or maybe I'll just fall back on ID3DXSprite.

thanks for the help :D

-Normie

This topic is closed to new replies.
