Archived

This topic is now archived and is closed to further replies.

Kylotan

D3D7 Texturing woes


Sorry for the undescriptive subject; I'm at a loss for how to describe some of the issues I've come across. And yes, I know I'm now 2 whole versions of DirectX behind. I want to finish this project rather than keep restarting when a new API comes out. Not to mention that I want this to run on Cyrix processors, which DX8 doesn't support, AFAIK. Anyway...

Problem one: these surface creation caps for my textures work:

    dwCaps = DDSCAPS_TEXTURE | DDSCAPS_WRITEONLY;
    dwCaps2 = DDSCAPS2_D3DTEXTUREMANAGE | DDSCAPS2_HINTSTATIC;
    dwCaps3 = dwCaps4 = 0;

yet this does not:

    dwCaps = DDSCAPS_TEXTURE;
    dwCaps2 = dwCaps3 = dwCaps4 = 0;

The textures get corrupted, either appearing as solid colours or as typical garbage memory. Why? All the other flags should just be hints, I believe, so why should removing them break anything? Surely the pixel format is the only thing that should determine whether the textures are ok or not. It actually shows bits of the wrong texture when I use the second method, as if the addressing gets messed up.

Problem two: what could cause Direct3D to fail to correctly texture a primitive if textures are used in a certain order? If the first texture I use in a frame is texture X, then all other primitives (except those also using texture X) are rendered as solid blocks of colour, as if all the texture coordinates were equal to 0 (they're not). However, if the first texture is not texture X, then all primitives are rendered perfectly, including those using texture X. It just doesn't like that texture going first in a frame. All primitives are rendered with exactly the same calls; the only difference is what is passed to SetTexture, which is never NULL and never returns a failure value. Nor does DrawPrimitive. All texture surfaces are created the same way, with the same flags (i.e. the working ones above). Testing on a second PC returns DDERR_INVALIDPIXELFORMAT from DrawPrimitive, despite it having selected the same pixel format (32-bit ARGB) on that PC too.
I've been working on this for hours now and I'm lost as to what could be causing this.

[ MSVC Fixes | STL | SDL | Game AI | Sockets | C++ Faq Lite | Boost | Asking Questions | Organising code files | My stuff ]

It's been AAAAAAAAAAAAGGGGESSSSS since I've done any 3D DX7, so I'm a tad rusty and have been trying to forget some of the nastiness. I'm trying to jog my memory with a quick look at the code and comments (I knew they'd come in handy for something) in the Pac-Man: Adventures in Time engine from 2000.

1) Any textures/surfaces/buffers which are NOT managed should be created BEFORE anything which is managed, or you can end up with a whole lot of pain (I won't go into details, but think along the lines of how the texture manager tracks how much available VRAM it has to play with...)

2) If mipmapping, specify DDSCAPS_COMPLEX

3) DON'T make your own DDPIXELFORMAT structures; only use what comes from EnumTextureFormats.

4) All of Pac-Man:AIT uses the OPAQUE hint regardless of texture type to let the driver swizzle. Middle-aged nVidia chips only texture from swizzled surfaces, for example, though I think the driver should sort out problems with non-swizzled stuff by making a copy etc...

5) Obvious thing A: The texture created in the second method IS in video memory, right? (Or AGP, assuming the chip/driver can texture out of AGP.)

6) Obvious thing B: And how are you filling the texture? When you lock the managed texture, you'll be locking a system memory copy, so the pitch on the lock is always likely to be 0. On the video memory texture there's a big chance the pitch will not be 0, so any plain memcpy() type texture fill not behaving properly could cause exactly what you describe.
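To make the pitch point in (6) concrete, here is a sketch of the row-by-row fill idea without any DirectX headers. CopyWithPitch and its parameters are made-up names for illustration; the principle (advance the destination pointer by the lock's pitch rather than by width * bytes-per-pixel) is exactly what a single whole-image memcpy gets wrong when the locked surface has padded rows.

```cpp
#include <cstdint>
#include <cstring>

// Copy a tightly-packed 32-bit source image into a locked surface whose
// rows are 'pitch' bytes apart. pitch may exceed width * 4 because of
// padding; a single memcpy of the whole image ignores that padding and
// shears/corrupts the texture.
void CopyWithPitch(const uint32_t* src, int width, int height,
                   uint8_t* dest, long pitch)
{
    for (int y = 0; y < height; ++y)
    {
        // Rows are contiguous in the source, but pitch bytes apart
        // in the destination surface memory.
        std::memcpy(dest + y * pitch,
                    src + y * width,
                    width * sizeof(uint32_t));
    }
}
```

A managed (system memory) lock often has pitch == width * 4, so a naive copy happens to work there and only breaks on a video memory surface with a wider pitch.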

7) So the second problem you describe is absolutely only related to the texture (i.e. no other renderstates, stateblocks, SetTransform calls etc. involved)? It definitely sounds like something is unhappy with a texture being used. In DX7, if you ever get a failure which isn't documented as being able to come from Draw*Primitive(), it almost always hints at something which was set up as a device state. I've seen similar mad return codes from Draw*Primitive() back on PM:AIT when the artists used non-square or non-power-of-2 textures on certain drivers. Definitely check the caps and the texture dimensions. Same goes for texture coordinate repeat counts, mip map levels, filter types etc...

8) RefRast? Debug Runtime? - do they tell you anything?

--
Simon O''Connor
Creative Asylum Ltd
www.creative-asylum.com

Thanks again for the help, Simon. I wish I had either your knowledge or patience, never mind both.

1) I'm trying to eliminate grief by creating all texture surfaces in the same way. They're all managed (as seen in the top snippet) and are created after the front/back buffers, etc. Once the basic initialisation is done, all surfaces are created equally. To be honest, the first 'problem' is not really a problem for me, just something that doesn't appear to be right, and which I thought might hint at some deeper issue that I need to address.

2) No mipmaps just yet. I figure I should get it working right before I get it looking good.

3) My pixel format structures are formed from enumeration by looking for a format that is 32 bits, has DDPF_ALPHAPIXELS set, and has a non-zero alpha bit mask. In practice this always yields just one format, which is A8_R8_G8_B8. I just copy that format, and then copy it back to DDSURFACEDESC2.ddpfPixelFormat when creating new surfaces for textures. As stated above, this method works fine as long as I use the top set of surface creation caps but not the second set (which I copied from the MTexture example in the SDK, which runs perfectly).
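That selection rule is easy to isolate and sanity-check outside DirectX. The struct below is a hypothetical stand-in for the handful of DDPIXELFORMAT fields the check uses (the real structure and the DDPF_ALPHAPIXELS flag come from ddraw.h); only the predicate logic is the point here.

```cpp
#include <cstdint>

// Stand-in for the DDPIXELFORMAT fields this check looks at.
// The real struct comes from ddraw.h via EnumTextureFormats.
struct PixelFormatInfo {
    uint32_t dwFlags;           // e.g. DDPF_ALPHAPIXELS | DDPF_RGB
    uint32_t dwRGBBitCount;     // total bits per pixel
    uint32_t dwRGBAlphaBitMask; // which bits hold alpha
};

const uint32_t DDPF_ALPHAPIXELS = 0x00000001; // value as in ddraw.h

// The rule described above: 32 bits, DDPF_ALPHAPIXELS set, and a
// non-zero alpha bit mask. A8R8G8B8 matches; X8R8G8B8 and 16-bit
// formats do not.
bool IsUsableAlphaFormat(const PixelFormatInfo& pf)
{
    return pf.dwRGBBitCount == 32
        && (pf.dwFlags & DDPF_ALPHAPIXELS) != 0
        && pf.dwRGBAlphaBitMask != 0;
}
```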

4) The opaque hint says in the docs that it "indicates to the driver that this surface will never be locked again." In that case, how do I fill it with the texture data? I need to lock each texture just once, to fill it, and then never again.

5) No idea - I figured that it would use whatever is appropriate, just not necessarily the most efficient way. I copied the capabilities flags from an SDK example that works, so I assumed it would be fine. See my answer to (1) though.

6) If lPitch can be zero, then that means my docs are wrong... it implies that pitch must be at least equal to width (multiplied by bytes per pixel) and sometimes more, depending on padding. I fill a pixel at a time, and move down by an amount equal to lPitch for each row, as suggested in the docs. It seems to work for everything.

7) My program is quite simple - it's just a basic tile engine using D3D to draw the sprites. So everything is just a 2-triangle strip drawn with DrawPrimitive. For each tile, it sets up the alpha blending stuff with SetTextureStageState (this is constant across all textures), it calls SetTexture for stage 0 with the relevant DirectDraw surface, then it calls DrawPrimitive with a 4-vertex triangle strip. The only thing that varies is the surface passed to SetTexture. And I know the surfaces are ok, because they render just fine if sent in the 'acceptable' order. My test program alternates between the right order and the wrong order each frame, to prove that the textures themselves are not corrupted.

DrawPrimitive doesn't set off the FAILED() macro, so I assume it's returning ok, rather than returning something unknown. Which caps should I be checking for? I'm using a GeForce2 MX here, so I didn't think the old power-of-two restrictions and the like would be an issue. But, for what it's worth, the texture that always works is 128x128, and the other textures (which only work as long as the previous one isn't rendered first) are more irregular. Hmm... maybe I should make it 129x129 and see if it breaks too.

8) The reference rasterizer seems even worse. It seems to draw everything wrongly the first time through, and by the 3rd or 4th frame it only ever renders one of the primitives. No failures from any of the calls, of course. And I must not have the debug runtime installed, since the debug slider is greyed out in the control panel.


1 & 5) Ah yup, I just remembered the DDSCAPS2_D3DTEXTUREMANAGE hint isn't the way you specify managed versus non-managed textures; I forgot that when I posted the above (I was reading it as the difference between managed and non-managed).
As an experiment, try copying just the DDSCAPS2_D3DTEXTUREMANAGE flag to the second method. That lets the driver take over resource management from D3D.

3) That should be ok, assuming you're taking a DEEP copy of the DDPIXELFORMAT structure rather than just a pointer (I still suspect something is going on concerning pixel formats).

4) The docs are a tad misleading: you can lock and unlock it ONCE to fill in the data. It really just lets the driver know that it's ok to take a long time to return from Lock/Unlock calls and to put the texture in the slowest place for the CPU if that makes the 3D faster. That allows the driver to do things like swizzle and compress the surface, as well as putting it in true video memory (sometimes drivers ignore the memory types you specify if they see you doing bad things).

6) That's what I get for posting that early in the morning after drinking - what I meant is that the pitch might be wider than the surface, and that could change with different flags.

quote:
"DrawPrimitive doesn't set off the FAILED() macro, so I assume it's returning ok, rather than returning something unknown"


I was actually referring to what you said about testing problem two on a second PC and DrawPrimitive returning DDERR_INVALIDPIXELFORMAT, which is interesting: it looks like the other PC doesn't have the debug runtime installed, so that could also be the problem on the first PC (but the retail runtimes won't report many things).


The fact that REFRAST is going mad too definitely indicates some kind of issue in your code rather than, say, a bad driver.

First plan of action IMO is to get the debug runtime working on your main PC and get REFRAST to tell you what it can.




--
Simon O''Connor
Creative Asylum Ltd
www.creative-asylum.com

Any hints on how to get the DX7 debug runtimes on here without potentially damaging my DX8 runtimes? Should I just install the DX8 SDK and keep the DX7 docs handy?

Incidentally, changing to DDSCAPS2_OPAQUE, or switching between DDSCAPS2_TEXTUREMANAGE and DDSCAPS2_D3DTEXTUREMANAGE, didn't appear to do much. Basically, the corruption seems to appear any time I don't specify one of those two texture management flags.

And I can't see how it could be a problem with the pixel format when the textures render perfectly as long as they're not used in the non-optimal order. But, who knows. I'm definitely taking deep copies of the pixel format structure (via memcpy), and it should be ok.


When I downloaded the DX9 SDK and installed it on the DX8 (debug) runtime, the 'debug' option was disabled under the control panel DX settings. I then downloaded the DX9 debug runtime, installed it, and got the 'debug' option working again. MS provides the DX7a SDK as a download, so perhaps the whole package includes the debug runtime as well. I would recommend you get the debug runtime and crank the lever waaaayyyyy up.

That texture loading problem you described is bizarre; I haven't encountered it myself. Double check your render states and texture stage setup. Disable all texture stages you're not using. I'm not sure, but I think MS states that some drivers don't like a texture to be set in the first stage (Voodoo?); maybe I'm dreaming or something. Check the return value from all DX function calls if you're not doing so already. Sprinkle your code with asserts too. Trace your rendering function through a debugger and notice whether any handles/pointers are not being set to null. Good luck!

[edited by - JD on December 27, 2002 4:30:27 AM]

Ok... what exactly is this debug runtime supposed to do? I (think) I got it installed, made sure the debug slider was up to full, ran my program in windowed mode under the debugger, and... the debug output is identical to when I was using the release libraries. I get nothing from the reference rasteriser either, except DDERR_NODIRECTDRAWHW errors from DrawPrimitive. *sigh* (EDIT: I'm assuming that reinstalling the DX7 SDK has screwed up my system at this point, since enumeration no longer returns the HAL or TnL devices. Grr.)

My render states and texture options are trivial (although probably not correct). Here's all my settings, in rough order:


SetRenderState(D3DRENDERSTATE_LIGHTING, FALSE);
SetRenderState(D3DRENDERSTATE_ALPHABLENDENABLE, TRUE);
SetRenderState(D3DRENDERSTATE_SRCBLEND, D3DBLEND_SRCALPHA);
SetRenderState(D3DRENDERSTATE_DESTBLEND, D3DBLEND_INVSRCALPHA);

SetTextureStageState(0, D3DTSS_MAGFILTER, D3DTFG_POINT);
SetTextureStageState(0, D3DTSS_MINFILTER, D3DTFN_POINT);
SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_DIFFUSE);
SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_TEXTURE);
SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_MODULATE);


I'm really stuck for ideas now... I'm gonna keep experimenting with totally random things, but any suggestions are welcome, even if just on how to get some debugging information.


[edited by - Kylotan on December 31, 2002 1:32:14 PM]

How are you setting the color stage? Look through the IDirect3DDevice7::ValidateDevice() method for some info on compatibility with hw devices. I would also set the frame buffer alpha op; it should be add by default, but who knows? Make sure you have the latest drivers installed for your card. I can't find the info now, but I distinctly remember an nvidia sample code comment mentioning dx7 having screwed up alpha or some such. I think I've seen it in the slimdot3bump sample app. I looked through it just now but can't find the info again. Maybe someone remembers what that bug was about. The debug runtime will do some dx internal checks and will report errors to you. If it's silent, then perhaps dx7 has some internal bug. I've done some crazy blending in my 3d app, so I know the nvidia drivers can take the punishment; they're the gold standard.

Kylotan, I should have suggested this from the start, because that's what I do when things start going south. Rip out your rendering code, etc., build a test app out of it, then run it. This way, if it works, you eliminate the rendering algorithm as a possible problem. It's much easier to trace calls in a small test app, making sure things are set the way they're supposed to be, than to trace a big app with lots of calls unrelated to gfx.

Colour stages... direct from my code:

// makes no difference
device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_CURRENT/*D3DTA_DIFFUSE*/);


ValidateDevice() returns 1 for everything, as expected, and I only use texture stage 0.

I did some research recently and the latest drivers for my card are poor, so I installed one of the better and more reliable drivers from the past. It's interesting that people say Nvidia drivers are so good when, in fact, a lot of people have trouble with them. It makes me wonder just how bad the drivers from other companies are...

The debug runtime has never done anything for me. Is there a way I can force it to output something (e.g. a call combination that should never work) so that I can test whether it's actually working?

As for removing non-rendering code... there is virtually nothing in this program except rendering code. So much for writing a quick 2D game...


[edited by - Kylotan on January 1, 2003 5:29:29 PM]

Rewrite your color op to this:

D3DTSS_COLORARG1, D3DTA_TEXTURE
D3DTSS_COLOROP, D3DTOP_SELECTARG1

leave out COLORARG2.

Look through the renderstate flags in the docs for anything about setting a special state when doing vertex alpha, etc. If there is an error, the debug runtime will let you know. This bug is probably a logical error in your app, like how flags are being set and so on. I used nvidia's 12.41 driver for a year and a half with no problems. I just switched to the 41.09 driver, and again no problems. I guess I've been lucky picking correct drivers. Try them both; however, this blending op is so basic that I doubt the drivers are buggy. I would still create a small dx7 test app just to make sure the rendering logic and flags are set correctly; then, when the test app passes, compare it to the rendering code in your normal app.

Ignore the rewrite part. I just looked in the docs and D3DTA_CURRENT defaults to DIFFUSE. I thought you'd made a mistake, but you didn't. Sorry.

I see you're using a triangle strip as the draw primitive. If your tiles are quads with four vertices going in cw/ccw order, i.e. one after another around the perimeter, then the strip will not render properly. I just tried it. I use a triangle fan for non-indexed four-vertex quads, i.e. 2 triangles. Also, some time ago I derived MYCUSTOMVERTEX from the d3dx vector3 class, and I mistakenly made my derived class virtual. That put an extra 4 bytes for the virtual table pointer into my class. Suffice to say, my app didn't render properly, because I omitted those 4 bytes from my vertex stride calculation. Posting your complete drawing/init/texture loading/etc. code would help tremendously in figuring out what you're doing.
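The strip-ordering point can be demonstrated numerically without D3D. In this sketch, V2, SignedArea and StripQuadIsConsistent are made-up helper names: it checks whether the two triangles a 4-vertex strip generates share a winding. The order TL, TR, BL, BR does; the perimeter order TL, TR, BR, BL produces a bowtie.

```cpp
struct V2 { float x, y; };

// Twice-signed-area cross product; the sign gives the winding
// (positive = counter-clockwise in this convention).
float SignedArea(V2 a, V2 b, V2 c)
{
    return 0.5f * ((b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y));
}

// A 4-vertex strip produces triangles (v0,v1,v2) and (v2,v1,v3):
// the rasterizer flips every other triangle so winding stays
// consistent. Returns true if both triangles wind the same way
// (i.e. the quad is not a bowtie).
bool StripQuadIsConsistent(const V2 v[4])
{
    float a0 = SignedArea(v[0], v[1], v[2]);
    float a1 = SignedArea(v[2], v[1], v[3]);
    return (a0 > 0) == (a1 > 0);
}
```

With perimeter ordering, one of the two triangles winds the wrong way, so (depending on culling) half the quad disappears or overlaps itself.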


  
----------Surface Creation--------------

DDSURFACEDESC2 ddsd;
DXInitStruct(ddsd); // this is the memset/size thing

ddsd.dwFlags = DDSD_CAPS | DDSD_HEIGHT | DDSD_WIDTH | DDSD_PIXELFORMAT;
ddsd.ddsCaps.dwCaps = DDSCAPS_TEXTURE | DDSCAPS_WRITEONLY;
ddsd.ddsCaps.dwCaps2 = DDSCAPS2_D3DTEXTUREMANAGE | DDSCAPS2_OPAQUE;
ddsd.ddsCaps.dwCaps3 = 0;
ddsd.ddsCaps.dwCaps4 = 0;
ddsd.dwHeight = y;
ddsd.dwWidth = x;
memcpy(&ddsd.ddpfPixelFormat, &textureFormats[chosenTexFormatIndex], sizeof(DDPIXELFORMAT));

IDirectDrawSurface7* surfaceData = 0;
if (HRESULT res = directDraw->CreateSurface(&ddsd, &surfaceData, NULL))
{
logFile << "Failed to create DDraw surface:" << GetDXReturnCode(res) << endl;
return 0;
}


---------Rendering each tile--------------

// Positions (vectors) first

const float LEFT = static_cast<float>(dest.x);
const float RIGHT = static_cast<float>(dest.x + dest.w);
const float TOP = static_cast<float>(dest.y);
const float BOTTOM = static_cast<float>(dest.y + dest.h);
const float Z_VAL = 0.1f;

const D3DVECTOR tl(LEFT, TOP, Z_VAL);
const D3DVECTOR tr(RIGHT, TOP, Z_VAL);
const D3DVECTOR bl(LEFT, BOTTOM, Z_VAL);
const D3DVECTOR br(RIGHT, BOTTOM, Z_VAL);

// Then actual vertices


opacity = 1.0;

const D3DCOLOR col = D3DRGBA(1.0, 1.0, 1.0, opacity);

D3DTLVERTEX spriteVert[4];

// Cast from Sprite to D3D7_Sprite

const D3D7_Sprite* d3dSprite = static_cast<const D3D7_Sprite*>(sprite);

spriteVert[0] = D3DTLVERTEX(tl, 1, col, col, d3dSprite->u1, d3dSprite->v1); // TL
spriteVert[1] = D3DTLVERTEX(tr, 1, col, col, d3dSprite->u2, d3dSprite->v1); // TR
spriteVert[2] = D3DTLVERTEX(bl, 1, col, col, d3dSprite->u1, d3dSprite->v2); // BL
spriteVert[3] = D3DTLVERTEX(br, 1, col, col, d3dSprite->u2, d3dSprite->v2); // BR


HRESULT hr;

// makes no difference

device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_CURRENT/*D3DTA_DIFFUSE*/);

// Allow vertex alpha, and texture alpha

device->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
device->SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_MODULATE);
device->SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_DIFFUSE);

#ifdef DEBUG
DWORD passes = -10;
hr = device->ValidateDevice(&passes);
Log::Get() << "ValidateDevice: " << GetDXReturnCode(hr) << ", passes: " << passes << endl;
#endif // DEBUG


// Set the appropriate texture

// Cast from Surface to D3D7_Surface

D3D7_Surface* d3dSurface = static_cast<D3D7_Surface*>(d3dSprite->surface);

if (!d3dSurface)
{
#ifdef DEBUG
Log::Get() << "D3D7_Graphics::Render: d3dSurface is NULL!" << endl;
#endif // DEBUG

return false;
}

// This gets the IDirectDrawSurface7* from the D3D7_Surface object

if (FAILED(hr = d3dSurface->GetDDSurface()->IsLost()))
{
#ifdef DEBUG
Log::Get() << "D3D7_Graphics::Render: surface->IsLost(): " << GetDXReturnCode(hr) << endl;
#endif // DEBUG

return false;
}

// Use that as the texture

if (FAILED(hr = device->SetTexture(0, d3dSurface->GetDDSurface())))
{
#ifdef DEBUG
Log::Get() << "D3D7_Graphics::Render: SetTexture: " << GetDXReturnCode(hr) << endl;
#endif // DEBUG

return false;
}

// Draw the object using a DrawPrimitive() call.

if (FAILED(hr = device->DrawPrimitive(D3DPT_TRIANGLESTRIP, D3DFVF_TLVERTEX,
(void*)&spriteVert[0], 4, NULL)))
{
#ifdef DEBUG
Log::Get() << "D3D7_Graphics::Render: DrawPrimitive: " << GetDXReturnCode(hr) << endl;
#endif // DEBUG

return false;
}


That's my drawing and surface creation code. Will post the init code too if the above checks out ok.


I've looked at the code but haven't found anything out of the ordinary. In case you're letting d3d do the lighting for you: the docs recommend using the color vertex and material source renderstate flags to tell the api to take color+alpha from the vertices, not the material. You can post the rest of the code and I'll take a look at it. I can't compile it, though, because I'm on dx9, but I have the dx7 docs that I use to look things up.

P.S. Are you setting the dwSize member in DXInitStruct(ddsd)? Your comments indicate you are, but I want to make sure.
