D3D7 Texturing woes

Started by Kylotan
13 comments, last by Kylotan
Sorry for the undescriptive subject; I'm at a loss for how to describe some of the issues I come across. And yes, I know I'm now 2 whole versions of DirectX behind. I want to finish this project rather than keep restarting when a new API comes out. Not to mention that I want this to run on Cyrix processors, which DX8 doesn't, AFAIK. Anyway...

Problem one: these surface creation caps for my textures work:

dwCaps = DDSCAPS_TEXTURE | DDSCAPS_WRITEONLY;
dwCaps2 = DDSCAPS2_D3DTEXTUREMANAGE | DDSCAPS2_HINTSTATIC;
dwCaps3 = dwCaps4 = 0;

yet this does not:

dwCaps = DDSCAPS_TEXTURE;
dwCaps2 = dwCaps3 = dwCaps4 = 0;

The textures get corrupted, either appearing as solid colours or as typical garbage memory. Why? All the other flags should just be hints, I believe, so why should removing them break anything? Surely the pixel format is the only thing that should determine whether the textures are OK or not. It actually shows bits of the wrong texture when I use the second method, as if the addressing gets messed up.

Problem two: what could cause Direct3D to fail to correctly texture a primitive if textures are used in a certain order? If the first texture I use in a frame is texture X, then all other primitives (except those also using texture X) are rendered as solid blocks of colour, as if all the texture coordinates were equal to 0 (they're not). However, if the first texture is not texture X, then all primitives are rendered perfectly, including those using texture X. It just doesn't like that texture going first in a frame. All primitives are rendered with exactly the same calls, the only difference being what is passed to SetTexture, which is never NULL and never returns a failure value. Nor does DrawPrimitive. All texture surfaces are created the same way, with the same flags (i.e. the working ones above). Testing on a second PC returns DDERR_INVALIDPIXELFORMAT from DrawPrimitive, despite it having selected the same pixel format (32-bit ARGB) on that PC too. I've been working on this for hours now and I'm lost as to what could be causing it.
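
For concreteness, here is a minimal sketch of the kind of surface description being compared above. The variable names (ddraw, texFormat), the size and the layout are illustrative rather than taken from the poster's code; texFormat is assumed to be a DDPIXELFORMAT obtained from EnumTextureFormats.

// Illustrative sketch of the two caps configurations being compared.
// Assumes ddraw is a valid IDirectDraw7* and texFormat came from EnumTextureFormats.
DDSURFACEDESC2 ddsd;
ZeroMemory(&ddsd, sizeof(ddsd));
ddsd.dwSize = sizeof(ddsd);
ddsd.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
ddsd.dwWidth = 128;
ddsd.dwHeight = 128;
ddsd.ddpfPixelFormat = texFormat;                 // deep copy of the enumerated format

// Working combination from the post:
ddsd.ddsCaps.dwCaps  = DDSCAPS_TEXTURE | DDSCAPS_WRITEONLY;
ddsd.ddsCaps.dwCaps2 = DDSCAPS2_D3DTEXTUREMANAGE | DDSCAPS2_HINTSTATIC;

// Failing combination from the post (textures come out corrupted):
// ddsd.ddsCaps.dwCaps  = DDSCAPS_TEXTURE;
// ddsd.ddsCaps.dwCaps2 = 0;

LPDIRECTDRAWSURFACE7 texture = NULL;
HRESULT hr = ddraw->CreateSurface(&ddsd, &texture, NULL);
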
It's been AAAAAAAAAAAAGGGGESSSSS since I've done any 3D DX7 so I'm a tad rusty and have been trying to forget some of the nastiness - I'm trying to jog my memory with a quick look at the code and comments (I knew they'd come in handy for something) in the Pac-Man: Adventures in Time engine from 2000.

1) Any textures/surfaces/buffers which are NOT managed should be created ****BEFORE**** anything which is managed, or you can end up with a whole lot of pain (I won't go into details, but think along the lines of how the texture manager tracks how much available VRAM it has to play with...)

2) If mipmapping, specify DDSCAPS_COMPLEX

3) DON'T make your own DDPIXELFORMAT structures - only use what comes from EnumTextureFormats (see the sketch after this list).

4) All of Pac-Man:AIT uses the OPAQUE hint regardless of texture type to let the driver swizzle. Middle-aged nVidia chips only texture from swizzled surfaces, for example - though I think the driver should sort out problems with non-swizzled stuff by making a copy etc...

5) Obvious thing A: The texture created in the second method IS in video memory right? (or AGP assuming the chip/driver can texture out of AGP).

6) Obvious thing B: And how are you filling the texture? - when you lock the managed texture, you'll be locking a system memory copy so the pitch on the lock is always likely to be 0. On the video memory texture there's a big chance the pitch will not be 0, so all plain memcpy() type texture fills not behaving properly could cause exactly what you describe.

7) So the second problem you describe is absolutely only related to the texture (i.e. 0 other renderstates, stateblocks, SetTransforms etc. being called)? - it definitely sounds like something is unhappy with a texture being used. In DX7, if you ever get a failure which isn't documented as being able to come from Draw*Primitive(), then it almost always hints at something which was set up as a device state. I've seen similar mad return codes from Draw*Primitive() back on PM:AIT when the artists used non-square or non-power-of-2 textures on certain drivers. Definitely check the caps and the texture dimensions. Same goes for texture coordinate repeat counts, mip map levels, filter types etc...

8) RefRast? Debug Runtime? - do they tell you anything?
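
Picking up on point 3 above, here is a minimal sketch of pulling the format from EnumTextureFormats rather than hand-building one. The callback name and the selection criteria (32-bit, alpha pixels) are illustrative, and device is assumed to be a valid IDirect3DDevice7*:

// Illustrative: take a deep copy of the first 32-bit ARGB format the device reports.
static HRESULT CALLBACK PickFormat(DDPIXELFORMAT* pf, void* context)
{
    DDPIXELFORMAT* chosen = (DDPIXELFORMAT*)context;
    if ((pf->dwFlags & DDPF_RGB) && (pf->dwFlags & DDPF_ALPHAPIXELS) &&
        pf->dwRGBBitCount == 32 && pf->dwRGBAlphaBitMask != 0)
    {
        *chosen = *pf;               // deep copy, not a pointer
        return D3DENUMRET_CANCEL;    // stop enumerating
    }
    return D3DENUMRET_OK;            // keep looking
}

DDPIXELFORMAT texFormat;
ZeroMemory(&texFormat, sizeof(texFormat));
device->EnumTextureFormats(PickFormat, &texFormat);
// texFormat is then copied into DDSURFACEDESC2::ddpfPixelFormat at creation time.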

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com


Thanks again for the help, Simon. I wish I had either your knowledge or patience, never mind both.

1) I'm trying to eliminate grief by creating all texture surfaces in the same way. They're all managed (as seen in the top snippet) and are created after the front/back buffer, etc. Once the basic initialisation is done, all surfaces are created equally. To be honest, the first 'problem' is not really a problem for me, just something that doesn't appear to be right, and which I thought might hint at some deeper issue that I need to address.

2) No mipmaps just yet. I figure I should get it working right before I get it looking good.

3) My pixel format structures are formed from enumeration by looking for a format that is 32 bits, has DDPF_ALPHAPIXELS set, and has a non-zero alpha bit mask. In practice this always yields just one format, which is A8_R8_G8_B8. I just copy that format, and then copy it back to DDSURFACEDESC2.ddpfPixelFormat when creating new surfaces for textures. As stated above, this method works fine as long as I use the top set of surface creation caps but not the second set (which I copied from the MTexture example in the SDK, which runs perfectly).

4) The docs say the opaque hint "indicates to the driver that this surface will never be locked again." In that case, how do I fill it with the texture data? I need to lock each texture just once, to fill it, and then never again.

5) No idea - I figured that it would use whatever is appropriate, just not necessarily the most efficient way. I copied the capabilities flags from an SDK example that works, so I assumed it would be fine. See my answer to (1) though.

6) If lPitch can be zero, then my docs are wrong... they imply that pitch must be at least equal to the width (multiplied by bytes per pixel) and sometimes more, depending on padding. I fill a pixel at a time, and move down by an amount equal to lPitch for each row, as suggested in the docs (see the fill sketch after these points). It seems to work for everything.

7) My program is quite simple - it's just a basic tile engine using D3D to draw the sprites. So everything is just a 2-triangle strip drawn with DrawPrimitive. For each tile, it sets up the alpha blending stuff with SetTextureStageState (this is constant across all textures), it calls SetTexture for stage 0 with the relevant DirectDraw surface, then it calls DrawPrimitive with a 4-vertex triangle strip. The only thing that varies is the surface passed to SetTexture. And I know the surfaces are ok, because they render just fine if sent in the 'acceptable' order. My test program alternates between the right order and the wrong order each frame, to prove that the textures themselves are not corrupted.

DrawPrimitive doesn't set off the FAILED() macro, so I assume it's returning ok, rather than returning something unknown. Which caps should I be checking for (see the caps sketch after these points)? I'm using a GeForce2 MX here, so I didn't think the old power-of-two restrictions and the like would be an issue. But, for what it's worth, the texture that always works is 128x128, and other textures (which only work as long as the previous one isn't rendered first) are more irregular. Hmm... maybe I should make it 129x129 and see if it breaks too.

8) The reference rasterizer seems even worse. It seems to draw everything wrongly the first time through, and by the 3rd or 4th frame it only ever renders one of the primitives. No failures from any of the calls, of course. And I must not have the debug runtime installed, since the debug slider is greyed out in the control panel.
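
On the lPitch question in point 6, here is a minimal sketch of a pitch-aware, lock-once fill of a 32-bit texture. GetSourcePixel() is a hypothetical accessor for the source image, not something from the posts, and texture is assumed to be the created DirectDraw texture surface:

// Illustrative: lock once, write row by row stepping by lPitch, unlock, never lock again.
DDSURFACEDESC2 ddsd;
ZeroMemory(&ddsd, sizeof(ddsd));
ddsd.dwSize = sizeof(ddsd);

if (SUCCEEDED(texture->Lock(NULL, &ddsd, DDLOCK_WAIT | DDLOCK_WRITEONLY, NULL)))
{
    BYTE* row = (BYTE*)ddsd.lpSurface;
    for (DWORD y = 0; y < ddsd.dwHeight; ++y)
    {
        DWORD* pixels = (DWORD*)row;
        for (DWORD x = 0; x < ddsd.dwWidth; ++x)
            pixels[x] = GetSourcePixel(x, y);    // hypothetical ARGB source accessor
        row += ddsd.lPitch;                      // pitch may be wider than dwWidth * 4
    }
    texture->Unlock(NULL);
}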
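
And on the 'which caps should I be checking for' question in point 7, a minimal sketch of the sort of checks being suggested, assuming device is the IDirect3DDevice7*:

// Illustrative: query the device caps and check the texture restrictions mentioned above.
D3DDEVICEDESC7 desc;
ZeroMemory(&desc, sizeof(desc));
device->GetCaps(&desc);

bool pow2Only   = (desc.dpcTriCaps.dwTextureCaps & D3DPTEXTURECAPS_POW2) != 0;
bool squareOnly = (desc.dpcTriCaps.dwTextureCaps & D3DPTEXTURECAPS_SQUAREONLY) != 0;
DWORD maxWidth  = desc.dwMaxTextureWidth;
DWORD maxHeight = desc.dwMaxTextureHeight;
// Reject or rescale any texture that violates these before creating its surface.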

1 & 5) Ah yup, I just remembered the DDSCAPS2_D3DTEXTUREMANAGE hint isn't the way you specify managed versus non-managed textures - I forgot that when I posted the above (I was reading it as the difference between managed and non-managed).
As an experiment, try copying just the DDSCAPS2_D3DTEXTUREMANAGE flag to the second method. That lets the driver take over resource management from D3D.

3) That should be ok assuming you're taking a DEEP copy of the DDPIXELFORMAT structure rather than just a pointer (I still suspect something is going on concerning pixel formats).

4) The docs are a tad misleading - you can lock and unlock it ONCE to fill in the data. It really just lets the driver know that it's ok to take a long time to return from Lock/Unlock calls and to put the texture in the slowest place for the CPU if that makes the 3D faster. That allows the driver to do things like swizzle and compress the surface, as well as putting it in true video memory (sometimes drivers ignore the memory types you specify if they see you doing bad things).

6) That's what I get for posting that early in the morning after drinking - what I meant is that the pitch might be wider than the surface - and that could change with different flags.

quote: "DrawPrimitive doesn't set off the FAILED() macro, so I assume it's returning ok, rather than returning something unknown"


I was actually referring to what you said about testing problem two on a second PC and DrawPrimitive returning DDERR_INVALIDPIXELFORMAT - which is interesting, since it looks like the other PC doesn't have the debug runtime installed, so that could also be the problem on the first PC (but the retail runtimes won't report many things).


The fact that REFRAST is going mad too definitely indicates some kind of issue in your code rather than, say, a bad driver.

First plan of action IMO is to get the debug runtime working on your main PC and get REFRAST to tell you what it can.
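
For what it's worth, a minimal sketch of creating the reference rasterizer device for a debugging run. This is illustrative only; d3d and backBuffer are assumed to be an existing IDirect3D7* and render-target surface:

// Illustrative: create the software reference device instead of the HAL/TnL HAL device.
// The reference rasterizer is slow but strict, which helps flush out bad state.
LPDIRECT3DDEVICE7 refDevice = NULL;
HRESULT hr = d3d->CreateDevice(IID_IDirect3DRefDevice, backBuffer, &refDevice);
if (FAILED(hr))
{
    // The reference device may not be enumerated/created without the SDK or
    // debug runtime installed.
}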




--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com


Any hints on how to get the DX7 debug runtimes on here without potentially damaging my DX8 runtimes? Should I just install the DX8 SDK and keep the DX7 docs handy?

Incidentally, changing to DDSCAPS2_OPAQUE, or switching between DDSCAPS2_TEXTUREMANAGE and DDSCAPS2_D3DTEXTUREMANAGE, didn't appear to do much. Basically, the corruption seems to appear any time that I don't specify one of those 2 texture management flags.

And I can't see how it could be a problem with the pixel format when the textures render perfectly as long as they're not used in the non-optimal order. But, who knows. I'm definitely taking deep copies of the pixel format structure (via memcpy) and it should be ok.

When I downloaded the DX9 SDK and installed it on the DX8 (debug) runtime, the 'debug' option was disabled under the Control Panel DX settings. I then downloaded the DX9 debug runtime, installed it, and got the 'debug' option working again. MS provides the DX 7.0a SDK as a download, so perhaps the whole package includes the debug runtime as well. I would recommend you do get the debug runtime and crank the lever waaaayyyyy up.

That texture loading problem you described is bizarre; I haven't encountered it myself. Double-check your render states and texture stage setup. Disable all texture stages you're not using (see the sketch below). I'm not sure, but I think MS states that some drivers don't like a texture to be set in the first stage (Voodoo?) - maybe I'm dreaming or something. Check the return value from all DX function calls if you're not doing so already. Sprinkle your code with asserts as well. Trace your rendering function through a debugger and check whether any handles/pointers are not being set to null. Good luck!
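
A minimal sketch of the 'disable the stages you aren't using and check every return value' advice, purely illustrative (device is assumed to be the IDirect3DDevice7*, and assert comes from <assert.h>):

// Illustrative: explicitly close off stage 1 so the cascade stops after stage 0,
// and check the return codes so failures are noticed immediately.
HRESULT hr;
hr = device->SetTextureStageState(1, D3DTSS_COLOROP, D3DTOP_DISABLE);
assert(SUCCEEDED(hr));
hr = device->SetTextureStageState(1, D3DTSS_ALPHAOP, D3DTOP_DISABLE);
assert(SUCCEEDED(hr));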

Ok... what exactly is this debug runtime supposed to do? I (think) I got it installed, made sure the debug slider was up to full, ran my program in windowed mode under the debugger, and... the debug output is identical to when I was using the release libraries. I get nothing from the reference rasteriser either, except DDERR_NODIRECTDRAWHW errors from DrawPrimitive. *sigh* (EDIT: I'm assuming that reinstalling the DX7 SDK has screwed up my system at this point, since enumeration no longer returns the HAL or TnL devices. Grr.)

My render states and texture options are trivial (although probably not correct). Here's all my settings, in rough order:
SetRenderState(D3DRENDERSTATE_LIGHTING, FALSE);
SetRenderState(D3DRENDERSTATE_ALPHABLENDENABLE, TRUE);
SetRenderState(D3DRENDERSTATE_SRCBLEND, D3DBLEND_SRCALPHA);
SetRenderState(D3DRENDERSTATE_DESTBLEND, D3DBLEND_INVSRCALPHA);
SetTextureStageState(0, D3DTSS_MAGFILTER, D3DTFG_POINT);
SetTextureStageState(0, D3DTSS_MINFILTER, D3DTFN_POINT);
SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_DIFFUSE);
SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_TEXTURE);
SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_MODULATE);


I'm really stuck for ideas now... I'm gonna keep experimenting with totally random things, but any suggestions are welcome, even if just on how to get some debugging information.


How are you setting the color stage? Look through the IDirect3DDevice7::ValidateDevice() method for some info on compatibility with hw devices. I would also set the frame buffer alpha op; it should be ADD by default, but who knows? Make sure you have the latest drivers installed for your card.

I can't find the info now, but I distinctly remember an nvidia sample code comment mentioning dx7 having screwed-up alpha or some such. I think I've seen it in the slimdot3bump sample app. I looked through it just now but can't find the info again. Maybe someone remembers what that bug was about.

The debug runtime will do some DX internal checks and will report errors to you. It's silent, so perhaps dx7 has some internal bug. I've done some crazy blending in my 3d app, so I know the nvidia drivers can take the punishment - they're the gold standard.
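
For reference, a minimal sketch of the ValidateDevice() check being suggested; illustrative only, and meant to be called after all states and textures have been set:

// Illustrative: ask the driver whether the current texture/blend setup can be rendered,
// and in how many passes.
DWORD passes = 0;
HRESULT hr = device->ValidateDevice(&passes);
if (FAILED(hr))
{
    // Codes such as D3DERR_CONFLICTINGTEXTUREFILTER or D3DERR_UNSUPPORTEDCOLOROPERATION
    // point at the part of the setup the hardware dislikes.
}
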
Kylotan, I should have suggested this from the start, because that's what I do when things start going south. Rip out your rendering code, etc., build a test app out of it, then run it. This way, if it works, you eliminate the rendering algorithm as a possible problem. It's much easier to trace calls in a small test app, making sure things are set the way they're supposed to be, than to trace a big app with lots of calls unrelated to gfx.
Colour stages... direct from my code:
// makes no difference
device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_CURRENT /*D3DTA_DIFFUSE*/);


ValidateDevice() returns 1 for everything, as expected, and I only use texture stage 0.

I did some research recently; the latest drivers for my card are poor, so I installed one of the better and more reliable drivers from the past. It's interesting that people say Nvidia drivers are so good when, in fact, a lot of people have trouble with them. It makes me wonder just how bad the drivers for other companies are...

The debug runtime has never done anything for me. Is there a way I can force it to output something (e.g. a call combination that should never work) so that I can test whether it's actually working?
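
One common way to check whether the debug runtime is active is to make a call that is obviously invalid and watch the debugger's output window; with the debug output level turned up it normally complains about bad parameters, while the retail runtime tends to stay silent. A minimal, purely illustrative sketch (the exact message, if any, depends on the runtime and driver):

// Deliberately invalid call, only to see whether the debug runtime says anything.
// Remove it once the runtime is confirmed to be working.
HRESULT hr = device->DrawPrimitive(D3DPT_TRIANGLESTRIP, D3DFVF_TLVERTEX, NULL, 0, 0);
// Expect a failure return; with the debug runtime there should also be a message
// in the output window explaining what was wrong.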

As for removing non-rendering code... there is virtually nothing in this program except rendering code. So much for writing a quick 2D game...



