D3DXCreateTextureFromFileEx causes breakpoints on any IDirect3DDevice9 function

I used to load a texture like this:

D3DXCreateTextureFromFile(gD3DDevice, "Images\\frames.bmp", &temptex);

which was working OK, except that I don't want it to stretch my textures to power of 2 dimensions, so I changed it to this:

HR(D3DXCreateTextureFromFileEx(gD3DDevice, "Images\\frames.bmp", 
				D3DX_DEFAULT_NONPOW2, D3DX_DEFAULT_NONPOW2, 
				D3DX_DEFAULT, 0, D3DFMT_UNKNOWN, D3DPOOL_DEFAULT,
				D3DX_FILTER_NONE, D3DX_FILTER_NONE, 0xFFFFFFFF,
				NULL, NULL, &temptex));

As far as I can tell the texture loads fine, and it certainly displays how I want it to. Only problem is, every time I do something that triggers an OnResetDevice (move/resize/maximise the window etc), I continually get breakpoints. The following functions cause breakpoints:

IDirect3DDevice9::Reset
IDirect3DDevice9::SetTransform
IDirect3DDevice9::SetSamplerState
IDirect3DDevice9::SetRenderState
IDirect3DDevice9::SetTextureStageState

At that point I stopped clicking continue, but it appears any function from my D3D device is causing breakpoints. Oh, and I'm handling the return values from all these functions with this macro, if it helps at all:

#define HR(x)                                          \
	{                                                  \
		HRESULT hr = x;                                \
		if(FAILED(hr))                                 \
		{                                              \
			DXTrace(__FILE__, __LINE__, hr, #x, TRUE); \
		}                                              \
	}

And no, the macro isn't mine. It's from a book. cheers, metal
What do the Debug Runtimes say? And have you checked that your card actually supports non power of 2 textures (D3DCAPS9's TextureCaps member doesn't have D3DPTEXTURECAPS_NONPOW2CONDITIONAL or D3DPTEXTURECAPS_POW2 set)?
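For reference, the caps check looks roughly like this (just a sketch, using the gD3DDevice pointer from your code above):

// Sketch: query the device caps and look at the texture flags.
D3DCAPS9 caps;
if(SUCCEEDED(gD3DDevice->GetDeviceCaps(&caps)))
{
	bool pow2Only    = (caps.TextureCaps & D3DPTEXTURECAPS_POW2) != 0;
	bool conditional = (caps.TextureCaps & D3DPTEXTURECAPS_NONPOW2CONDITIONAL) != 0;

	if(!pow2Only)
	{
		// Full non-power-of-2 texture support.
	}
	else if(conditional)
	{
		// Non-pow2 allowed only with restrictions (clamp addressing, no mipmaps, etc).
	}
	else
	{
		// Only power-of-2 textures are supported.
	}
}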

EDIT: Actually, the problem is that your texture is now in the default pool, when it used to be in the managed pool. When you reset the device, you need to Release() all resources in the default pool, Reset(), then reload them. Which is exactly what the debug runtimes will tell you when you use them.
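Roughly, the reset path ends up looking something like this (just a sketch; gPresentParams and gFramesTex are placeholder names for your present parameters and the default-pool texture):

// Sketch: handling a lost device when resources live in D3DPOOL_DEFAULT.
HRESULT hr = gD3DDevice->TestCooperativeLevel();
if(hr == D3DERR_DEVICELOST)
{
	Sleep(50); // device is lost and can't be reset yet - try again later
}
else if(hr == D3DERR_DEVICENOTRESET)
{
	// 1. Release everything created in the default pool.
	if(gFramesTex) { gFramesTex->Release(); gFramesTex = NULL; }

	// 2. Reset the device with the same present parameters.
	HR(gD3DDevice->Reset(&gPresentParams));

	// 3. Recreate the default-pool resources (same D3DXCreateTextureFromFileEx call as before).
}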
OK. TBH when I switched to using the Ex function I just used NULL or a default value for anything I wasn't sure of.
So is the solution to change the Pool argument from D3DPOOL_DEFAULT to D3DPOOL_MANAGED? Or is it to release and reload the textures? Like I say, there's no particular reason I put them in the default pool, I just picked it because it's the default. Are there any other values I've used that differ from what the simple D3DXCreateTextureFromFile function uses?

cheers,
metal

EDIT: the CAPS member TextureCaps is 0. I would hope it supports anything D3DXSprite can throw at it, it's a GeForce 8600.
Quote:Original post by metalmidget
OK. TBH when I switched to using the Ex function I just used NULL or a default value for anything I wasn't sure of.
So is the solution to change the Pool argument from D3DPOOL_DEFAULT to D3DPOOL_MANAGED? Or is it to release and reload the textures? Like I say, there's no particular reason I put them in the default pool, I just picked it because it's the default. Are there any other values I've used that differ from what the simple D3DXCreateTextureFromFile function uses?
You'll almost always want to use the managed pool. The default pool says "I need this texture to always exist in video memory. You are not allowed to ever remove it unless I explicitly tell you to". There are a few times you need to use the default pool: for dynamic resources and certain types of resource like render targets (since a render target gets written to on the GPU, it needs to be available in video memory).

So yes, just changing to D3DPOOL_MANAGED should fix the problem. The documentation for D3DXCreateTextureFromFile says:
Quote:The function is equivalent to D3DXCreateTextureFromFileEx(pDevice, pSrcFile, D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT, 0, D3DFMT_UNKNOWN, D3DPOOL_MANAGED, D3DX_DEFAULT, D3DX_DEFAULT, 0, NULL, NULL, ppTexture).
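So a version that keeps all of those defaults but doesn't round up to a power of 2 would be something like this (untested sketch, keeping your existing gD3DDevice/temptex names):

HR(D3DXCreateTextureFromFileEx(gD3DDevice, "Images\\frames.bmp",
				D3DX_DEFAULT_NONPOW2, D3DX_DEFAULT_NONPOW2, // keep the source dimensions
				D3DX_DEFAULT, 0, D3DFMT_UNKNOWN, D3DPOOL_MANAGED,
				D3DX_DEFAULT, D3DX_DEFAULT, 0,
				NULL, NULL, &temptex));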
Quote:Original post by Evil Steve

So yes, just changing to D3DPOOL_MANAGED should fix the problem. The documentation for D3DXCreateTextureFromFile says:
Quote:The function is equivalent to D3DXCreateTextureFromFileEx(pDevice, pSrcFile, D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT, 0, D3DFMT_UNKNOWN, D3DPOOL_MANAGED, D3DX_DEFAULT, D3DX_DEFAULT, 0, NULL, NULL, ppTexture).


Oh whoops, I probably should have checked for something like that. It did occur to me, but I think I had it in my head for some reason that I'd checked before and not found it...

OK so I change my D3DXCreateTextureFromFileEx call to match D3DXCreateTextureFromFile, other than telling it not to stretch my textures. Now my program is failing to create the IDirect3DDevice9 interface. Also, the first thing I see in my output window is the following:
"D3D9 Helper: Enhanced D3DDebugging disabled; Application was not compiled with D3D_DEBUG_INFO"
I already went to the DX control panel and made sure it was set to run in debug mode. What else do I need to do to get the extra debugging info?
Quote:Original post by metalmidget
OK so I change my D3DXCreateTextureFromFileEx call to match D3DXCreateTextureFromFile, other than telling it not to stretch my textures. Now my program is failing to create the IDirect3DDevice9 interface. Also, the first thing I see in my output window is the following:
"D3D9 Helper: Enhanced D3DDebugging disabled; Application was not compiled with D3D_DEBUG_INFO"
I already went to the DX control panel and made sure it was set to run in debug mode. What else do I need to do to get the extra debugging info?
Those two things are unrelated. If your app is refusing to create the device, it'll tell you why in the debug output.
The second point lets you see some member variables of the D3D interfaces (e.g. you can see the present parameters of the device in the variable watch window), but it's frequently wrong, and even when it's not, it's rarely useful. Still, if you want to see it, you can define D3D_DEBUG_INFO in your project's preprocessor settings.
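Alternatively, if you'd rather do it in code than in the project settings, defining it before the D3D headers in debug builds has the same effect (a sketch):

// Make the debug members of the D3D9 interfaces visible in the watch window.
// Only do this in debug builds.
#if defined(_DEBUG) && !defined(D3D_DEBUG_INFO)
#define D3D_DEBUG_INFO
#endif
#include <d3d9.h>
#include <d3dx9.h>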
This is getting really weird. The debug output is telling me "Direct3D9: (WARN) :HW device not available. GetAdapterCaps fails."
It says it 4 times throughout my D3D initialisation, before reaching the device creation, which fails.
That makes no sense! I haven't changed any of that code!!!
I tried going into a past project which uses the same D3D intialisation code, which I definitely haven't changed, and now that won't create the device either! :'( It's giving the same warning about GetAdapterCaps failing.
The only thing I can think of is that turning on the extra debugging has buggered something up. It's the only thing I've changed that could affect both projects.
Any other ideas?

cheers,
metal
*phew*
OK, so I went and turned all the extra debugging off, and everything works as it should now. Not sure what was causing the problem though...
Thanks for that.

cheers,
metal
Quote:Original post by metalmidget
*phew*
OK, so I went and turned all the extra debugging off, and everything works as it should now. Not sure what was causing the problem though...
Thanks for that.

cheers,
metal
You've got "Enable Hardware Acceleration" unchecked in the control panel. Or you're using the March 2008 SDK (I think it's March, anyway), and there's a bug in the control panel where the checkbox has the opposite meaning (so it should be unchecked).

Disabling the debug runtimes isn't a good idea while you're developing, really - it's just coding blind.
Quote:Original post by Evil Steve
You've got "Enable Hardware Acceleration" unchecked in the control panel. Or you're using the March 2008 SDK (I think it's March, anyway), and there's a bug in the control panel where the checkbox has the opposite meaning (so it should be unchecked).

Disabling the debug runtimes isn't a good idea while you're developing, really - it's just coding blind.


Nope, my apps run when and only when "Enable Hardware Acceleration" is unchecked. And I have the November 2007 SDK.
I'm aware it's a bad idea to turn it off. I'd much rather have it on, but at this stage I'd rather have it off than have my apps not run at all.
...
Actually, I can turn on all the other options that were shown in that link you posted. It's just the hardware acceleration that buggers it up.

cheers,
metal

