Malazar

Invalid texture pointer?


I've been using a class to create sprites and load the texture, and in a previous program I used this class a lot. However, in my new project I get "D3DX: pTexture pointer is invalid" when I run the program, and am presented with a blank screen. The pointer to the texture is declared: LPDIRECT3DTEXTURE9 g_pPlayerTexture = NULL; and is only ever referred to as g_pPlayerTexture. In my previous programs this works perfectly, and I cannot for the life of me think why it's all of a sudden invalid. Any thoughts?

Where in your new project do you change the value of g_pPlayerTexture (i.e. where does it get pointed at a valid texture)?

Can you post the relevant code here?

I'm using this texture loading function:

D3DXCreateTextureFromFileEx( g_pd3dDevice,
                             "Filename.bmp",
                             160,               // width
                             120,               // height
                             1,                 // mip levels
                             D3DPOOL_DEFAULT,   // usage
                             D3DFMT_UNKNOWN,    // format
                             D3DPOOL_DEFAULT,   // pool
                             D3DX_DEFAULT,      // filter
                             D3DX_DEFAULT,      // mip filter
                             D3DCOLOR_COLORVALUE(0.0f,0.0f,0.0f,1.0f), // color key
                             NULL,              // source info
                             NULL,              // palette
                             &g_pPlayerTexture );
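For what it's worth, D3DXCreateTextureFromFileEx reports failure through its HRESULT return value, so checking the result of the call above (rather than only the pointer afterwards) surfaces the problem right at the source. A minimal sketch, assuming the same globals as above, with the middle arguments elided for brevity:

```
// Sketch: capture and check the result of the call above.
HRESULT hr = D3DXCreateTextureFromFileEx( g_pd3dDevice,
                                          "Filename.bmp",
                                          /* ...remaining arguments as above... */
                                          &g_pPlayerTexture );
if ( FAILED( hr ) )
{
    // hr now holds the error code, e.g. D3DERR_INVALIDCALL for bad
    // arguments, or D3DXERR_INVALIDDATA if the file could not be read.
    OutputDebugStringA( "D3DXCreateTextureFromFileEx failed\n" );
}
```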

I'm not certain I know what you mean by "creation"; however, the pointer is still NULL when it reaches my sprite drawing function:

RECT PlayerRect;
PlayerRect.top = TOP;
PlayerRect.left = LEFT;
PlayerRect.bottom = BOTTOM;
PlayerRect.right = RIGHT;

D3DXVECTOR3 v_HCenter( 0.0f, 0.0f, 0.0f );
D3DXVECTOR3 v_HPos( Pos_x, Pos_y, Pos_z );

g_pPlayerSprite->Begin( D3DXSPRITE_ALPHABLEND );
g_pPlayerSprite->Draw( g_pPlayerTexture,
&PlayerRect,
&v_HCenter,
&v_HPos,
D3DCOLOR_COLORVALUE(1.0f, 1.0f, 1.0f, 1.0f) );
g_pPlayerSprite->End();

EDIT:
So I know where this thing is bugged: it's still NULLed here. However, I can't figure out WHY (and thus can't fix it). Does anyone know why the texture is still NULL and how to fix it?
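One way to make this failure loud instead of silent is a guard at the top of the draw function. A sketch, assuming the same globals as in the code above:

```
// Sketch: bail out (loudly) if the texture never got created,
// instead of letting ID3DXSprite::Draw run with a NULL pointer.
if ( g_pPlayerTexture == NULL )
{
    OutputDebugStringA( "g_pPlayerTexture is NULL - load failed or never ran\n" );
    return;
}
```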

[Edited by - Malazar on May 23, 2007 6:22:37 PM]

It must be something in your texture creation.

Try the following:


D3DXCreateTextureFromFileEx( g_pd3dDevice,
                             "Filename.bmp",
                             D3DX_DEFAULT,      // width: taken from file
                             D3DX_DEFAULT,      // height: taken from file
                             D3DX_DEFAULT,      // mip levels
                             0,                 // usage: none
                             D3DFMT_UNKNOWN,    // format: taken from file
                             D3DPOOL_DEFAULT,   // pool
                             D3DX_DEFAULT,      // filter
                             D3DX_DEFAULT,      // mip filter
                             D3DCOLOR_COLORVALUE(0.0f,0.0f,0.0f,1.0f), // color key
                             NULL,              // source info
                             NULL,              // palette
                             &g_pPlayerTexture );


Note that the texture size will now be taken from the bitmap file and the mipmap level is set to default.
Another thing which might be the problem: in your original call, the Usage parameter (the one right after MipLevels) is not a D3DPOOL value, but you passed D3DPOOL_DEFAULT there. It takes a D3DUSAGE flag, or 0 if you don't need a specific usage for this texture.

Assuming you're using MSVC 2005, put a breakpoint on the line after the D3DXCreateTextureFromFileEx() call. When the debugger hits it, go to Debug -> Breakpoints -> New Data Breakpoint (I think; that's just off the top of my head) and enter &g_pPlayerTexture there. Then you'll get a breakpoint whenever that variable changes.

Thanks guys. Added the breakpoints and found out where it was going wrong: my load texture function in my main cpp file wasn't working properly. I've got it fixed now.

Also, designerx, I find that setting the texture size to default causes my program to slow down massively, whereas hard-coding the size makes it run nice and smooth :D

