ferrant

D3DXCreateTexture problem


Hi, I am trying to display a texture that I fill from memory. When I load it from a file (using D3DXCreateTextureFromFileEx) it displays fine, but when I create the texture and fill it myself it doesn't display at all. Here's the code I use; the data is a 512x512 array of shorts. When I replace these lines with the D3DXCreateTextureFromFileEx call it works fine. Does anyone have a hint to help me out?

D3DXCreateTexture( pd3dDevice, 512, 512, 1, D3DUSAGE_DYNAMIC, D3DFMT_L16, D3DPOOL_DEFAULT, &pTileTexture );

D3DLOCKED_RECT lr;
if( FAILED( pTileTexture->LockRect( 0, &lr, NULL, D3DLOCK_DISCARD ) ) )
{
    ...
}

memcpy( (unsigned short *)lr.pBits, myArray, sizeof(unsigned short) * 512 * 512 );
pTileTexture->UnlockRect( 0 );

You cannot use a single memcpy to fill a D3D texture. You need to take the pitch of the surface into account, not just the width. The driver may allocate a texture surface that is larger than what you requested, for memory alignment reasons or to store additional information that it requires. In your case, you request a 512x512 texture; the driver may allocate a surface of, say, 600x512 instead.

By the way, you _really_ should check the return values of these functions. Creating a texture, locking it, and then writing to its memory without checking for any errors at all is bound to come back and bite you.
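For example, a minimal sketch of what the creation and lock might look like with the checks in place (early returns are just one way to handle the failures):

// Sketch only: create the texture, lock it, and check every HRESULT along the way.
// Assumes pd3dDevice is a valid IDirect3DDevice9*, as in the original snippet.
LPDIRECT3DTEXTURE9 pTileTexture = NULL;
HRESULT hr = D3DXCreateTexture( pd3dDevice, 512, 512, 1, D3DUSAGE_DYNAMIC,
                                D3DFMT_L16, D3DPOOL_DEFAULT, &pTileTexture );
if( FAILED( hr ) )
    return hr;  // creation failed; nothing to clean up yet

D3DLOCKED_RECT lr;
hr = pTileTexture->LockRect( 0, &lr, NULL, D3DLOCK_DISCARD );
if( FAILED( hr ) )
{
    pTileTexture->Release();
    return hr;
}

// ... fill the surface here, respecting lr.Pitch (see the loop below) ...

hr = pTileTexture->UnlockRect( 0 );
if( FAILED( hr ) )
    return hr;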

Assuming the variable "myArray" is a one-dimensional array, this is how you should fill the texture:

unsigned short *pTextureSurface = reinterpret_cast<unsigned short*>( lr.pBits );
for( DWORD dwRow = 0; dwRow < 512; dwRow++ )
{
    // lr.Pitch is the number of bytes per row of the surface, which may be
    // larger than 512 * sizeof(unsigned short).
    DWORD dwBytesToSkip = dwRow * lr.Pitch; // row * bytes/row = bytes to skip
    for( DWORD dwCol = 0; dwCol < 512; dwCol++ )
    {
        // The reason we divide by 2 here is because each element of the texture is
        // a two-byte quantity, and pTextureSurface is indexed in shorts, not bytes.
        pTextureSurface[dwBytesToSkip/2 + dwCol] = myArray[dwRow*512 + dwCol];
    }
}
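If it helps, an equivalent way to do the same copy is one memcpy per row, advancing the destination pointer by the pitch each time. A minimal sketch, assuming the same 512x512 myArray of unsigned shorts:

// Sketch: copy one complete 512-texel row at a time, then step to the next
// row using the pitch, which is measured in bytes.
BYTE *pRow = reinterpret_cast<BYTE*>( lr.pBits );
for( DWORD dwRow = 0; dwRow < 512; dwRow++ )
{
    memcpy( pRow, &myArray[dwRow * 512], 512 * sizeof(unsigned short) );
    pRow += lr.Pitch;
}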

Hope this helps,
neneboricua

Thanks for the pitch tip!

Don't worry, my code does check for errors; I just trimmed it to keep the snippet a little shorter ;-)

Now, probably a stupid question, but I don't see anything on screen with unsigned shorts (not even a black background or so). Is this normal? Am I supposed to use a more displayable format than unsigned shorts? Reading a texture from a file works fine...

Any hints?

thanks,

matt-

It may depend on how you're trying to display it. When you load the texture from a file, it may load the texture as some other format that can be displayed more easily. Load the texture from the file and call GetLevelDesc to take a look at the format it's actually using.
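Something along these lines should do it (a sketch only; pFileTexture is a placeholder name for the texture you loaded from the file):

// Sketch: inspect the format D3DX actually chose when loading from a file.
D3DSURFACE_DESC desc;
if( SUCCEEDED( pFileTexture->GetLevelDesc( 0, &desc ) ) )
{
    // desc.Format is the real surface format (e.g. D3DFMT_A8R8G8B8), and
    // desc.Width / desc.Height are the dimensions actually allocated.
}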

One way to verify that your manually created texture contains the right data is to save it to a file and look at it offline. Use the D3DXSaveTextureToFile function to do this. Right after you unlock the texture, save it to a file and see if you can open it in some kind of image viewer. It's a quick and dirty way to make sure your texture is valid.
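Something like this right after the UnlockRect call (the file name is just an example; DDS should keep the L16 format as-is, whereas BMP or PNG will convert it):

// Sketch: dump the manually filled texture to disk so it can be inspected offline.
HRESULT hr = D3DXSaveTextureToFile( TEXT("tile_dump.dds"), D3DXIFF_DDS, pTileTexture, NULL );
if( FAILED( hr ) )
{
    // Handle or log the failure.
}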

neneboricua
