D3DXCreateTexture problem

2 comments, last by neneboricua19 18 years, 7 months ago
Hi, I am trying to display a texture that I load from memory. When I load it from a file (using D3DXCreateTextureFromFileEx) it displays fine, but when I try creating a texture and filling it myself it doesn't display at all. Here's the bit of code I use. The data is a 512x512 array of shorts. When I replace these lines with the CreateTextureFromFile call it works fine. Does anyone have a hint to help me out?

D3DXCreateTexture( pd3dDevice, 512, 512, 1, D3DUSAGE_DYNAMIC, D3DFMT_L16, D3DPOOL_DEFAULT, &pTileTexture );

D3DLOCKED_RECT lr;
if( FAILED( pTileTexture->LockRect( 0, &lr, NULL, D3DLOCK_DISCARD ) ) )
{
    ...
}

memcpy( (unsigned short *)lr.pBits, myArray, sizeof(unsigned short) * 512 * 512 );
pTileTexture->UnlockRect( 0 );
You cannot use a single memcpy to fill a D3D texture. You need to take into account the pitch of the surface, not just its width. The driver may allocate a texture surface that is larger than what you requested, for memory alignment reasons or to store additional information that it requires. In your case, you're requesting a texture size of 512x512, but the driver may allocate a surface of, say, 600x512 instead.

By the way, you _really_ should check your functions to see if they return errors. Creating a texture, locking it, and then accessing its memory without checking for any errors at all is bound to come back and bite you.

Assuming the variable "myArray" is a one-dimensional array, this is how you should fill the texture:
unsigned short *pTextureSurface = reinterpret_cast<unsigned short*>(lr.pBits);
for( DWORD dwRow = 0; dwRow < 512; dwRow++ )
{
    DWORD dwBytesToSkip = dwRow * lr.Pitch; // row * bytes/row = bytes to skip
    for( DWORD dwCol = 0; dwCol < 512; dwCol++ )
    {
        // The reason we divide by 2 here is because each element of the
        // texture is a two-byte quantity.
        pTextureSurface[dwBytesToSkip/2 + dwCol] = myArray[dwRow*512 + dwCol];
    }
}

Hope this helps,
neneboricua
Thanks for the pitch tip!

Don't worry, my code does check for errors; I left the checks out to keep the snippet a little shorter ;-)

Now, probably a stupid question, but I don't see anything on screen with unsigned shorts (not even a black background or anything). Is this normal? Am I supposed to use a more displayable format than unsigned shorts? Reading a texture from a file works fine...

Any hints ? ...

thanks,

matt-
It may depend on how you're trying to display it. When you load the texture from a file, it may load the texture as some other format that can be displayed more easily. Load the texture from the file and call GetLevelDesc to take a look at the format it's actually using.
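For example, a sketch of that check (assumes the DirectX 9 SDK headers and a valid LPDIRECT3DTEXTURE9 named pTexture that was loaded from the file; it won't build outside a D3D project):

```cpp
// Sketch only: pTexture is assumed to come from D3DXCreateTextureFromFileEx.
D3DSURFACE_DESC desc;
if( SUCCEEDED( pTexture->GetLevelDesc( 0, &desc ) ) )
{
    // desc.Format is what the texture actually ended up as. D3DX may have
    // converted the file's data to something like D3DFMT_A8R8G8B8 rather
    // than the format you expected.
    // desc.Width and desc.Height may also differ from the file's dimensions,
    // depending on device caps (e.g. power-of-two rounding).
}
```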

One way to verify that your manually created texture is being given the right data is to save it to a file and take a look at it offline. Use the D3DXSaveTextureToFile function to do this. Right after you unlock the texture, save it to a file and see if you can load it up in some kind of image viewer. It's a quick and dirty way to make sure your texture is valid.
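A sketch of that debug dump (assumes d3dx9.h and the pTileTexture from the original snippet; the filename is made up):

```cpp
// Sketch only: dump the texture to disk right after unlocking it,
// so it can be inspected in an external viewer.
pTileTexture->UnlockRect( 0 );
HRESULT hr = D3DXSaveTextureToFile( "tile_debug.dds", // hypothetical filename
                                    D3DXIFF_DDS,      // DDS can hold L16 data;
                                                      // BMP/JPG would convert it
                                    pTileTexture,
                                    NULL );           // no palette
```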

neneboricua

