
D3DXCreateTextureFromFileEx question


Recommended Posts

Will the following code load a 16-bit image and convert it to 32-bit, with 8 bits (one byte) for each colour component (D3DFMT_A8R8G8B8)?
// Load the texture, requesting D3DFMT_A8R8G8B8 regardless of the file's own format
HRESULT hr = D3DXCreateTextureFromFileEx(gD3dDevice, filename,
    0, 0,                               // width/height: 0 = take from the file
    1, 0, D3DFMT_A8R8G8B8,              // mip levels, usage, requested format
    D3DPOOL_MANAGED, D3DX_FILTER_NONE, D3DX_FILTER_NONE,
    0, NULL, NULL, &texture);           // color key, src info, palette, out texture
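For what it's worth, one way to confirm what you actually got is to query the top mip level's surface description after the call. A minimal sketch, assuming texture is the LPDIRECT3DTEXTURE9 filled in above:

// Check the in-memory format of level 0 (texture as filled in by the call above)
if (SUCCEEDED(hr))
{
    D3DSURFACE_DESC desc;
    texture->GetLevelDesc(0, &desc);
    // desc.Format == D3DFMT_A8R8G8B8 means the image was converted to 32-bit ARGB
}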


My guess would be that it would fail, since the format of the image is different from the format that you specified. I do not think DirectX will do the format conversion for you.
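For reference, D3DX can tell you what format the file itself is in, before any conversion comes into play. A minimal sketch, assuming filename is the same path used in the original post:

// Read the file header only; no texture is created
D3DXIMAGE_INFO info;
if (SUCCEEDED(D3DXGetImageInfoFromFile(filename, &info)))
{
    // info.Format is the file's own pixel format
    // (e.g. D3DFMT_R5G6B5 or D3DFMT_X1R5G5B5 for a 16-bit image)
}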

Anyone else feel free to correct me if I am wrong.

Chris

Alright, I just checked and found out that D3DXCreateTextureFromFileEx converts the image data into the format you specified. Either that, or D3DXSaveTextureToFileInMemory converts everything into A8R8G8B8.
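One way to tell those two cases apart is to round-trip the texture through memory and compare formats. A minimal sketch, assuming texture is the LPDIRECT3DTEXTURE9 loaded earlier in the thread:

// Save to an in-memory file, then inspect what the save step produced
LPD3DXBUFFER buf = NULL;
if (SUCCEEDED(D3DXSaveTextureToFileInMemory(&buf, D3DXIFF_BMP, texture, NULL)))
{
    D3DXIMAGE_INFO saved;
    D3DXGetImageInfoFromFileInMemory(buf->GetBufferPointer(),
                                     buf->GetBufferSize(), &saved);
    // Compare saved.Format with what GetLevelDesc reports on the texture itself:
    // if the texture is already A8R8G8B8, the conversion happened at load time
    buf->Release();
}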

Quote:
Original post by Coder
D3DX can automatically convert the file to the given format IF your card supports it. Why don't you link to d3dx9d.lib and check your debug output?
Check the Forum FAQ for more details on debugging.


For some reason I don't get any output from D3DXCreateTextureFromFileInMemoryEx.
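In case anyone hits this later: D3DX only emits diagnostics when the debug build of the library is linked, and even then only to a debugger (e.g. the Visual Studio Output window), not to a console. A minimal sketch of one way to switch libraries under MSVC, assuming the DirectX SDK debug libraries are installed:

#ifdef _DEBUG
#pragma comment(lib, "d3dx9d.lib")   // debug D3DX: prints diagnostics to the debugger
#else
#pragma comment(lib, "d3dx9.lib")    // release D3DX: silent
#endif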
