Texture Question

When I load a texture using the DirectX API, how is it stored on my graphics card? Is it kept in the same format it has on disk, or is it converted to my screen's surface format? Or is it only converted to the screen format when I call SetTexture()? In other words, does the amount of memory each texture takes up depend on my screen format or on the texture's original format?

IIRC, it's the format of the original texture (the file on disk). However, if you specify a D3DFORMAT other than D3DFMT_UNKNOWN for the requested pixel format (look at the D3DXCreateTextureFromFileEx() function), the texture will use whatever format you requested. Specifying D3DFMT_UNKNOWN pulls the format from the file.


Dustin Franklin
Microsoft DirectX MVP

Actually, the driver has some latitude to convert the texture on upload (after you unlock the surface). For example, drivers have been known to always convert to 16-bit textures for higher benchmark scores, even if you create a 32-bit texture and lock/upload 32-bit data. This behavior is usually controllable through the driver's control panel settings.
