optimal texture format


since d3d video adapters can support any combination of render target and texture formats, what is the best way to handle this? i.e., if using an X8R8G8B8 render target format, what texture format should be used? or if that format isn't supported, what next? or how about when using an R5G6B5 format? any of the texture formats could be associated with the render target format. not only that, but some textures have alpha channels while others don't.

the alpha channels are a different matter... but as a quick note, you're best off trying to match the texture format with the display mode...

firstly, it should (although this is entirely guesswork) be marginally faster, as the hardware/driver won't need to convert the data between 16-bit and 32-bit formats (e.g. a 16-bit pixel being written to a 32-bit buffer).

secondly, it'll give you better visual results. Going from 16->32 means that you have to "make up" data (the 565 format can only express 32 reds, 64 greens and 32 blues, while the X888 format can express 256 shades of each). Going from 32->16 will lose accuracy (converting from 256 shades down to either 32 or 64)...

bottom line... X8R8G8B8 goes with X8R8G8B8 and/or A8R8G8B8, and R5G6B5 goes with R5G6B5 or A1R5G5B5...
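a minimal sketch of that pairing rule, if it helps (ChooseTextureFormat is a made-up helper, not a D3D call):

#include <d3d9.h>

// Pair a texture format with the backbuffer format of the same bit depth,
// preferring an alpha-capable variant when the texture needs one.
D3DFORMAT ChooseTextureFormat(D3DFORMAT backBufferFormat, bool wantAlpha)
{
    switch (backBufferFormat)
    {
    case D3DFMT_X8R8G8B8:
    case D3DFMT_A8R8G8B8:
        return wantAlpha ? D3DFMT_A8R8G8B8 : D3DFMT_X8R8G8B8;  // stay 32-bit
    case D3DFMT_R5G6B5:
    case D3DFMT_X1R5G5B5:
        return wantAlpha ? D3DFMT_A1R5G5B5 : D3DFMT_R5G6B5;    // stay 16-bit
    default:
        return backBufferFormat;  // anything else: just use the mode's own format
    }
}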

hth
Jack

i figured that the depths should be the same between render targets and textures, but for each depth there are multiple formats. i.e. for 32 bits there are A8R8G8B8, X8R8G8B8 and A2B10G10R10, and for 16-bit formats there are quite a few: R5G6B5, X1R5G5B5, A1R5G5B5, A4R4G4B4, A8R3G3B2, etc.

for each render target format, any number of those texture formats may be supported, and however much it'd be nice to guarantee that a render target's format can also be used for textures, the dx sdk docs do not say anything about it.

even though the sdk doesn't say it's guaranteed, should it just be assumed that a render target format must also be supported as a texture format? even if that were assumed, it doesn't account for what to do with textures that need an alpha channel.

You shouldn't need to assume anything.

See

IDirect3D9::GetDeviceCaps,
IDirect3D9::CheckDeviceFormat and
IDirect3D9::CheckDeviceFormatConversion

for more info on what any given device/driver combination can do.
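Something along these lines, for example (just a sketch; pD3D and adapterFormat are assumed to exist already, and the candidate list is only an example preference order, not anything the SDK mandates):

#include <d3d9.h>

// Walk a preferred-format list and return the first texture format the
// device/driver actually supports alongside the current display format.
D3DFORMAT FindTextureFormat(IDirect3D9* pD3D, D3DFORMAT adapterFormat)
{
    const D3DFORMAT candidates[] =
    {
        D3DFMT_A8R8G8B8, D3DFMT_X8R8G8B8,   // 32-bit first
        D3DFMT_R5G6B5,   D3DFMT_A1R5G5B5,   // then 16-bit
        D3DFMT_A4R4G4B4
    };

    for (size_t i = 0; i < sizeof(candidates) / sizeof(candidates[0]); ++i)
    {
        // Usage 0 + D3DRTYPE_TEXTURE asks: "can a plain texture be created in
        // this format while the display mode is adapterFormat?"
        if (SUCCEEDED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                              adapterFormat, 0,
                                              D3DRTYPE_TEXTURE, candidates[i])))
            return candidates[i];
    }
    return D3DFMT_UNKNOWN;  // nothing in the list was usable
}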

-Nik

EDIT:

D3DXCheckTextureRequirements tests if a given format is available, and corrects the format automatically if not.
This is for convenience only.
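Rough usage sketch (GetUsableFormat is a made-up wrapper; pDevice is assumed to be an already-created device):

#include <d3dx9.h>

D3DFORMAT GetUsableFormat(IDirect3DDevice9* pDevice)
{
    D3DFORMAT fmt = D3DFMT_A8R8G8B8;                // the format we'd like
    UINT width = 256, height = 256, mipLevels = 0;  // 0 = full mip chain

    // D3DX adjusts any of these values in place if the device can't handle them.
    if (FAILED(D3DXCheckTextureRequirements(pDevice, &width, &height, &mipLevels,
                                            0, &fmt, D3DPOOL_MANAGED)))
        return D3DFMT_UNKNOWN;

    return fmt;  // either the requested format or the closest supported one
}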

[edited by - Nik02 on October 15, 2003 7:53:42 AM]

quote:
Original post by jollyjeffers
bottom line... X8R8G8B8 goes with X8R8G8B8 and/or A8R8G8B8, and R5G6B5 goes with R5G6B5 or A1R5G5B5...



Well, it's not quite that simple. It's not a bad general rule, but it is oversimplified. Most (probably all) cards in the last few generations will do any mixing/blending/etc. internally at 32-bit resolution or better. In this case, even with a 16-bit backbuffer, using 32-bit textures will look better than 16-bit textures.

Also, if you need more than 4 bits of alpha, in most cases you need to go up to A8R8G8B8, simply because other formats with more than 4 bits of alpha are rarely supported.

However, many textures are quite limited, and storing them in 32 bits is a waste of space and bandwidth.

There is no optimal format or combination. You need to look at what you are doing and decide based on that.
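To make that concrete, here's one possible way to decide per texture (PickFormat and its flags are purely illustrative, not anything from the SDK):

#include <d3d9.h>

// Pick the smallest format that still meets the texture's needs, then verify it
// with CheckDeviceFormat and fall back to 32-bit if the compact choice isn't there.
D3DFORMAT PickFormat(IDirect3D9* pD3D, D3DFORMAT adapterFormat,
                     bool needsAlpha, bool needsOver4BitsAlpha, bool lowDetail)
{
    D3DFORMAT want;
    if (needsOver4BitsAlpha)          want = D3DFMT_A8R8G8B8;  // only widely supported option
    else if (needsAlpha && lowDetail) want = D3DFMT_A4R4G4B4;  // compact, 4-bit alpha
    else if (lowDetail)               want = D3DFMT_R5G6B5;    // compact, opaque
    else                              want = D3DFMT_X8R8G8B8;  // full colour precision

    if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                       adapterFormat, 0, D3DRTYPE_TEXTURE, want)))
        want = D3DFMT_A8R8G8B8;  // nearly universal fallback

    return want;
}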



Stay Casual,

Ken
Drunken Hyena
