_under7_

Question about encoding data to DXT compressed DDS



Hi everyone! I'm working on a little tool that is supposed to convert DDS files to different formats, e.g. 32-bit ARGB to DXT1 or vice versa. DDS files are pretty simple in terms of format, until you get to encoding the DXT compression; that's where I'm stuck. Any ideas on how to do that? It seems that DXT compression is some kind of palette-building algorithm, where in the end you get 2 colors per 16-texel block. I've got no idea how to pick those colors. Can anybody help?...

P.S. OK, let's take the simplest case: DXT1, a plain opaque texture. I suspect the outline of the encoding algorithm goes like this:
1) Build a palette for each 4x4 texel block (colors must be in 5-6-5 format).
2) Using some kind of analysis, fill each texel of the block with one of 4 values: 00, 01, 10 or 11, where 00 is the first color for the block and 11 is the second one; 01 and 10 are derived from 00 and 11 using interpolation... ahem.
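To make the block layout concrete, here is a minimal decode sketch in C++. It is based on the published DXT1/BC1 layout rather than on any code from this thread: two 5-6-5 endpoint colors, a 2-bit selector per texel, and two intermediate palette entries derived by interpolation. The transparent three-color mode (used when color0 <= color1) is left out.

```cpp
#include <cstdint>

// One DXT1 block: 8 bytes for 16 texels.
struct DXT1Block {
    uint16_t color0;   // first endpoint, RGB 5-6-5
    uint16_t color1;   // second endpoint, RGB 5-6-5
    uint32_t indices;  // 2 bits per texel, 16 texels, row-major, low bits first
};

// Expand a 5-6-5 color to 8-8-8.
static void Unpack565(uint16_t c, uint8_t rgb[3]) {
    rgb[0] = static_cast<uint8_t>(((c >> 11) & 0x1F) * 255 / 31);
    rgb[1] = static_cast<uint8_t>(((c >>  5) & 0x3F) * 255 / 63);
    rgb[2] = static_cast<uint8_t>(( c        & 0x1F) * 255 / 31);
}

// Decode one block into 16 RGB texels (opaque four-color mode, color0 > color1).
void DecodeDXT1Block(const DXT1Block& b, uint8_t out[16][3]) {
    uint8_t pal[4][3];
    Unpack565(b.color0, pal[0]);   // index 00 -> endpoint 0
    Unpack565(b.color1, pal[1]);   // index 01 -> endpoint 1
    for (int i = 0; i < 3; ++i) {
        pal[2][i] = static_cast<uint8_t>((2 * pal[0][i] + pal[1][i]) / 3); // index 10
        pal[3][i] = static_cast<uint8_t>((pal[0][i] + 2 * pal[1][i]) / 3); // index 11
    }
    for (int t = 0; t < 16; ++t) {
        int idx = (b.indices >> (2 * t)) & 0x3;  // 2-bit selector for texel t
        out[t][0] = pal[idx][0];
        out[t][1] = pal[idx][1];
        out[t][2] = pal[idx][2];
    }
}
```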

One quick idea I could think of is to modify the texture tool source that comes with the SDK.

Open the file in whatever format and save it as DXTn.

This is an error-minimization problem. You need to come up with some metric for error given the 16 original colors and your two endpoint colors. Sum of squared differences is probably not a bad choice. Then you need to figure out a way of picking the two endpoint colors so as to minimize that error. This is not as easy as it looks, because the error is not continuous as you move the two endpoints. It jumps as texels move from being closer to one discrete interpolated value to the next.

It's a fun problem to solve... enjoy!

xyzzy
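As a rough, untested sketch of that error metric (assuming the 4x4 block has already been converted to 8-bit RGB, and reusing the Unpack565 helper from the sketch above): given two candidate 5-6-5 endpoints, map each of the 16 texels to the nearest of the four derived palette colors and sum the squared differences. An encoder can then try candidate endpoint pairs, for example every pair of colors occurring in the block, or points along the block's principal color axis, and keep whichever pair yields the lowest error.

```cpp
#include <cstdint>
#include <climits>

// Squared-difference error of approximating 16 source texels (RGB 8-8-8)
// with the 4-entry palette derived from two candidate 5-6-5 endpoints.
// Uses Unpack565 from the earlier sketch.
long BlockError(const uint8_t src[16][3], uint16_t c0, uint16_t c1) {
    uint8_t pal[4][3];
    Unpack565(c0, pal[0]);
    Unpack565(c1, pal[1]);
    for (int i = 0; i < 3; ++i) {
        pal[2][i] = static_cast<uint8_t>((2 * pal[0][i] + pal[1][i]) / 3);
        pal[3][i] = static_cast<uint8_t>((pal[0][i] + 2 * pal[1][i]) / 3);
    }

    long total = 0;
    for (int t = 0; t < 16; ++t) {
        long best = LONG_MAX;
        for (int p = 0; p < 4; ++p) {        // nearest of the four palette entries
            long err = 0;
            for (int i = 0; i < 3; ++i) {
                long d = static_cast<long>(src[t][i]) - pal[p][i];
                err += d * d;
            }
            if (err < best) best = err;
        }
        total += best;                        // sum of per-texel minima
    }
    return total;
}
```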

Quote:
Original post by DBX
The DXT formats are fully described in the DX SDK.


You call four tiny pages on the DDS format and four more on compressed textures "fully described"? Have you ever read them? I really doubt it.

Quote:
Original post by xyzzy00
This is an error-minimization problem. You need to come up with some metric for error given the 16 original colors and your two endpoint colors. Sum of squared differences is probably not a bad choice. Then you need to figure out a way of picking the two endpoint colors so as to minimize that error. This is not as easy as it looks, because the error is not continuous as you move the two endpoints. It jumps as texels move from being closer to one discrete interpolated value to the next.

It's a fun problem to solve... enjoy!

xyzzy


Having fun already! :)
Thanks for getting me started; I really had no idea where to begin. Anyway, the problem I described is just part of a bigger thing. I've got a texture manager here that is supposed to load JPEG, PNG, or BMP textures, convert them to raw data, resize them depending on the currently chosen screen resolution, and then convert the result to a compressed DDS. Seems crazy, huh? The reason it's done this way is that the textures we use are in huge resolutions, 2048x1536 for example. I need to minimize disk usage (that's why the textures are originally JPEGs) as well as system and video card memory (DXT compression comes in handy). So, basically, if the user chooses 640x480, I'd resize and recompress all the textures, sparing memory and disk space (why keep several different versions of the textures on the HDD?). All of this must be done at runtime during the texture preload routine, which makes the problem even more fun.

Anyway, thanks a lot for your help! :) Gotta get back to work.

PS
In case any interesting thoughts about all of the above come to your head, please tell me! %)

[Edited by - _under7_ on March 30, 2005 1:09:13 AM]

Quote:
Original post by _under7_
Quote:
Original post by DBX
The DXT formats are fully described in the DX SDK.


You call four tiny pages on the DDS format and four more on compressed textures "fully described"? Have you ever read them? I really doubt it.


Yes, and yes. I found it extremely helpful, although it was a few years ago (DX8) when I was working with DXT tools. They may have changed the documentation since (I haven't looked at it recently), but I'm pretty certain the DX8 docs had everything necessary.

Quote:
Original post by _under7_
Having fun already! :)
Thanks for getting me started; I really had no idea where to begin. Anyway, the problem I described is just part of a bigger thing. I've got a texture manager here that is supposed to load JPEG, PNG, or BMP textures, convert them to raw data, resize them depending on the currently chosen screen resolution, and then convert the result to a compressed DDS. Seems crazy, huh? The reason it's done this way is that the textures we use are in huge resolutions, 2048x1536 for example. I need to minimize disk usage (that's why the textures are originally JPEGs) as well as system and video card memory (DXT compression comes in handy). So, basically, if the user chooses 640x480, I'd resize and recompress all the textures, sparing memory and disk space (why keep several different versions of the textures on the HDD?). All of this must be done at runtime during the texture preload routine, which makes the problem even more fun.


You do realize that D3DXCreateTextureFromFileEx can do ALL of this for you already, right? It can scale and DXT-compress your textures as you load them. Just specify the dimensions and format you want.

xyzzy
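For reference, a call along those lines might look like the sketch below. It targets the D3D9-era D3DX API; the file path, target size, and the choice of DXT1 and the managed pool are placeholders to adapt, and the helper function name is made up for illustration.

```cpp
#include <d3dx9.h>

// Load a JPEG/PNG/BMP, rescale it to the requested size, and compress it to
// DXT1, all in one D3DX call. pDevice is an already-created IDirect3DDevice9.
IDirect3DTexture9* LoadScaledDXT1(IDirect3DDevice9* pDevice,
                                  const char* path, UINT width, UINT height)
{
    IDirect3DTexture9* tex = NULL;
    HRESULT hr = D3DXCreateTextureFromFileExA(
        pDevice,
        path,
        width, height,          // target dimensions (rescaled on load)
        D3DX_DEFAULT,           // full mip chain
        0,                      // usage
        D3DFMT_DXT1,            // compress to DXT1 on load
        D3DPOOL_MANAGED,        // pool
        D3DX_FILTER_TRIANGLE,   // filter used for the rescale
        D3DX_DEFAULT,           // mip filter
        0,                      // no color key
        NULL,                   // no source image info needed
        NULL,                   // no palette
        &tex);
    return SUCCEEDED(hr) ? tex : NULL;
}
```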

Quote:
You do realize that D3DXCreateTextureFromFileEx can do ALL of this for you already, right? It can scale and DXT-compress your textures as you load them. Just specify the dimensions and format you want.
xyzzy


Here's the funny part: AFAIK, D3D scales textures up to a power of 2 (in case your hardware doesn't support non-power-of-2 textures), and I don't need that...
