IvanK

Texture compression / DDS files


Hi. I would like to show you my tool, https://www.Photopea.com . You can use it as a viewer for .DDS files (it even works on your phone). It supports the BC1, BC2, BC3 (DXT1, DXT3, DXT5) and BC7 (DX10) compression formats. I would appreciate it if you could test it a little, if you have a minute.

Next, I have a philosophical question about texture distribution. I am new to this area.

As I understand it, we want textures to be small "on the wire" (on a DVD / HDD / delivered over the internet). We also want them to be small in GPU memory. I think it is clear that any non-GPU lossy compression (such as JPG or WebP) can achieve a much better quality/size ratio than any DXTx format (even zipped DXTx). So JPG or WebP is more suitable for use "on the wire".

I often see developers distributing textures directly in a DXTx format (DDS files) "on the wire". The usual justification is that decoding a JPG and re-encoding it to DXTx (at the moment the texture is used) would be too time-consuming, while DXTx data can be copied to the GPU without any modification.

I implemented a very naive DXT1 compressor in Photopea (File - Export - DDS) and it is surprisingly fast (a 1 MPx texture takes 80 ms to encode). So I feel that compressing textures to DXTx right before sending them to the GPU makes sense. What, then, is the purpose of the DDS format? Why do developers distribute textures as DDS "on the wire" when there are better compression methods?
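For reference, a naive BC1 (DXT1) block encoder in the spirit described above fits in a few lines. This is my own sketch (Python, with my own function names, not Photopea's actual code): endpoints are just the per-channel min/max of the block, and each pixel picks the nearest of the 4 palette entries.

```python
def to_rgb565(r, g, b):
    """Quantize an 8-bit RGB color to the 16-bit RGB565 used by BC1 endpoints."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def encode_bc1_block(pixels):
    """pixels: list of 16 (r, g, b) tuples for one 4x4 block -> 8 bytes of BC1."""
    # Naive endpoint choice: component-wise min and max of the block.
    lo = tuple(min(p[i] for p in pixels) for i in range(3))
    hi = tuple(max(p[i] for p in pixels) for i in range(3))
    # Placing the max first guarantees c0 >= c1, which selects BC1's 4-color mode.
    c0, c1 = to_rgb565(*hi), to_rgb565(*lo)
    if c0 == c1:
        idx_bits = 0                       # solid block: every index selects c0
    else:
        # 4-entry palette: the two endpoints plus two interpolated colors.
        palette = [hi, lo,
                   tuple((2 * hi[i] + lo[i]) // 3 for i in range(3)),
                   tuple((hi[i] + 2 * lo[i]) // 3 for i in range(3))]
        idx_bits = 0
        for n, p in enumerate(pixels):
            # Pick the palette entry with the smallest squared error.
            best = min(range(4),
                       key=lambda k: sum((p[i] - palette[k][i]) ** 2
                                         for i in range(3)))
            idx_bits |= best << (2 * n)
    # 8 bytes per block: two 16-bit endpoints + 32 index bits, little-endian.
    return (c0.to_bytes(2, 'little') + c1.to_bytes(2, 'little')
            + idx_bits.to_bytes(4, 'little'))
```

Real encoders search the endpoint space much harder (which is where the quality difference comes from), but even this version shows why encoding can be fast: the work per block is tiny and independent of the rest of the texture.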



You can compress to BC3 quite fast, but in my experience compressing to BC7 is extremely slow. I haven't tried the compute-shader texture compression functions from DirectXTex/DirectXTK, though, so maybe they are faster.

So it's possible those developers are using BC7?

Block compression modes - https://msdn.microsoft.com/en-us/library/windows/desktop/hh308955(v=vs.85).aspx

1 hour ago, CortexDragon said:

but in my experience compressing to BC7 is extremely slow

BC7 lets you choose among several block modes, and searching across them is what makes encoding slow. BC5 and the lower formats each have only a single block layout, so there is far less to search.

For a clear overview:

http://www.reedbeta.com/blog/understanding-bcn-texture-compression-formats/
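To make the "modes" point concrete: as I understand the format, a BC7 block signals which of its 8 modes it uses with a unary prefix in the low bits of the first byte (mode N is N zero bits followed by a 1). A toy check, with my own naming:

```python
def bc7_block_mode(first_byte):
    """Return the BC7 mode (0-7) of a block, or None for the reserved all-zero encoding."""
    for mode in range(8):
        # Mode N is signalled by the lowest set bit sitting at position N:
        # mode 0 -> 0b1, mode 1 -> 0b10, ... mode 7 -> 0b10000000.
        if first_byte & (1 << mode):
            return mode
    return None        # a first byte of 0 is a reserved/invalid block
```

An encoder that only ever emits, say, modes 4-6 can be much faster than one that searches all 8, at some quality cost; that choice is what the "you are allowed to choose" remark above is about.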

2 hours ago, IvanK said:

Why do developers distribute textures in the DDS "on the wire", when there are better compression methods?

Simplicity, I guess.

IIRC, Rage's MegaTexture implementation used a highly compressed on-disk format that was transcoded to a GPU-readable compressed format.

2 hours ago, IvanK said:

So what is the purpose of the DDS format?

You compress only a 4x4 block of pixels: a local compression. You could of course do a much better job by treating the full texture as one block and compressing it at once, which can never result in worse compression for common textures (although a single pathological block can reduce the overall quality of a global compression algorithm, and then there are noise textures as well).

But the benefit of local compression is local decompression, which is good for your caches. And since memory bandwidth and latency really are the bottlenecks, you get the picture ;)
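To illustrate the locality argument: fetching one texel from a BC1 texture only requires the 8-byte block that contains it, nothing else. A sketch (Python, my own names; BC1's two palette modes are selected by comparing the raw endpoint values):

```python
def bc1_texel(data, width, x, y):
    """Decode a single texel from BC1 data (width must be a multiple of 4)."""
    # Locate the 8-byte block containing (x, y) -- no other block is touched.
    blocks_per_row = width // 4
    off = ((y // 4) * blocks_per_row + (x // 4)) * 8
    c0 = int.from_bytes(data[off:off + 2], 'little')
    c1 = int.from_bytes(data[off + 2:off + 4], 'little')
    indices = int.from_bytes(data[off + 4:off + 8], 'little')

    def expand(c):                         # RGB565 -> 8-bit RGB
        r, g, b = (c >> 11) & 31, (c >> 5) & 63, c & 31
        return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

    e0, e1 = expand(c0), expand(c1)
    if c0 > c1:                            # 4-color mode
        palette = [e0, e1,
                   tuple((2 * a + b) // 3 for a, b in zip(e0, e1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(e0, e1))]
    else:                                  # 3-color + black/transparent mode
        palette = [e0, e1,
                   tuple((a + b) // 2 for a, b in zip(e0, e1)),
                   (0, 0, 0)]
    texel_in_block = (y % 4) * 4 + (x % 4)
    return palette[(indices >> (2 * texel_in_block)) & 3]
```

A global compressor (JPG, etc.) cannot offer this: decoding one texel would drag in a large dependent region, which is exactly what GPU texture caches cannot afford.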

Edited by matt77hias


Thanks. I understand the BC1 - BC7 compression formats; I implemented them all in Photopea (in my own code, everything happens on the CPU).

It is true that if you want "the best" BCx compression (with the smallest error), it can be very hard to compute (especially for BC7). But in practice a nearly-optimal compression is enough (if it were not, you would not use compression at all), and it can be computed in real time right before sending the texture to the GPU.

So I think artists should store textures as JPG, and games should load the JPGs, decode the raw data, compress it to BCx and send it to the GPU. A BCx compressor can be a tiny library of about 10 kB; there is no need for huge binaries from Nvidia or ATI. Storing textures as JPG is better, because they are always smaller than DDS files of the same quality. That is why I don't understand why DDS files are created and distributed.
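For concreteness, here are the back-of-the-envelope sizes behind this argument for a hypothetical 1024x1024 texture (the JPG figure is a typical ballpark, not a measurement):

```python
# Back-of-the-envelope sizes for a 1024x1024 texture.
pixels = 1024 * 1024
raw_rgba = pixels * 4      # RGBA8: 4 bytes per pixel -> 4 MiB uncompressed
bc1 = pixels // 2          # BC1 is a fixed 0.5 byte per pixel -> 512 KiB
ratio = raw_rgba // bc1    # guaranteed 8:1, and the GPU can sample it directly
# A JPG of the same image is often somewhere around 100-300 KiB depending on
# content and quality setting, which is why it wins "on the wire" -- but the
# GPU cannot sample a JPG directly, hence the transcoding step proposed above.
```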

24 minutes ago, IvanK said:

So I think that artists should store textures in JPG

Have you tried measuring how much the double compression reduces the quality of the final texture? I assume it could cause issues on normal and roughness maps, because JPG cares about features visible to the human eye, but not about other kinds of data.

Also, streaming directly from the HD without CPU cost is nice to have, but the conversion could be done once, during installation.

9 minutes ago, JoeJ said:

I assume it might have issues on normals and roughness because JPG cares for features visible to the human eye, but not for other stuff.

I remember thinking about this a few years ago. I think the solution would be to modify compression algorithms like JPG in two ways:

1. Adapt them to the use cases of normals, roughness, etc.

2. To hide blocky artefacts, interpolate the compression encoding with that of the neighbouring blocks (similar to how bilinear filtering works). This might reduce quality a bit, but the blocky artefacts would disappear, at the cost of roughly 4 times slower decompression.

Not sure if it's worth it.


IIRC Quake 3 did the store-as-JPEG, transcode-to-DXT thing.

If your DXT encoder is fast, then the quality is almost certainly terrible. The results might look OK on casual inspection, but your game artists will eventually run into horrible 4x4 block artefacts that look like Minecraft, and horrible green/purple hue shifts (especially in areas that should be grey), and they'll start compensating by increasing their texture resolution, disabling compression, or just nagging you to fix the engine :D

Look up Crunch and Binomial. Instead of naively using JPEG, they invent new on-disk formats that are designed to be small but also to transcode directly to DXT with minimal runtime cost and quality loss.

Another avenue: when creating your DXT/DDS files, you add another constraint to your encoder. As well as looking for block encodings that produce the best quality, you also search for ones that have lower entropy / will zip (etc.) better. You sacrifice a little quality by choosing sub-optimal block encodings, but end up with much smaller files over the wire.
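A toy illustration of that rate-distortion trade (my own sketch, not how Crunch or any real encoder works): among candidate index encodings whose error is within a tolerance of the best one, prefer a bit pattern the encoder has already emitted, so a generic LZ-style compressor finds more repeats in the stream.

```python
from collections import Counter

def pick_indices(candidates, seen, tolerance=1.05):
    """candidates: list of (index_bits, squared_error) for one block.
    seen: Counter of index_bits patterns emitted so far.
    Returns the chosen index_bits, trading a little error for repetition."""
    best_err = min(err for _, err in candidates)
    # Keep every candidate whose error is within `tolerance` of the best...
    close = [bits for bits, err in candidates if err <= best_err * tolerance]
    # ...and among those, reuse the most frequently emitted bit pattern,
    # so the output stream contains more exact repeats for zip to exploit.
    return max(close, key=lambda bits: seen[bits])
```

A real entropy-aware encoder would also bias the endpoint choices and model the actual back-end compressor, but the principle is the same: slightly sub-optimal blocks, noticeably smaller files.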

