Texture compression / DDS files

Started by IvanK
10 comments, last by Hodgman 6 years, 5 months ago

Hi. I would like to show you my tool, https://www.Photopea.com. You can use it as a viewer of .DDS files (it works even on your phone). It supports the BC1, BC2, BC3 (DXT1, DXT3, DXT5) and BC7 (DX10) compression formats. I would like you to test it a little, if you have a minute.

Next, I have a philosophical question regarding texture distribution. I am new to this area.

As I understand it, we want textures to be small "on the wire" (on a DVD / HDD / delivered over the internet). Next, we want them to be small in GPU memory. I think it is clear that any non-GPU lossy compression (such as JPG or WebP) can achieve a much better quality/size ratio than any DXTx format (even zipped DXTx). So JPG or WebP is more suitable for use "on the wire".

I often see developers directly distributing textures in the DXTx format (DDS files) "on the wire". The usual excuse is that decoding a JPG and re-encoding it into DXTx (at the moment the texture is used) would be too time-consuming (while DXTx can be copied to the GPU without any modification).

I implemented a very naive DXT1 compressor in Photopea (File - Export - DDS) and it is surprisingly fast (a 1 MPx texture takes 80 ms to encode). So I feel like compressing textures to DXTx right before sending them to the GPU makes sense. So what is the purpose of the DDS format? Why do developers distribute textures as DDS "on the wire" when there are better compression methods?
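For reference, a naive BC1 encoder really is tiny. Below is a minimal sketch in C++ of the kind of per-block encoding described above (an illustration only, not Photopea's actual code): the endpoints are the per-channel min/max of the 4x4 block, and each texel picks the nearest of the four palette colors.

```cpp
// Naive BC1 (DXT1) block encoder sketch. Input: 16 RGBA8 texels (row-major
// 4x4 tile). Output: 8 bytes (two RGB565 endpoints + 16 two-bit indices).
#include <cstdint>
#include <algorithm>

static uint16_t toRGB565(const uint8_t* c) {
    return uint16_t(((c[0] >> 3) << 11) | ((c[1] >> 2) << 5) | (c[2] >> 3));
}

void encodeBC1Block(const uint8_t rgba[16 * 4], uint8_t out[8]) {
    // Per-channel min/max of the block serve as the two endpoints.
    uint8_t lo[3] = {255, 255, 255}, hi[3] = {0, 0, 0};
    for (int i = 0; i < 16; ++i)
        for (int c = 0; c < 3; ++c) {
            lo[c] = std::min(lo[c], rgba[i * 4 + c]);
            hi[c] = std::max(hi[c], rgba[i * 4 + c]);
        }

    // Since hi >= lo in every channel, the packed values satisfy e0 >= e1,
    // which selects the opaque 4-color mode.
    uint16_t e0 = toRGB565(hi), e1 = toRGB565(lo);

    uint32_t indices = 0;
    if (e0 != e1) {   // equal endpoints would select the 3-color + transparent
                      // mode; all-zero indices are the safe choice there
        // 4-entry palette: the two endpoints plus two interpolated colors.
        float pal[4][3];
        for (int c = 0; c < 3; ++c) {
            pal[0][c] = hi[c];
            pal[1][c] = lo[c];
            pal[2][c] = (2.0f * hi[c] + lo[c]) / 3.0f;
            pal[3][c] = (hi[c] + 2.0f * lo[c]) / 3.0f;
        }
        // Each texel gets the index of the nearest palette color.
        for (int i = 0; i < 16; ++i) {
            float best = 1e30f; int bestIdx = 0;
            for (int p = 0; p < 4; ++p) {
                float d = 0.0f;
                for (int c = 0; c < 3; ++c) {
                    float diff = float(rgba[i * 4 + c]) - pal[p][c];
                    d += diff * diff;
                }
                if (d < best) { best = d; bestIdx = p; }
            }
            indices |= uint32_t(bestIdx) << (i * 2);
        }
    }

    out[0] = uint8_t(e0);            out[1] = uint8_t(e0 >> 8);   // color0 (little-endian)
    out[2] = uint8_t(e1);            out[3] = uint8_t(e1 >> 8);   // color1
    out[4] = uint8_t(indices);       out[5] = uint8_t(indices >> 8);
    out[6] = uint8_t(indices >> 16); out[7] = uint8_t(indices >> 24);
}
```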


You can compress to BC3 quite fast, but in my experience compressing to BC7 is extremely slow, although I haven't tried the compute-shader texture compression functions from DirectXTex/DirectXTK, so maybe they are faster.

So it's possible those developers are using BC7?

Block compression modes - https://msdn.microsoft.com/en-us/library/windows/desktop/hh308955(v=vs.85).aspx

Here is a related paper about BC6H: https://knarkowicz.files.wordpress.com/2016/03/knarkowicz_realtime_bc6h_gdc_2016.pdf

I would assume that finding the best compression possible is NP-hard, but I'm not sure (what do you think?).

So it's a matter of quality to do it offline.

 

1 hour ago, CortexDragon said:

but in my experience compressing to BC7 is extremely slow

You are allowed to choose your compression mode for BC7, whereas BC5 and the lower formats have only one way of compressing a block.

For a clear overview:

http://www.reedbeta.com/blog/understanding-bcn-texture-compression-formats/

🧙

2 hours ago, IvanK said:

Why do developers distribute textures as DDS "on the wire" when there are better compression methods?

Simplicity I guess.

IIRC Rage's megatexture implementation used a highly compressed format that was transcoded to a GPU-readable compressed format.

-potential energy is easily made kinetic-

2 hours ago, IvanK said:

So what is the purpose of the DDS format?

You compress only a block of 4x4 pixels; it is a local compression. You can of course do a much better job by treating the full texture as one block and compressing it all at once, which can never result in worse compression for common textures (of course, a single local block can reduce the overall quality of a global compression algorithm, and then there are noise textures as well).

But the benefit of local compression is also local decompression, which is beneficial for your caches. And since memory bandwidth and latency really are bottlenecks, you get the picture ;)
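To make the random-access point concrete, here is a small sketch (not from the post above) of why block compression allows local decompression: each 4x4 block has a fixed size, so the offset of the block containing any texel is pure arithmetic, and the hardware only has to fetch and decode that one 8- or 16-byte block. A global codec like JPG cannot decode a single pixel without unpacking a large part of the stream.

```cpp
// Offset of the BC1 block that contains texel (x, y). With a fixed
// bytes-per-block, a texture fetch needs only this one block.
#include <cstddef>

size_t bc1BlockOffset(size_t texWidth, size_t x, size_t y) {
    const size_t bytesPerBlock = 8;               // 16 for BC3 / BC7
    size_t blocksPerRow = (texWidth + 3) / 4;     // width rounded up to whole blocks
    return (y / 4 * blocksPerRow + x / 4) * bytesPerBlock;
}
```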

🧙

Thanks. I understand the BC1 - BC7 compression formats; I implemented them all in Photopea (in my own code, everything happens on the CPU).

It is true that if you want "the best" BCx compression (with the smallest error), it can be very hard to compute (especially for BC7). But in practice, a nearly-best compression is enough (if it were not, you would not use compression at all), and it can be computed in real time just before sending the texture to the GPU.

So I think that artists should store textures as JPG, and games should load the JPGs, decode the raw data, compress it into BCx and send it to the GPU. A BCx compressor can be a tiny library of about 10 kB; there is no need for any huge binaries from nVidia or ATI. Storing textures as JPG is better because they are smaller than DDS at comparable quality. That is why I don't understand why DDS files are created and distributed.
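A rough sketch of the load-time pipeline being proposed here (decode JPG, compress to BC1 on the CPU, upload the blocks), assuming stb_image for the JPG decode, the encodeBC1Block() sketch from earlier in the thread, and a hypothetical uploadBC1Texture() helper standing in for whatever the graphics API's compressed-texture upload is (e.g. glCompressedTexImage2D in OpenGL):

```cpp
#include <vector>
#include <cstdint>
#include <algorithm>
#include "stb_image.h"

// Provided elsewhere: the naive per-block encoder sketched above.
void encodeBC1Block(const uint8_t rgba[16 * 4], uint8_t out[8]);
// Hypothetical upload helper; replace with your graphics API's call.
void uploadBC1Texture(int width, int height, const std::vector<uint8_t>& blocks);

bool loadJpgAsBC1(const char* path) {
    int w = 0, h = 0, comp = 0;
    unsigned char* pixels = stbi_load(path, &w, &h, &comp, 4);  // force RGBA8
    if (!pixels) return false;

    int bw = (w + 3) / 4, bh = (h + 3) / 4;                     // blocks per row/column
    std::vector<uint8_t> blocks(size_t(bw) * bh * 8);

    for (int by = 0; by < bh; ++by)
        for (int bx = 0; bx < bw; ++bx) {
            uint8_t tile[16 * 4];
            for (int ty = 0; ty < 4; ++ty)                      // gather a 4x4 tile,
                for (int tx = 0; tx < 4; ++tx) {                // clamping at the edges
                    int sx = std::min(bx * 4 + tx, w - 1);
                    int sy = std::min(by * 4 + ty, h - 1);
                    const unsigned char* src = pixels + (size_t(sy) * w + sx) * 4;
                    for (int c = 0; c < 4; ++c)
                        tile[(ty * 4 + tx) * 4 + c] = src[c];
                }
            encodeBC1Block(tile, &blocks[(size_t(by) * bw + bx) * 8]);
        }

    uploadBC1Texture(w, h, blocks);
    stbi_image_free(pixels);
    return true;
}
```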

Do the higher BC modes like BC7 render faster on the GPU than the lower equivalent modes like BC3?

The documentation seems to imply it, but I haven't timed it.

24 minutes ago, IvanK said:

So I think that artists should store textures as JPG

Have you tested how much the double compression reduces quality in the final texture? I assume it might cause issues on normal and roughness maps, because JPG cares about features visible to the human eye, but not about other data.

Also, streaming directly from the HDD without CPU cost is nice to have, but the conversion could be done once during installation.
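If someone wanted to quantify that double-compression loss, a simple starting point is PSNR between the original texels and the JPG to BCx to decoded-RGBA round trip. The helper below is only an illustration (it is not from the thread):

```cpp
// PSNR between two RGBA8 images of the same size; higher is better,
// identical images return infinity.
#include <cmath>
#include <cstdint>
#include <cstddef>
#include <limits>

double psnrRGBA8(const uint8_t* a, const uint8_t* b, size_t texelCount) {
    double mse = 0.0;
    for (size_t i = 0; i < texelCount * 4; ++i) {
        double d = double(a[i]) - double(b[i]);
        mse += d * d;
    }
    mse /= double(texelCount * 4);
    return mse > 0.0 ? 10.0 * std::log10(255.0 * 255.0 / mse)
                     : std::numeric_limits<double>::infinity();
}
```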

9 minutes ago, JoeJ said:

I assume it might cause issues on normal and roughness maps, because JPG cares about features visible to the human eye, but not about other data.

I remember I thought about this a few years ago. I think the solution would be to modify compression algorithms like JPG in two ways:

1. Adapt them to the use case of normals, roughness, etc.

2. To hide blocky artefacts, interpolate the compression encoding with that of neighbouring blocks (similar to how bilinear filtering works). This might reduce quality a bit, but the blocky artefacts would disappear, at the cost of roughly 4 times slower decompression.

Not sure if it's worth it.
