
Error loading S3TC textures (DXT3 and DXT5)


#1 Plerion   Members   -  Reputation: 354


Posted 08 June 2014 - 06:56 AM

Hello all


In my web application I am using textures that are mostly compressed with S3TC. The desktop application works great with the same textures, but I have some issues in WebGL: DXT1 textures (RGB without alpha) are decompressed correctly, but DXT3 and DXT5 look wrong. First let me show you some pictures.


1. This is how it should look (desktop application):



2. This is how it looks (web app):



I realized that the white parts are due to very low alpha. If I enable blending, all the white areas turn black/darker, but the image is still obviously wrong.


I am loading the textures like this:

this.id = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, this.id);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

for (var i = this.header.mipCount - 1; i >= 0; --i) {
    var curw = this.header.width >> i;
    var curh = this.header.height >> i;
    curw = Math.max(1, curw);
    curh = Math.max(1, curh);

    var size = Math.floor((curw + 3) / 4) * Math.floor((curh + 3) / 4) * blockSize;
    if (compressed == false) {
        size = curw * curh * blockSize;
    }

    var data = this.file.readBytes(size);
    if (compressed == false) {
        gl.texImage2D(gl.TEXTURE_2D, i, gl.RGBA, curw, curh, 0, gl.RGBA, gl.UNSIGNED_BYTE, data);
    } else {
        gl.compressedTexImage2D(gl.TEXTURE_2D, i, glFormat, curw, curh, 0, data);
    }
}

compressed is true, blockSize is 16 (same as in the C++ application).
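For reference, the block size follows from the format: DXT1 packs a 4x4 pixel block into 8 bytes, while DXT3/DXT5 add a 64-bit alpha block, giving 16 bytes per block. A minimal sketch of the size math (the helper names blockSizeFor and mipSize are mine, not from the original code; the enum names come from the WEBGL_compressed_texture_s3tc extension object):

```javascript
// Bytes per 4x4 block for each S3TC format.
// ext is the object returned by gl.getExtension(...).
function blockSizeFor(glFormat, ext) {
    switch (glFormat) {
        case ext.COMPRESSED_RGB_S3TC_DXT1_EXT:
        case ext.COMPRESSED_RGBA_S3TC_DXT1_EXT:
            return 8;  // color block only
        case ext.COMPRESSED_RGBA_S3TC_DXT3_EXT:
        case ext.COMPRESSED_RGBA_S3TC_DXT5_EXT:
            return 16; // color block + 64-bit alpha block
        default:
            throw new Error("Not an S3TC format");
    }
}

// Byte size of one mip level, rounded up to whole 4x4 blocks.
function mipSize(width, height, blockSize) {
    return Math.floor((width + 3) / 4) * Math.floor((height + 3) / 4) * blockSize;
}
```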

glFormat = Texture.RGBA_S3TC_DXT5_EXT;

// where the extension is obtained with:
var ext = gl.getExtension(fullName);
if (ext == null) {
    throw new TypeError("S3TC not supported!");
}
GxContext.S3TC = ext;

// and:
fullName = "WEBKIT_WEBGL_compressed_texture_s3tc";
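As a side note, only the WebKit-prefixed name is queried here; browsers also expose the unprefixed WEBGL_compressed_texture_s3tc, and Firefox used a MOZ_ prefix at the time. A small sketch that tries the known names in order (the getS3TC helper name is hypothetical):

```javascript
// Try the unprefixed extension name first, then vendor prefixes.
function getS3TC(gl) {
    var names = [
        "WEBGL_compressed_texture_s3tc",
        "WEBKIT_WEBGL_compressed_texture_s3tc",
        "MOZ_WEBGL_compressed_texture_s3tc"
    ];
    for (var i = 0; i < names.length; ++i) {
        var ext = gl.getExtension(names[i]);
        if (ext != null) {
            return ext;
        }
    }
    throw new TypeError("S3TC not supported!");
}
```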

I am a bit confused. The texture-loading code is mostly a 1:1 port of my C++ code, which works well, but in WebGL there seems to be a lot more alpha. Is there something I'm missing?





#2 BGB   Crossbones+   -  Reputation: 1545


Posted 08 June 2014 - 10:08 PM

When I was messing with WebGL (to a very limited extent), I had a similar issue.


I addressed it mostly by disabling alpha on the canvas. Apparently the browser interprets the canvas alpha channel as alpha-blending with the page background, so any "transparent" parts of the canvas let the page background show through.


dunno if this helps.
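A minimal sketch of that suggestion, assuming the standard WebGL context-creation API (the createOpaqueContext helper name is made up): pass alpha: false in the context attributes so the canvas has no alpha channel to composite with the page.

```javascript
// Request a WebGL context without a canvas alpha channel, so the
// texture's alpha is not composited against the page background.
function createOpaqueContext(canvas) {
    return canvas.getContext("webgl", { alpha: false }) ||
           canvas.getContext("experimental-webgl", { alpha: false });
}
```

The premultipliedAlpha context attribute also affects how the canvas is composited with the page, and may be worth checking if disabling alpha is not an option.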