Error loading S3TC textures (DXT3 and DXT5)


Hello all

In my web application I am using textures that are mostly compressed with S3TC. The desktop application works fine with the same textures, but in WebGL I have some issues: DXT1 textures (RGB, no alpha) get decompressed correctly, but DXT3 and DXT5 look wrong. First, let me show you some pictures.

1. This is how it should look (desktop application):

fb95aa15ef.png

2. This is how it looks (web app):

4f1a700204.png

I realized that the white parts are due to very low alpha values. If I enable blending, all the white areas become black/darker, but the image is still obviously wrong.

I am loading the textures like this:


this.id = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, this.id);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

// Upload every mip level, from the smallest level up to level 0.
for (var i = this.header.mipCount - 1; i >= 0; --i) {
    var curw = Math.max(1, this.header.width >> i);
    var curh = Math.max(1, this.header.height >> i);

    // S3TC stores 4x4 pixel blocks, so round the dimensions up to whole blocks.
    var size = Math.floor((curw + 3) / 4) * Math.floor((curh + 3) / 4) * blockSize;
    if (compressed == false) {
        size = curw * curh * blockSize;
    }

    var data = this.file.readBytes(size);
    if (compressed == false) {
        gl.texImage2D(gl.TEXTURE_2D, i, gl.RGBA, curw, curh, 0, gl.RGBA, gl.UNSIGNED_BYTE, data);
    } else {
        gl.compressedTexImage2D(gl.TEXTURE_2D, i, glFormat, curw, curh, 0, data);
    }
}

compressed is true and blockSize is 16 (the same values as in the C++ application).
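For reference, the block size depends on the format. Here is a minimal sketch of how it could be derived (the helper name is hypothetical; the s3tc parameter is assumed to be the extension object returned by getExtension):

// Hypothetical helper: S3TC encodes 4x4 pixel blocks.
// DXT1 blocks are 8 bytes, DXT3/DXT5 blocks are 16 bytes.
function blockSizeFor(glFormat, s3tc) {
    if (glFormat === s3tc.COMPRESSED_RGB_S3TC_DXT1_EXT ||
        glFormat === s3tc.COMPRESSED_RGBA_S3TC_DXT1_EXT) {
        return 8;
    }
    return 16; // COMPRESSED_RGBA_S3TC_DXT3_EXT / COMPRESSED_RGBA_S3TC_DXT5_EXT
}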


glFormat is set up like this:

fullName = "WEBKIT_WEBGL_compressed_texture_s3tc";

var ext = gl.getExtension(fullName);
if (ext == null) {
    throw new TypeError("S3TC not supported!");
}
GxContext.S3TC = ext;

Texture.RGBA_S3TC_DXT5_EXT = GxContext.S3TC.COMPRESSED_RGBA_S3TC_DXT5_EXT;

glFormat = Texture.RGBA_S3TC_DXT5_EXT;
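In case it matters, here is a minimal sketch of how the extension lookup could fall back from the unprefixed name to the vendor-prefixed ones (the helper name is made up):

// Try the unprefixed extension name first, then the vendor-prefixed
// variants some browsers expose (hypothetical helper).
function getS3TCExtension(gl) {
    return gl.getExtension("WEBGL_compressed_texture_s3tc") ||
           gl.getExtension("WEBKIT_WEBGL_compressed_texture_s3tc") ||
           gl.getExtension("MOZ_WEBGL_compressed_texture_s3tc");
}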

I am a bit confused. The texture-loading code is mostly a 1:1 port of my C++ code, which works fine, but in WebGL there seems to be a lot more alpha. Is there something I'm missing?

Greetings

Plerion


When I was messing with WebGL (to a very limited extent), I had a similar issue.

I had addressed it mostly by disabling alpha on the canvas; apparently the browser interprets the canvas alpha as alpha for blending with the page background, so for any "transparent" parts of the canvas, the page background shows through.
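Something along these lines (a minimal sketch; the canvas id is made up):

// Request an opaque drawing buffer so low-alpha fragments don't get
// composited with the page background ("view" is just an example id).
var canvas = document.getElementById("view");
var gl = canvas.getContext("webgl", { alpha: false }) ||
         canvas.getContext("experimental-webgl", { alpha: false });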

dunno if this helps.
