C# - Detect If Pre-Existing .DDS Contains Alpha Channel?


New finding, new post.

After doing some reading, I believe it's possible to call a function from a C++ .dll file, from within C#, without rewriting the C++ code.

I compiled the DDSTextureLoader.cpp file into a .dll, and it built fine. I think this may be a foxhole worth gettin' stuck in, as DDSTextureLoader has a function that checks the alpha channel of a .dds texture.

Thing is, err, I've no clue how to proceed with this. This is what I've gathered so far:


using System.Runtime.InteropServices;
using System.Windows.Forms;

namespace WindowsFormsApplication1
{
    public partial class formProgramWindow : Form
    {
        [DllImport("DDSTextureLoader.dll")]
        public static extern void LoadTextureDataFromFile(string dllDDSFilename);

        // ...
    }
}

That "LoadTextureDataFromFile(string dllDDSFilename);" line should call the following function from the .dll:

    HRESULT LoadTextureDataFromFile(
        _In_z_ const wchar_t* fileName,
        std::unique_ptr<uint8_t[]>& ddsData,
        DDS_HEADER** header,
        uint8_t** bitData,
        size_t* bitSize)

Now, I think that I only need to provide it a string, as there is only one parameter (a collection of chars, I think, called fileName) with the _In_z_ prefix, which I assume is the argument the function takes in, with the other parameters being what the function punts out. Hovering my mouse over "HRESULT" causes Visual Studio to say "typedef long HRESULT". Maybe that means the function returns a large integer, with subsections that we can reference.

There is also the following function in the .dll:


DDS_ALPHA_MODE GetAlphaMode( _In_ const DDS_HEADER* header )

with DDS_ALPHA_MODE being "Typedef enum DirectX::DDS_ALPHA_MODE"

What I think I need to do (no idea how), is do something like:


intDDSTextureData = LoadTextureDataFromFile(string dllDDSFilename);
GetAlphaMode( _In_ const intDDSTextureData.DDS_Header)

But, err, GetAlphaMode returns an enum as well, whatever that is. Really, at this point, I'm just regurgitating what my mouse tells me when I hover over things.

Yup, no clue what I'm doing. Send help.

Sorry my initial suggestions didn't help earlier on. I can provide more assistance with what you're trying now.

The signature for LoadTextureDataFromFile looks like it will store its results in the four other arguments. In order to call it using DllImport from C#, your DllImport declaration would need a compatible function signature that .Net can map onto the actual function in C++. However, DllImport only works with plain C functions as far as I know. Besides, that is one *hell* of a complicated function signature to try to get DllImport to accept; I'm not sure what you'd do about the std::unique_ptr from C#'s point of view even if DllImport did support C++.
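Just to make the "compatible signature" part concrete: DllImport really only copes when the native side exposes a flat C function. If someone wrote, say, a hypothetical wrapper export that does the loading and alpha check internally, the C# side would only need something like this (the export below is made up for illustration; it's not something DDSTextureLoader provides on its own):

    // Hypothetical flat C export on the native side, e.g.
    //     extern "C" __declspec(dllexport) int GetDdsAlphaMode(const wchar_t* fileName);
    // The matching C# declaration would then be:
    [DllImport("DDSTextureLoader.dll", CharSet = CharSet.Unicode, CallingConvention = CallingConvention.Cdecl)]
    public static extern int GetDdsAlphaMode(string fileName);

But you'd still have to write, build, and export that wrapper in C++ yourself.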

If it were up to me, at this point I would build a "C++/CLI" DLL. C++/CLI is a way of writing code in C++ which can link to normal C and C++ libraries, which you can then call directly from other .Net languages like C# without using the DllImport approach. You would still need to write code that bridges between "unmanaged" code (code that calls the DirectX functions) and "managed" code ('ref classes' which can be passed directly to C#), but at least in C++/CLI you have access to both "C++/CLI" and "plain old C++" simultaneously.

It's potentially simple, but if you're uncomfortable using C++ directly it probably won't be very fun. Then again, you already have a C++ DLL building already, so it's not much of a leap to make a C++/CLI DLL. Another plus is that you probably won't need to write very MUCH C++ code yourself so I wager you could get it working.

In Visual Studio 2015, the project type you want is: Visual C++ -> CLR -> Class Library. The "CLR" part is akin to the "CLI" in C++/CLI.

After you make the C++/CLI DLL, you add it as a reference in the C# project just like you would with other .Net DLLs.
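At that point the C# side doesn't need DllImport at all. Purely as an illustration, assuming your wrapper ended up exposing a ref class with a static method (the names here are made up, not something DDSTextureLoader ships with), the call site is just ordinary C#:

    // Hypothetical usage of a C++/CLI wrapper referenced like any other .Net assembly.
    // DdsWrapper.DdsTexture and GetAlphaMode are illustrative names.
    private static bool TextureUsesPremultipliedAlpha(string ddsPath)
    {
        int alphaMode = DdsWrapper.DdsTexture.GetAlphaMode(ddsPath);
        // Value mirrors DirectX::DDS_ALPHA_MODE:
        // 0 unknown, 1 straight, 2 premultiplied, 3 opaque, 4 custom.
        return alphaMode == 2;
    }
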
The GetAlphaMode function signature implies that it is figuring things out using the DDS_HEADER alone. If we can figure out what it's looking at, we can go back to your original approach of reading the DDS_HEADER directly from the file in pure C#. Perhaps it's looking at several other fields, not just the dwFlags field.

The main roadblock is that I've no experience 'n' knowledge with C++ and C#; this project is something I started with literally 0 knowledge of C#, after all. I'm a wee bit more competent with Pascal, but even then, I'm not exactly a pro with that either. Doesn't help that documentation is rather lacking, as well.

Here's the code for the GetAlphaMode function:

    DDS_ALPHA_MODE GetAlphaMode( _In_ const DDS_HEADER* header )
    {
        if ( header->ddspf.flags & DDS_FOURCC )
        {
            if ( MAKEFOURCC( 'D', 'X', '1', '0' ) == header->ddspf.fourCC )
            {
                auto d3d10ext = reinterpret_cast<const DDS_HEADER_DXT10*>( (const char*)header + sizeof(DDS_HEADER) );
                auto mode = static_cast<DDS_ALPHA_MODE>( d3d10ext->miscFlags2 & DDS_MISC_FLAGS2_ALPHA_MODE_MASK );
                switch( mode )
                {
                case DDS_ALPHA_MODE_STRAIGHT:
                case DDS_ALPHA_MODE_PREMULTIPLIED:
                case DDS_ALPHA_MODE_OPAQUE:
                case DDS_ALPHA_MODE_CUSTOM:
                    return mode;
                }
            }
            else if ( ( MAKEFOURCC( 'D', 'X', 'T', '2' ) == header->ddspf.fourCC )
                      || ( MAKEFOURCC( 'D', 'X', 'T', '4' ) == header->ddspf.fourCC ) )
            {
                return DDS_ALPHA_MODE_PREMULTIPLIED;
            }
        }

        return DDS_ALPHA_MODE_UNKNOWN;
    }

To break it down, it looks at ddspf.flags and the fourCC string at file offset 0x54. If it's DX10, it looks in miscFlags2 for the bits covered by DDS_MISC_FLAGS2_ALPHA_MODE_MASK. Somehow, that gives one of five different integers: STRAIGHT is 1, PREMULTIPLIED is 2, OPAQUE is 3, CUSTOM is 4 and UNKNOWN is 0.

If the texture has DXT2 or DXT4, it just decides that it always has alpha channel data. Bloody thing doesn't seem as robust as I thought. Ergh.

Taking a look at a couple variables with "alpha" in their name, I've got to say that C++ makes absolutely no sense with how it handles hexadecimal.

#define DDS_ALPHA       0x00000002  // DDPF_ALPHA
 
...
 
enum DDS_MISC_FLAGS2
{
    DDS_MISC_FLAGS2_ALPHA_MODE_MASK = 0x7L,
};

I assume that the defined DDS_ALPHA value is just a dword with a value of 2. Since .DDS files are little-endian, it would look like "02 00 00 00" in the hex viewer... which isn't present in any .dds files with an alpha channel. At least, not in Fallout 4's DXT5 textures.

That could mean that the alpha mode mask is at the file offset 0x7L, but that's not even valid hex.

Urgh. Brain hurt. I also tried looking at the Intel Texture Works source code to figure out how the Photoshop .dds plugin detects if an image has an alpha channel, but it's not any better. It uses a variable called "planes" to check if there is an alpha channel or not. Don't ask me what that means, I've nae idea either.
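Here's my best guess at turning those header checks into C# anyway (untested; the offsets come from counting through the standard DDS header layout, and I'm treating that 0x7L as a plain bit mask on miscFlags2 rather than a file offset):

    // Rough sketch, untested. Needs: using System.IO; using System.Text;
    // Offsets: ddspf.flags at 0x50, ddspf.fourCC at 0x54, and miscFlags2 at 0x90
    // (4-byte "DDS " magic + 124-byte header + 16 bytes into DDS_HEADER_DXT10).
    static int GetDdsAlphaMode(string path)
    {
        using (var stream = File.OpenRead(path))
        using (var reader = new BinaryReader(stream))
        {
            stream.Seek(0x50, SeekOrigin.Begin);
            uint pixelFormatFlags = reader.ReadUInt32();                    // ddspf.flags
            string fourCC = Encoding.ASCII.GetString(reader.ReadBytes(4));  // ddspf.fourCC

            const uint DDS_FOURCC = 0x00000004;                             // DDPF_FOURCC
            if ((pixelFormatFlags & DDS_FOURCC) != 0)
            {
                if (fourCC == "DX10")
                {
                    stream.Seek(0x90, SeekOrigin.Begin);
                    uint miscFlags2 = reader.ReadUInt32();
                    uint mode = miscFlags2 & 0x7;   // DDS_MISC_FLAGS2_ALPHA_MODE_MASK
                    if (mode >= 1 && mode <= 4)
                        return (int)mode;           // 1 straight, 2 premultiplied, 3 opaque, 4 custom
                }
                else if (fourCC == "DXT2" || fourCC == "DXT4")
                {
                    return 2;                       // premultiplied
                }
            }

            return 0;                               // unknown
        }
    }
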

I'm not familiar with the internal workings of DXT5, but I found this table when I searched:
https://en.wikipedia.org/wiki/S3_Texture_Compression#S3TC_Format_Comparison

It seems like DXT5 is always capable of alpha, but doesn't use *premultiplied* alpha. The -pmalpha argument to texconv means "convert to premultiplied alpha". DXT2 and DXT4 use premultiplied alpha according to that Wikipedia page. But just because a format is capable of alpha doesn't mean it USES alpha; every pixel might be fully opaque. If you need to check whether alpha is actually used or not, you'll probably have to load the entire file and check every pixel somehow.
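For DXT5 specifically, you wouldn't have to fully decompress to find that out: each 16-byte block starts with two alpha endpoints followed by sixteen 3-bit palette indices, so you can walk the blocks and check whether any texel's alpha works out to less than 255. A rough sketch of that idea (untested; it assumes the data really is DXT5/BC3, and it scans the mipmaps along with the top level, which shouldn't matter for a yes/no answer):

    // Rough sketch, untested. Needs: using System.IO;
    static bool Dxt5HasTransparentTexels(string path)
    {
        byte[] data = File.ReadAllBytes(path);

        int offset = 128;                           // 4-byte "DDS " magic + 124-byte header
        if (data[0x54] == 'D' && data[0x55] == 'X' &&
            data[0x56] == '1' && data[0x57] == '0')
            offset += 20;                           // skip the DX10 extended header too

        for (int block = offset; block + 16 <= data.Length; block += 16)
        {
            byte a0 = data[block];
            byte a1 = data[block + 1];

            // Build the 8-entry alpha palette for this block.
            var palette = new byte[8];
            palette[0] = a0;
            palette[1] = a1;
            if (a0 > a1)
            {
                for (int i = 1; i <= 6; i++)
                    palette[i + 1] = (byte)(((7 - i) * a0 + i * a1) / 7);
            }
            else
            {
                for (int i = 1; i <= 4; i++)
                    palette[i + 1] = (byte)(((5 - i) * a0 + i * a1) / 5);
                palette[6] = 0;
                palette[7] = 255;
            }

            // 48 bits of 3-bit indices, one per texel, packed little-endian.
            ulong indices = 0;
            for (int i = 0; i < 6; i++)
                indices |= (ulong)data[block + 2 + i] << (8 * i);

            for (int texel = 0; texel < 16; texel++)
            {
                int idx = (int)((indices >> (3 * texel)) & 0x7);
                if (palette[idx] < 255)
                    return true;                    // found a texel that isn't fully opaque
            }
        }

        return false;
    }
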


I'm curious: What happens in your tool if you pass -pmalpha for a texture that doesn't have alpha? Does Fallout 4 fail to load the texture? Does it look strange in game? Does the tool report an error?

S3TC is the compression used for Direct3D 9 games and older. Far as I know, most Direct3D 10 & 11 games moved to the new BC format (Direct3D 11 added BC6 and BC7 compression). https://msdn.microsoft.com/en-us/library/windows/desktop/bb694531(v=vs.85).aspx

At least, BCn compression is what Fallout 4 uses in its textures. Whilst some BCn files have DXTn in their header, they don't use the older S3TC formats.

I'll go give that a try now. As for what happens when you run a texture with alpha through texconv without passing an alpha argument: it fills the transparent parts of the image with neon yellow.

[spoiler]
(Screenshot: transparent parts of the landscape texture rendered as neon yellow in-game.)
[/spoiler]

The landscape textures have an alpha channel in the original .dds files. However, running them through texconv.exe corrupts the alpha channel, which I believe is because texconv.exe wasn't passed an alpha argument.

Edit: Just tested. I took the textures for the machete weapon, and chucked them into texconv.exe with the -pmalpha flag. Spawned a copy of the weapon, and nothing seems funky.

What I think happens is that texconv.exe just generates an empty alpha channel (all black) when it's asked to calculate alpha for a texture that has no transparent pixels in the first place.

This topic is closed to new replies.
