# Spent all day tracking down this VC8/VC7.1 STL bug.


## Recommended Posts

So here it is. I tried it with both compilers. At first I thought it was a problem with my program, but alas it is not, and I've boiled the bug down to a couple of small source files. Basically, it has to do with locales, DLLs, and a situation Dinkumware never expected, I suppose. Here's the first source file. This is the EXE, and we'll call it "client.exe":
```cpp
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <fstream>

typedef unsigned char byte;

int main()
{
    std::basic_fstream<byte> foo;
    HMODULE h = LoadLibrary( "server.dll" );
    FreeLibrary( h );
}
```


The second source file is for the DLL. We'll call it "server.dll":
```cpp
#include <fstream>

typedef unsigned char byte;

// it doesn't matter if this is static or if
// it's instantiated on the stack or the heap,
// the result is the same.
std::basic_fstream<byte> bar;

// this will force the linker to produce a DLL
__declspec(dllexport) void TestMethod()
{
}
```


Now, when these are compiled against the "Multithreaded DLL" runtime (or, more specifically in my case, "Debug Multithreaded DLL"), the client crashes after main returns. It appears to be doing this because the runtime isn't reference counting the facet for the byte type correctly, and it tries to delete the facet object twice! When you change the type from byte to a type it expects, e.g. char, wchar_t, or unsigned short, it doesn't crash.

In my code, I had changed my stream type to byte, as shown in this example, because I'm reading UTF-8 data and I thought it made the most sense, considering that libxml also treats UTF-8 data as unsigned char via its xmlChar type. Since I'm reading data into libxml from files using fstream, I thought it would make sense to use basic_fstream<unsigned char> as the type.

So the lesson here is that with VC8/VC7.1 you simply can't reliably use basic_fstream<byte>, or presumably any stream with a non-character element type; I tried int and unsigned and the results were the same. Tracing through the code reveals that the runtime only creates the facet objects automatically for what it expects to be the basic character types: char, wchar_t, and unsigned short. I haven't tried this on gcc yet; it looks like an MS/Dinkumware-specific issue. /sigh

Quote:
Original post by krum: "I haven't tried this on gcc yet; it looks like an MS/Dinkumware-specific issue. /sigh"

I hope that you've posted this on one of Microsoft's newsgroups so that they're aware.