Need help using D3DXCreateTextureFromFileEx

1 comment, last by Krik 11 years, 9 months ago
I've been beating my head against the wall for the last few days, so I am hoping someone can help. I am relatively new to programming in C++ (I have been programming in PHP [and other languages] for several years), and in the last few months I started playing around with DirectX.

First, here is the original code I am trying to modify, which does work (I think most of it came from a tutorial):

LPDIRECT3DTEXTURE9 Dx_Manager::GetTexture(char* filename)
{
    // Look for a texture slot to use for this file.
    for (int i = 0; i < MAX_TEXTURES; ++i)
    {
        if (!d3d_texture[i].texture && d3d_texture[i].inuse)
        {
            continue;
        }
        d3d_texture[i].inuse = true;
        d3d_texture[i].filename = filename;

        D3DXCreateTextureFromFileA(d3d_device, d3d_texture[i].filename, &d3d_texture[i].texture);

        return d3d_texture[i].texture;
    }
    // No slot available.
    return NULL;
}
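
For context, the manager gets called roughly like this (the instance name and file name here are made up for illustration):

LPDIRECT3DTEXTURE9 player_texture = dx_manager.GetTexture("player.png");
if (!player_texture)
{
    // Either no free slot was left or the load failed.
}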

Apparently "D3DXCreateTextureFromFileA" insists that graphics be sized to the power of 2 (2, 4, 8, 16, 32, etc), and while it can be dealt with, by making unused parts of the graphic transparent, I can foresee problems with scaling as well as interactive graphics elements overlapping each other. And on top of that there is the wasted memory for the larger than necessary graphic.

So I am thinking that using D3DXCreateTextureFromFileEx would be better. Here is what I think I need to replace the D3DXCreateTextureFromFileA call (above) with:

D3DXIMAGE_INFO info;
HRESULT result = D3DXGetImageInfoFromFile(filename.c_str(), &info);
if (result != D3D_OK)
{
    return NULL;
}

D3DXCreateTextureFromFileEx(
    d3d_device,
    filename.c_str(),
    info.Width,
    info.Height,
    1,                // MipLevels
    D3DPOOL_DEFAULT,  // Usage
    D3DFMT_UNKNOWN,   // Format
    D3DPOOL_DEFAULT,  // Pool
    D3DX_DEFAULT,     // Filter
    D3DX_DEFAULT,     // MipFilter
    0xFF000000,       // ColorKey
    &info,            // pSrcInfo
    NULL,             // pPalette
    &d3d_texture[i].texture
);

But in both of the places where I use "filename.c_str()" I get this error:
error C2228: left of '.c_str' must have class/struct/union
If I take off the ".c_str()" I get errors similar to this:
error C2664: 'D3DXGetImageInfoFromFileW' : cannot convert parameter 1 from 'char *' to 'LPCWSTR'
I have tried changing the type of "filename" from "char*" to "WCHAR*", "TCHAR", "LPCWSTR", and "LPCSTR", and I have also looked up ways to convert "char*" to other types, but everything has failed to work. And really, I am not even sure changing the type definition is the right solution to the problem.

Any direction on this would be much appreciated.

Also, in researching a solution, I read in one place that "D3DXCreateTextureFromFileA" is easier on graphics cards than "D3DXCreateTextureFromFileEx", so if there is a completely different approach I should be looking at, let me know.
Since filename is a char*, why do you append .c_str()? It's not a std::string.

For the function call, try D3DXCreateTextureFromFileExA to use the ANSI version.
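
Something like this ought to compile (a rough sketch only, reusing d3d_device, filename, and d3d_texture[i] from your first snippet; the Usage, Pool, and ColorKey values are just one reasonable choice, not the only one):

D3DXIMAGE_INFO info;
if (FAILED(D3DXGetImageInfoFromFileA(filename, &info)))
{
    return NULL;
}

D3DXCreateTextureFromFileExA(
    d3d_device,
    filename,            // plain char*, no .c_str()
    info.Width,          // keep the file's real dimensions
    info.Height,
    1,                   // MipLevels
    0,                   // Usage
    D3DFMT_UNKNOWN,      // Format: take it from the file
    D3DPOOL_MANAGED,     // Pool
    D3DX_DEFAULT,        // Filter
    D3DX_DEFAULT,        // MipFilter
    0,                   // ColorKey (0 = disabled)
    &info,               // pSrcInfo
    NULL,                // pPalette
    &d3d_texture[i].texture);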

[quote]Since filename is a char*, why do you append .c_str()? It's not a std::string.[/quote]
I had tried removing it but ran into other problems, so I wasn't sure it was the answer; I just didn't have the other part of the solution. I would never have thought that appending an "A" to the function would fix my code. For all the searching I did, I never once saw that suggested.

I do have some followup questions based on that answer.

In researching a solution to my problem, I saw several places that recommended using Unicode (W) instead of ANSI (A). Am I understanding that correctly? It does appear that if I use "WCHAR*" instead of "char*" and append a "W" to the functions, that works as well. So which is the recommended route?
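
For reference, if the rest of the code keeps passing the filename around as a char*, one common way to hand it to the W versions is an explicit conversion first; a rough sketch, with error handling omitted and MAX_PATH used only for illustration:

WCHAR wide_name[MAX_PATH];
MultiByteToWideChar(CP_ACP, 0, filename, -1, wide_name, MAX_PATH);

D3DXIMAGE_INFO info;
D3DXGetImageInfoFromFileW(wide_name, &info);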

Also, I appended the "A" (and also tried "W") to both the "D3DXGetImageInfoFromFile" and "D3DXCreateTextureFromFileEx" functions. Is it safe to assume that "A" (and "W") can be appended to most of the DirectX functions?
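
For context on that last question: the un-suffixed D3DX names are macros that the headers point at either the A or the W function, depending on whether UNICODE is defined. A simplified sketch of the pattern (not the literal header text):

#ifdef UNICODE
#define D3DXCreateTextureFromFileEx D3DXCreateTextureFromFileExW
#else
#define D3DXCreateTextureFromFileEx D3DXCreateTextureFromFileExA
#endif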
