dev578

D3DXCreateFont failing


Recommended Posts

Well, I finished making my font class, and it works :) I learned a lot from it, but it doesn't compete with the speed offered by ID3DXFont. So this is my problem:

if (FAILED(D3DXCreateFont(g_pD3DDevice,              //D3D device
                          22,                        //Font height
                          0,                         //Font width
                          FW_NORMAL,                 //Font weight
                          1,                         //MipLevels
                          false,                     //Italic
                          DEFAULT_CHARSET,           //CharSet
                          OUT_DEFAULT_PRECIS,        //OutputPrecision
                          ANTIALIASED_QUALITY,       //Quality
                          DEFAULT_PITCH|FF_DONTCARE, //PitchAndFamily
                          "Arial",                   //pFacename
                          &g_pFont)))                //ppFont
{
    //if D3DXCreateFont failed
}

On my computer, it works fine. On a Windows XP Home computer, it works as well. On another Windows XP Home computer, it fails. It also fails on a Windows 98 SE computer. So why is it failing, and why only on some computers? I don't think the Windows XP Home computer it works on has the debug version of DirectX, but then again, it might.

Any help is appreciated,

-Dev578

If the function fails, the return value can be one of the following:

D3DERR_INVALIDCALL The method call is invalid. For example, a method's parameter may have an invalid value.
D3DXERR_INVALIDDATA The data is invalid.
E_OUTOFMEMORY Microsoft Direct3D could not allocate sufficient memory to complete the call.

So I say check the return value then come back if you still can't figure out where your problem is.

Well, on the Windows 98 SE machine, it returned D3DXERR_INVALIDDATA. I have no idea why though. The computer has 16 bit color, if that has anything to do with it. It shouldn't though, because the XP Home machine has 32 bit color.

Any help is appreciated,



-Dev578

Ahh... don't you love wonderfully vague COM error messages such as "The data is invalid"?

The only parameter that stands out as a likely culprit is the quality setting: the failing computers might not be able to do font smoothing. I'm just shooting in the dark, though, since you're using pretty much default parameters for everything else.

I hope you don't mind me borrowing some space on this thread dev578, but I'm having problems with D3DXFont too. My font object simply isn't drawing the text; I've looked at so many examples, and I'm not doing anything different. Here's the relevant code:

if (FAILED(D3DXCreateFont(D3D_device, 50, 0, FW_BOLD, 1, FALSE, DEFAULT_CHARSET,
                          OUT_DEFAULT_PRECIS, DEFAULT_QUALITY, DEFAULT_PITCH | FF_DONTCARE,
                          "Arial", &font)))
{
    MessageBox(NULL, "Failed to create D3DX Font interface!", "Error!",
               MB_ICONEXCLAMATION | MB_OK);
}



The above code works, font is a valid pointer.

humScoreRect = new RECT;
humScoreRect->left = 0;
humScoreRect->top = 0;
humScoreRect->bottom = 480;
humScoreRect->right = 640;

This function is also called like it should be, yet it doesn't draw anything. sprite is a valid ID3DXSprite pointer.

font->DrawText(sprite,"Hello people",-1,humScoreRect,DT_TOP | DT_LEFT,D3DCOLOR_RGBA(150,150,0,0));



The above statement is executed here:

D3D_device->Clear(0,NULL,D3DCLEAR_TARGET,D3DCOLOR_RGBA(0,0,192,255),1.0f,0);
D3D_device->BeginScene();
sprite->Begin(D3DXSPRITE_ALPHABLEND);

drawBG();
drawScores();//It is within this function. It IS executed.
humanPaddle->drawPaddle();
compPaddle->drawPaddle();
ball->drawBall();

sprite->End();
D3D_device->EndScene();
D3D_device->Present(NULL,NULL,NULL,NULL);

I'm peacing out for the night, but could someone please make a suggestion? The code is as straightforward as the other examples, yet it's not working.

Ok, here is my code dealing with the font:

//Global
ID3DXFont* g_pFont = NULL;
bool bText = true;

//In WinMain()
if (FAILED(D3DXCreateFont(g_pD3DDevice, 22, 0, FW_NORMAL, 1, false, DEFAULT_CHARSET, OUT_DEFAULT_PRECIS,
                          ANTIALIASED_QUALITY, DEFAULT_PITCH|FF_DONTCARE, "Arial", &g_pFont)))
{
    //Fall back to non-antialiased text if the first attempt fails
    if (FAILED(D3DXCreateFont(g_pD3DDevice, 22, 0, FW_NORMAL, 1, false, DEFAULT_CHARSET, OUT_DEFAULT_PRECIS,
                              DEFAULT_QUALITY, DEFAULT_PITCH|FF_DONTCARE, "Arial", &g_pFont)))
    {
        bText = false;
    }
}

RECT font_rect;

//In Rendering Loop
if (bText == true)
{
    char String[512];

    sprintf(String, "Whatever Text");  //was missing the semicolon
    SetRect(&font_rect, 0, 0, 32, 32);
    //Pass the array directly; (LPCSTR)&String casts away a type mismatch
    g_pFont->DrawTextA(NULL, String, -1, &font_rect, DT_LEFT|DT_NOCLIP, 0xFFFFFFFF);
}

//End of WinMain()
if (g_pFont) g_pFont->Release();



Again, this works on some computers but not others. The "In WinMain()" section is what is failing though. On the Windows 98 SE computer with 16 bit color, it does not work. The game itself will run, but the font doesn't work. Any ideas?

Any help is appreciated,



-Dev578
