D3DXFonts

I am using the following code, which I copied and pasted from drunken hyena.
  
LOGFONT log_font = {
   32,                  // lfHeight
   0,                   // lfWidth
   0,                   // lfEscapement
   0,                   // lfOrientation
   FW_BOLD,             // lfWeight
   FALSE,               // lfItalic
   FALSE,               // lfUnderline
   FALSE,               // lfStrikeOut
   DEFAULT_CHARSET,     // lfCharSet
   OUT_DEFAULT_PRECIS,  // lfOutPrecision
   CLIP_DEFAULT_PRECIS, // lfClipPrecision
   ANTIALIASED_QUALITY, // lfQuality
   DEFAULT_PITCH,       // lfPitchAndFamily
   "Arial"              // lfFaceName[LF_FACESIZE]
};
  
The variable "font" is of type LPD3DXFONT and initialized to NULL. When I call D3DXCreateFontIndirect(D3Ddevice, &log_font, &font), it fails. I'm guessing one of the values I'm initializing log_font with doesn't work with my system. I have a GeForce2 MX with Windows XP. Proceeding on a brutal rampage is the obvious choice.
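In case it helps, here is a minimal sketch of my own (not from drunken hyena) that fills the LOGFONT field by name instead of positionally, assuming an ANSI build where lfFaceName is a plain char array; that way a misplaced initializer can't silently shift every field after it:

   LOGFONT log_font;
   ZeroMemory(&log_font, sizeof(log_font));        // unused fields default to 0
   log_font.lfHeight         = 32;
   log_font.lfWeight         = FW_BOLD;
   log_font.lfCharSet        = DEFAULT_CHARSET;
   log_font.lfOutPrecision   = OUT_DEFAULT_PRECIS;
   log_font.lfClipPrecision  = CLIP_DEFAULT_PRECIS;
   log_font.lfQuality        = ANTIALIASED_QUALITY;
   log_font.lfPitchAndFamily = DEFAULT_PITCH;
   lstrcpy(log_font.lfFaceName, "Arial");          // lfFaceName is a fixed-size buffer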
What error does it fail with? Are you using the debug version of D3DX? If so, it should spew out some useful info on why it's failing.
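If nothing else, grab the return code yourself and dump it to the debug output, something along these lines (a rough sketch, assuming an ANSI build and the three-argument DX8 form of the call):

   HRESULT hr = D3DXCreateFontIndirect(D3Ddevice, &log_font, &font);
   if(FAILED(hr))
   {
      char msg[64];
      wsprintf(msg, "D3DXCreateFontIndirect failed, hr = 0x%08lX\n", (unsigned long)hr);
      OutputDebugString(msg);   // shows up in the debugger output window
   }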

Stay Casual,

Ken
Drunken Hyena

