
amish1234

D3DXFonts


I am using the following code, which I copied and pasted from the Drunken Hyena site.
  
LOGFONT log_font = {
   32,                  // lfHeight
   0,                   // lfWidth
   0,                   // lfEscapement
   0,                   // lfOrientation
   FW_BOLD,             // lfWeight
   FALSE,               // lfItalic
   FALSE,               // lfUnderline
   FALSE,               // lfStrikeOut
   DEFAULT_CHARSET,     // lfCharSet
   OUT_DEFAULT_PRECIS,  // lfOutPrecision
   CLIP_DEFAULT_PRECIS, // lfClipPrecision
   ANTIALIASED_QUALITY, // lfQuality
   DEFAULT_PITCH,       // lfPitchAndFamily
   "Arial"              // lfFaceName[LF_FACESIZE]
};
  
The variable "font" is of type LPD3DXFONT and initialized to NULL. When I call D3DXCreateFontIndirect(D3Ddevice, &log_font, &font), it fails. I'm guessing one of the values I'm initializing log_font with doesn't work with my system. I have a GeForce2 MX with Windows XP. Proceeding on a brutal rampage is the obvious choice.
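For reference, a minimal sketch of how this call is typically made against the DirectX 8 D3DX signature, with the HRESULT checked so the failure code can actually be inspected. This assumes D3Ddevice is a valid, fully initialized IDirect3DDevice8* and log_font is the structure shown above; it is a debugging sketch, not a drop-in fix.

```cpp
#include <d3dx8.h>   // D3DX8 helpers; link against d3dx8.lib

// Sketch: D3DXCreateFontIndirect in D3DX8 takes only the device,
// the LOGFONT, and the address of the output font pointer.
LPD3DXFONT font = NULL;
HRESULT hr = D3DXCreateFontIndirect(D3Ddevice, &log_font, &font);
if (FAILED(hr)) {
    // Inspect hr here (e.g. in the debugger) rather than only
    // noticing that the call "fails" -- D3DERR_INVALIDCALL usually
    // points at a bad parameter, while an out-of-memory style code
    // suggests a device or resource problem.
}
```

Checking the specific HRESULT narrows down whether the LOGFONT values are the problem or whether the device pointer itself is not valid at the time of the call.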
