Jump to content
  • Advertisement


This topic is now archived and is closed to further replies.



This topic is 5748 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.

Recommended Posts

I am using the following code, which I copied and pasted from drunken hyena.
LOGFONT log_font={
   32, //height

   0,  //width; 

   0,  // lfEscapement; 

   0,  //lfOrientation; 

   FW_BOLD, // lfWeight; 

   FALSE, // lfItalic; 

   FALSE, // lfUnderline; 

   FALSE, // lfStrikeOut; 

   DEFAULT_CHARSET, // lfCharSet; 

   OUT_DEFAULT_PRECIS, //lfOutPrecision; 

   CLIP_DEFAULT_PRECIS, // lfClipPrecision; 


   DEFAULT_PITCH,// lfPitchAndFamily; 

   "Arial"// lfFaceName[LF_FACESIZE]; 

The variable "font" is of type LPD3DXFONT and initalized to NULL. When I call D3DXCreateFontIndirect(D3Ddevice, (),&log_font,&font), it fails. I''m guessing one of the values I''m initializing log_font with doesn''t work with my system. I have a GeForce2 MX with Windows XP. Proceeding on a brutal rampage is the obvious choice.

Share this post

Link to post
Share on other sites

  • Advertisement

Important Information

By using GameDev.net, you agree to our community Guidelines, Terms of Use, and Privacy Policy.

We are the game development community.

Whether you are an indie, hobbyist, AAA developer, or just trying to learn, GameDev.net is the place for you to learn, share, and connect with the games industry. Learn more About Us or sign up!

Sign me up!