LOGFONT log_font = {
    32,                   // lfHeight
    0,                    // lfWidth
    0,                    // lfEscapement
    0,                    // lfOrientation
    FW_BOLD,              // lfWeight
    FALSE,                // lfItalic
    FALSE,                // lfUnderline
    FALSE,                // lfStrikeOut
    DEFAULT_CHARSET,      // lfCharSet
    OUT_DEFAULT_PRECIS,   // lfOutPrecision
    CLIP_DEFAULT_PRECIS,  // lfClipPrecision
    ANTIALIASED_QUALITY,  // lfQuality
    DEFAULT_PITCH,        // lfPitchAndFamily
    "Arial"               // lfFaceName[LF_FACESIZE]
};
D3DXFonts
I am using the following code, which I copied and pasted from Drunken Hyena.
The variable "font" is of type LPD3DXFONT and initialized to NULL. When I call D3DXCreateFontIndirect(D3Ddevice, &log_font, &font), it fails. I'm guessing one of the values I'm initializing log_font with doesn't work with my system. I have a GeForce2 MX with Windows XP.
Proceeding on a brutal rampage is the obvious choice.
What error does it fail with? Are you using the debug version of D3DX? If so, it should spew out some useful info on why it's failing.
Stay Casual,
Ken
Drunken Hyena
This topic is closed to new replies.