FreeType troubles...

5 comments, last by Murdocki 12 years, 1 month ago
I am having a really harsh time trying to implement font rendering in my engine... Mainly I'm getting pissed off with FreeType; I just can't seem to understand it 100%. I'm loading a TrueType font with FreeType, looping through all the glyphs in the font, saving them into one single big texture in ASCII order, and uploading that to OpenGL, but it's not working out so well.

I would really appreciate it if someone looked at my code and explained what I am doing wrong. At the moment I can't load a TrueType font, and I doubt I'm rendering it correctly either, but I'm not sure, as I haven't been able to test it...

Here is how I am loading fonts; the maximum size of a font defaults to 20 when saving into a texture:
void OGLGraphicalAssetLoader::loadFontFromFile(const std::string& filepath, Font* output) const
{
    FT_Library library; // a FreeType Library object
    FT_Face face;       // This holds the TrueType Font.
    FT_Error error;     // Holds any errors that could occur.

    error = FT_Init_FreeType(&library); // Initialize the FreeType Library
    if(error)
    {
        throw AnaxException("FreeType 2 Library could not be initialized", -2);
    }

    // Load the TrueType Font into memory
    error = FT_New_Face(library, filepath.c_str(), 0, &face);
    if(error)
    {
        throw AnaxException("Could not load TrueType Font: " + filepath, -2);
    }

    FT_Set_Char_Size(face, output->getMaxSize() * 64, output->getMaxSize() * 64, 96, 96); // Set the size of the Font

    // Create a blank Texture (Image)
    Image tempImage;
    tempImage.create(face->glyph->bitmap.width, face->glyph->bitmap.rows);

    Rect2DFloat textureCoords;   // Holds temporary Texture Coordinates
    Uint32 drawX = 0, drawY = 0; // The x and y coordinates that the glyph will be drawn to in the Texture.

    // Loop through the glyphs, putting them in the Texture (Image)
    for(int i = 0; i < 256; ++i)
    {
        Uint32 index = FT_Get_Char_Index(face, (char)i);

        error = FT_Load_Glyph(face, index, FT_LOAD_DEFAULT);
        if(error)
            continue; // just ignore it... (should throw an exception or something along those lines)

        error = FT_Render_Glyph(face->glyph, FT_RENDER_MODE_NORMAL);
        if(error)
            continue; // just ignore it...

        // Place Texture Coordinates
        textureCoords.position.x = drawX + face->glyph->bitmap_left;
        textureCoords.position.y = drawY - face->glyph->bitmap_top;
        textureCoords.size.width = face->glyph->bitmap.width;
        textureCoords.size.height = face->glyph->bitmap.rows;

        setFontTextureCoordinateValueForGlyth(output, i, textureCoords); // Set the Texture Coordinates

        // Render into Image
        BlitGlypth(face->glyph, &tempImage, textureCoords.position.x, textureCoords.position.y);

        // Increment drawing position
        drawX += face->glyph->advance.x >> 6;
        drawY += face->glyph->advance.y >> 6;
    }

    // Upload the Texture to OpenGL
    Texture2D tempTexture;
    loadTextureFromImage(tempImage, &tempTexture);

    // Set the ID of the Font
    setFontIdNumber(output, tempTexture.getID());
}



I'm not quite sure if I'm placing each character correctly into my texture, and I also don't know how to get or calculate the size that the texture will be. When I call tempImage.create() it passes 0, 0 for the dimensions of the image... Is that because no glyph is currently loaded, or..? How do I calculate what the texture size should be?
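One way to work out the texture size is a measuring pass: run the same glyph loop once before creating the image, accumulating widths and tracking the tallest glyph. The bitmap dimensions are only valid after a glyph has been loaded into the slot, which is also why the create() call above sees 0, 0. A minimal sketch against the same face, for the single-row layout used here:

Uint32 atlasWidth = 0, atlasHeight = 0;
for(int i = 0; i < 256; ++i)
{
    Uint32 index = FT_Get_Char_Index(face, (char)i);
    if(FT_Load_Glyph(face, index, FT_LOAD_DEFAULT))
        continue;
    if(FT_Render_Glyph(face->glyph, FT_RENDER_MODE_NORMAL))
        continue;

    atlasWidth += face->glyph->advance.x >> 6;      // match the drawX advance used when blitting
    if((Uint32)face->glyph->bitmap.rows > atlasHeight)
        atlasHeight = face->glyph->bitmap.rows;     // the tallest glyph sets the height
}
Image tempImage;
tempImage.create(atlasWidth, atlasHeight);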

Here is how I am drawing the fonts, using a rectangle to draw them:

void OGLRenderer::renderText(const Renderable2DText& text)
{
    const std::string& theCharactersToRender = text.getText();
    Renderable2DRect& rect = getRectFromText(text);

    // Loop through all the characters
    for(int i = 0; i < theCharactersToRender.length(); ++i)
    {
        const Rect2DFloat& subRect = text.getFont()->getGlypth(i);
        rect.setSubRect(subRect);

        // Render the Rect
        renderRect(rect);
        rect.move(subRect.position.x, subRect.position.y);
    }
}


If you need any more detail on how I am implementing this, please say so :)

Ahh, I remember having this problem myself. It had something to do with FT_Set_Char_Size and the glyph methods (FT 2.4.4). I couldn't figure out what it was, so I switched from those to FT_Set_Pixel_Sizes and FT_Load_Char instead. Here's what I've got, so you can take a look at how that works:

FontLoader::FontLoader() :
    logger( Logger::Instance() )
{
    FT_Error error = FT_Init_FreeType( &library );
    if ( error )
        logger.Print( LOGGERDEPTH_CONSTRUCTOR, LOGGERSCOPE_NOCHANGE, "cn", "Font loader could not be initialized." );
    logger.Print( LOGGERDEPTH_CONSTRUCTOR, LOGGERSCOPE_NOCHANGE, "cn", "ResourceLoaderFont created." );
}

FontLoader::~FontLoader()
{
    logger.Print( LOGGERDEPTH_DESTRUCTOR, LOGGERSCOPE_NOCHANGE, "cn", "ResourceLoaderFont destroyed." );
}

EEResource::Font* FontLoader::LoadFont( RenderResourceCreator& renderResourceCreator, const std::string& fontFile, uint8 fontSize )
{
    FT_Error error;
    FT_Face face;
    if( error = (FT_New_Face( library, fontFile.c_str(), 0, &face ) ) )
    {
        if ( error == FT_Err_Unknown_File_Format )
            logger.Print( LOGGERDEPTH_WARNING, LOGGERSCOPE_NOCHANGE, "mcscn", __FUNCTION__, ": Format of font \"", fontFile, "\" is unsupported." );
        else
            logger.Print( LOGGERDEPTH_WARNING, LOGGERSCOPE_NOCHANGE, "mcscn", __FUNCTION__, ": Font \"", fontFile, "\" could not be opened or read." );
        return NULL;
    }

    //error = FT_Set_Char_Size(
    //    face,    /* handle to face object           */
    //    0,       /* char_width in 1/64th of points  */
    //    16*64,   /* char_height in 1/64th of points */
    //    300,     /* horizontal device resolution    */
    //    300 );   /* vertical device resolution      */
    FT_Set_Pixel_Sizes( face, 0, fontSize );

    int textureWidth = 0;
    int textureHeight = 0;
    EEResource::Font* newFont = new EEResource::Font( fontFile, fontSize );
    for( uint16 charIndex = 0; charIndex < 256; ++charIndex )
    {
        error = FT_Load_Char( face, charIndex, FT_LOAD_RENDER );
        Vector2f advance( float( face->glyph->advance.x >> 6 ), float( face->glyph->advance.y >> 6 ) );
        uint8 width = face->glyph->bitmap.width;
        uint8 height = face->glyph->bitmap.rows;
        uint8* data = new uint8[ width * height ];
        // Rendered data is upside down, so we need to revert that.
        for( uint8 row = 0; row < height; ++row )
            memcpy( data + width * row, face->glyph->bitmap.buffer + width * (height - row - 1), sizeof( uint8 ) * width );
        newFont->SetCharacter( (uint8)charIndex, advance, width, height, face->glyph->bitmap_left, face->glyph->bitmap_top, data );
    }

    if( !renderResourceCreator.LoadFont( *newFont ) )
    {
        logger.Print( LOGGERDEPTH_WARNING, LOGGERSCOPE_NOCHANGE, "mcscn", __FUNCTION__, ": Font \"", fontFile, "\" could not be loaded by the renderer." );
        delete newFont;
        return NULL;
    }
    else
        return newFont;
}


On a side note: I don't think you have to init the library for every font you're loading.
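In other words, the FT_Library handle can be created once and reused for every face, as the constructor above does. A sketch of the lifetime, assuming a single-threaded loader:

FT_Library library;
if(FT_Init_FreeType(&library))
{
    // report the error once, at startup
}
// ... any number of FT_New_Face(library, ...) calls over the program's lifetime ...
FT_Done_FreeType(library); // at shutdown; this also releases faces still attached to the library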

A couple of questions:

1. Why are you passing the height instead of the width in this function:

FT_Set_Pixel_Sizes( face, 0, fontSize );

i.e., why not call it like this?

FT_Set_Pixel_Sizes( face, fontSize, 0 );

2. What are you doing with textureWidth and textureHeight?

int textureWidth = 0;
int textureHeight = 0;
1: It doesn't really matter; this is some code from inside FT_Set_Pixel_Sizes in the version I'm using:

if ( pixel_width == 0 )
    pixel_width = pixel_height;
else if ( pixel_height == 0 )
    pixel_height = pixel_width;

so FT_Set_Pixel_Sizes( face, 0, fontSize ); is the same as FT_Set_Pixel_Sizes( face, fontSize, 0 ); and FT_Set_Pixel_Sizes( face, fontSize, fontSize );

2: Nothing at all, haha. I noticed them sitting around there as well, but I was too lazy to remove them :P. When I started on the font loader I tried to pack all the characters into as small an image as possible, but I couldn't really be bothered to finish that, because GPU memory isn't an issue right now. So I guess they're just a remnant of that.
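If that packing ever gets finished, the usual first step is rounding the measured atlas dimensions up to a power of two, since OpenGL before 2.0 (without ARB_texture_non_power_of_two) only accepts power-of-two texture sizes. A quick sketch:

// Round n up to the next power of two, e.g. 300 -> 512.
int NextPowerOfTwo(int n)
{
    int p = 1;
    while(p < n)
        p <<= 1;
    return p;
}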



Don't you just looooove auto format...
Okay, I'm still having trouble; I have a question, and I'm not sure what I'm doing wrong o.O...

1. So FT_Set_Pixel_Sizes sets the number of pixels a glyph in the font should use for its width/height, correct? Which means I can't have decimal sizes, right?
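(For reference: yes, FT_Set_Pixel_Sizes takes whole pixel counts. If a fractional size is ever needed, FT_Set_Char_Size is the call that accepts one: its width/height arguments are 26.6 fixed-point values, i.e. units of 1/64th of a point, plus a device resolution in dpi. A sketch:)

// 12.5pt at 96dpi; FT_F26Dot6 values are expressed in 1/64ths of a point.
FT_Set_Char_Size(face,
                 0,                        // width: 0 means "same as height"
                 (FT_F26Dot6)(12.5 * 64),  // height: 12.5pt in 1/64 units
                 96,                       // horizontal dpi
                 96);                      // vertical dpi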


Here's my code; it doesn't complain about loading or rendering, but I must be stuffing up somewhere.

Loading the Font:

void OGLGraphicalAssetLoader::loadFontFromFile(const std::string& filepath, Font* output) const
{
    FT_Library library; // a FreeType Library object
    FT_Face face;       // This holds the TrueType Font.

    FT_Error error;     // Holds any errors that could occur.

    error = FT_Init_FreeType(&library); // Initialize the FreeType Library
    if(error)
    {
        throw AnaxException("FreeType 2 Library could not be initialized", -2);
    }

    // Load the TrueType Font into memory
    error = FT_New_Face(library, filepath.c_str(), 0, &face);
    if(error)
    {
        throw AnaxException("Could not load TrueType Font: " + filepath, -2);
    }

    float maxSize = output->getMaxSize();
    float width = maxSize * 64;

    // Set the size of the Font
    FT_Set_Pixel_Sizes(face, 0, maxSize);

    // Create a blank Texture (Image)
    Image tempImage;
    tempImage.create(maxSize * 256, maxSize * 2);

    Rect2DFloat textureCoords;      // Holds temporary Texture Coordinates
    int drawX = 0, drawY = maxSize; // The x and y coordinates that the glyph will be drawn to in the Texture.

    FT_GlyphSlot slot = face->glyph; // The Glyph slot for the face

    // Loop through the glyphs, putting them in the Texture (Image)
    for(int i = 0; i < 256; ++i)
    {
        Uint32 index = FT_Get_Char_Index(face, i);

        error = FT_Load_Char(face, index, FT_LOAD_RENDER);
        if(error)
            continue; // just ignore it..

        // Place Texture Coordinates
        textureCoords.position.x = drawX + slot->bitmap_left;
        textureCoords.position.y = drawY - slot->bitmap_top;
        textureCoords.size.width = slot->bitmap.width;
        textureCoords.size.height = slot->bitmap.rows;

        setFontTextureCoordinateValueForGlyth(output, i, textureCoords); // Set the Texture Coordinates

        // Render into Image
        BlitGlypth(face->glyph, &tempImage, textureCoords.position.x, textureCoords.position.y);

        // Increment drawing position
        drawX += face->glyph->advance.x >> 6;
    }

    // Upload the Texture to OpenGL
    Texture2D tempTexture;
    loadTextureFromImage(tempImage, &tempTexture);

    // Set the ID of the Font
    setFontTexture(output, tempTexture);
}



Blitting the Glyph to the Image:

void BlitGlypth(const FT_GlyphSlot glypth, Image *output, Uint32 xForOutput, Uint32 yForOutput)
{
    Uint32 x, y;
    for(y = 0; y < glypth->bitmap.rows; ++y)
    {
        for(x = 0; x < glypth->bitmap.width; ++x)
        {
            Uint32 pixel = glypth->bitmap.buffer[(x * glypth->bitmap.width) + y]; // access the pixel
            output->setPixel(xForOutput + x, yForOutput + y, Colour(255, 255, 255, pixel)); // place it in the image
        }
    }
}
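One thing worth flagging in that blit: FreeType stores the glyph bitmap row by row, with bitmap.pitch bytes per row (pitch can be larger than width), so indexing with (x * width) + y transposes the glyph, which would explain the sideways output described below. A fixed version might read like this, assuming a positive pitch, which is the common case:

void BlitGlypth(const FT_GlyphSlot glypth, Image *output, Uint32 xForOutput, Uint32 yForOutput)
{
    for(Uint32 y = 0; y < glypth->bitmap.rows; ++y)
    {
        for(Uint32 x = 0; x < glypth->bitmap.width; ++x)
        {
            // row-major access: y selects the row, pitch is bytes per row
            Uint32 pixel = glypth->bitmap.buffer[(y * glypth->bitmap.pitch) + x];
            output->setPixel(xForOutput + x, yForOutput + y, Colour(255, 255, 255, pixel));
        }
    }
}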


Rendering the Font:


void OGLRenderer::renderText(const Renderable2DText& text)
{
    const std::string& theCharactersToRender = text.getText();
    Renderable2DRect& rect = getRectFromText(text);

    float drawX = 0;
    float drawY = 0;

    char currentChar = 0;

    // Loop through all the characters
    for(int i = 0; i < theCharactersToRender.length(); ++i)
    {
        // Update the current character
        currentChar = theCharactersToRender[i];

        if(currentChar == '\n')
        {
            //drawX = 0;
            //drawY += subRect.size.height;
        }
        else if(currentChar == ' ')
        {
            //drawX += text.getLineSpacing();
        }
        else
        {
            const Rect2DFloat& subRect = text.getFont()->getGlypth(currentChar);
            rect.setSubRect(subRect);

            // Render the Rect
            renderRect(rect);

            drawX += subRect.size.width + 1;
        }

        rect.move(drawX, drawY);
    }
}
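A note on the movement in that loop, as a guess, since move() isn't shown: drawX accumulates every iteration, so if move() applies a relative offset, the characters will drift apart at an increasing rate. Typical glyph layout keeps a pen position and advances it once per character, something like this sketch (setPosition() is hypothetical, standing in for whatever absolute placement Renderable2DRect offers):

float penX = 0, penY = 0;
for(int i = 0; i < theCharactersToRender.length(); ++i)
{
    const Rect2DFloat& subRect = text.getFont()->getGlypth(theCharactersToRender[i]);
    rect.setSubRect(subRect);
    rect.setPosition(penX, penY); // hypothetical absolute placement
    renderRect(rect);
    penX += subRect.size.width + 1; // ideally the glyph's stored advance.x instead
}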


I've noticed that my font is rendered sideways or something, and the glyphs don't seem to be in the correct order either. What I mean is: for example, the '+' sign (43 in ASCII) is not 43 characters from the NULL character (0). I tried rendering the '+' character (adding 28 to the index so that it actually renders '+'), but all I get when I render it (zoomed in, with a rectangle next to it) is this. How do I fix it?
[attached image: plussign.png]


Here is the image that was generated from the .ttf (the alpha channel might not show in the .png, I'm not sure). I don't know how to work out the exact image width, so it has excess pixels:
[attached image: temp0.png]

Many thanks in advance.
You're still using FT_Get_Char_Index. Not sure what it does, but it had weird results for me as well; it works perfectly fine without it, though.
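(For reference: FT_Get_Char_Index maps a character code to a glyph index, and FT_Load_Char already performs that mapping internally before calling FT_Load_Glyph. Passing a glyph index into FT_Load_Char therefore translates twice, which scrambles the order. The loop can load straight from the character code:)

for(int i = 0; i < 256; ++i)
{
    // FT_Load_Char expects a character code, not a glyph index
    error = FT_Load_Char(face, i, FT_LOAD_RENDER);
    if(error)
        continue; // skip characters the face can't render
    // ... use face->glyph as before ...
}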

Your texture coordinate calculation also looks weird, but I haven't really looked into it, because the output texture seems bollocks to begin with. Could you verify it doesn't look this bad on your end?

