
# Weird texture issue rendering Freetype font


## Recommended Posts

Hello again.

I'm rendering text in my game by creating 2 triangles consisting of 6 vertices (position and UV coords). I then have FreeType render grayscale glyphs, and copy each glyph into a temporary byte array to build a single texture containing the complete string. The 6 vertices mentioned before are positioned according to the requested rectangle area to render the string in, and a vertex shader then converts them to the correct 2D coordinates relative to the client window:

```glsl
vec2 Vertex2D = vec2(VertexPosition.x, VertexPosition.y);
Vertex2D.xy /= ClientRect.xy;             // pixel space -> [0, 1]
Vertex2D.y = (1.0 - Vertex2D.y);          // flip Y (window space grows downward)
Vertex2D.xy = Vertex2D.xy * 2.0 - 1.0;    // [0, 1] -> [-1, 1] NDC

gl_Position = vec4(Vertex2D.x, Vertex2D.y, VertexPosition.z, 1.0);
```
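
To sanity-check that mapping on the CPU, here is a minimal C++ sketch of the same pixel-to-NDC math (the `PixelToNDC` helper and the 800×600 client rect are my own illustration, not part of the original code):

```cpp
#include <cassert>

struct Vec2 { float x, y; };

// Mirror of the vertex-shader math: map a pixel-space position to
// normalized device coordinates, flipping Y so pixel (0, 0) is top-left.
Vec2 PixelToNDC(Vec2 pos, Vec2 clientRect)
{
    Vec2 v{pos.x / clientRect.x, pos.y / clientRect.y};  // pixels -> [0, 1]
    v.y = 1.0f - v.y;                                    // flip Y
    v.x = v.x * 2.0f - 1.0f;                             // [0, 1] -> [-1, 1]
    v.y = v.y * 2.0f - 1.0f;
    return v;
}
```

With an 800×600 client rect, pixel (0, 0) lands at NDC (-1, 1), the centre (400, 300) at (0, 0), and (800, 600) at (1, -1).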


The texture is then rendered using the usual texture sampling, though using only the red channel:

```glsl
vec4 GetTextFragColour()
{
    vec4 Sampled = vec4(1.0, 1.0, 1.0, texture2D(Sampler, UV.st).r);
    return ShapeData.Colour1 * Sampled;
}
```

This works fine for the most part, however, every now and then I'll choose a size for the rectangle primitive that completely messes up the rendering. This is me rendering the text "Hello" with a width of 120 and height of 50:

If I go ahead and change the width to 119 and leave the height the same, I get this:

This has stumped me completely. I initially looked at the UV coords using RenderDoc, but both images above render with the same UV coordinates. I then looked at how I was copying the pixels from the FreeType glyph into the temporary texture:

```cpp
for (int Y = 0; Y < Character->GlyphData.Bitmap.Height; Y++)
{
    PenX = abs(Offset + Character->Metrics.HorizontalBearingX);

    if (PenX > RectSize.Width)
    {
        break;
    }

    if (Y + YOffset < RectSize.Height)
    {
        for (int X = 0; X < Character->GlyphData.Bitmap.Width; X++)
        {
            if (PenX < RectSize.Width)
            {
                NewValue = Character->GlyphData.Bitmap.ImageData[Character->GlyphData.Bitmap.Width * Y + X];
                OldValue = Texture[(Y + YOffset) * RectSize.Width + PenX];

                // Only write if the current pixel has no data in it, or if the
                // existing data is less visible than the new data.
                if ((OldValue == 0) || (OldValue < NewValue))
                {
                    Texture[(Y + YOffset) * RectSize.Width + PenX] = NewValue;
                }

                PenX += 1;
            }
            else
            {
                break;
            }
        }
    }
    else
    {
        break;
    }
}
```
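
The 2D-to-1D mapping that loop relies on can be isolated into a tiny helper and checked on its own (the `TexelIndex` name is my own; it is not part of the original code):

```cpp
#include <cassert>
#include <cstddef>

// Row-major 2D -> 1D index, as used by the copy loop:
// Texture[(Y + YOffset) * RectSize.Width + PenX].
// rowStride is the number of texels (bytes, for an 8-bit texture) per row.
std::size_t TexelIndex(int x, int y, int rowStride)
{
    return static_cast<std::size_t>(y) * rowStride + x;
}
```

For a 119-wide texture, pixel (5, 2) lands at index 2 × 119 + 5 = 243; the arithmetic itself is sound as long as `rowStride` matches the stride the consumer of the buffer assumes.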


In the above code snippet, Offset is the combined width of all preceding characters, PenX is the X coordinate in the temporary texture, and YOffset is the offset along the Y coordinate of the temporary texture. You'll see the central theme of that code is converting from 2D coordinates to 1D array indices. I have a sneaking suspicion that may be where the issue is, but I've compared the resulting coordinates between working and non-working runs and can't find anything untoward. I've also tried to find a pattern in the values that produce a messed-up texture, but can't find anything there either.

Would anybody have any ideas on what might be wrong, or maybe even suggestions on how to find the cause of this problem?

Thanks!


> If I go ahead and change the width to 119

If you actually create a texture of that width (119), then check the value of the GL_UNPACK_ALIGNMENT option.

By default the unpack alignment is 4, meaning every row of the source data is expected to start on a 4-byte boundary.

That's why a width of 120 might work fine while 119 does not (assuming you're not using RGBA data, which is already 4-byte aligned).

Either set the unpack alignment to 1 with glPixelStorei(), or make your data's row stride 4-byte aligned.
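
A minimal sketch of the second option: computing a 4-byte-aligned row stride for a single-channel upload. The `AlignedRowStride` helper is my own illustration; the `glPixelStorei` call for the first option is shown as a comment and is standard OpenGL:

```cpp
#include <cassert>

// Round a row's size in bytes up to the next multiple of `alignment`,
// matching what OpenGL expects when GL_UNPACK_ALIGNMENT is `alignment`.
int AlignedRowStride(int rowBytes, int alignment)
{
    return (rowBytes + alignment - 1) / alignment * alignment;
}

// The simpler fix: before glTexImage2D, tell GL the rows are tightly packed:
//     glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
```

With the default alignment of 4, a 120-byte row needs no padding (120 is already a multiple of 4), but a 119-byte row would need to be padded to 120, which is exactly why one width renders correctly and the other shears.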