Gykonik

SpriteFont: rendering umlauts (ä, ö, ü)


Introduction

Hello, I'm currently programming my own game (a 2D platformer), and for that I have a SpriteFont class to render text to the screen.

This works very well, but since I'm German it would be really nice if I could also render German umlauts such as ä, ö and ü.

I don't know how to do this with my existing class...

 

Code

SpriteFont.h

#pragma once

#ifndef SpriteFont_h__
#define SpriteFont_h__

#include <TTF/SDL_ttf.h>
#include <glm/glm.hpp>
#include <map>
#include <vector>

#include "Vertex.h"

namespace Bengine {

    struct GLTexture;
    class SpriteBatch;

    struct CharGlyph {
    public:
        char character;
        glm::vec4 uvRect;
        glm::vec2 size;
    };

#define FIRST_PRINTABLE_CHAR ((char)32)
#define LAST_PRINTABLE_CHAR ((char)126)

    /// For text justification
    enum class Justification {
        LEFT, MIDDLE, RIGHT
    };

    class SpriteFont {
    public:
        SpriteFont() {}
        SpriteFont(const char* font, int size, char cs, char ce);
        SpriteFont(const char* font, int size) :
            SpriteFont(font, size, FIRST_PRINTABLE_CHAR, LAST_PRINTABLE_CHAR) {
        }

        void init(const char* font, int size);
        void init(const char* font, int size, char cs, char ce);

        /// Destroys the font resources
        void dispose();

        int getFontHeight() const {
            return _fontHeight;
        }

        /// Measures the dimensions of the text
        glm::vec2 measure(const char* s);

        /// Draws using a spritebatch
        void draw(SpriteBatch& batch, const char* s, glm::vec2 position, glm::vec2 scaling, 
                  float depth, ColorRGBA8 tint, Justification just = Justification::LEFT);
    private:
        static std::vector<int>* createRows(glm::ivec4* rects, int rectsLength, int r, int padding, int& w);

        int _regStart, _regLength;
        CharGlyph* _glyphs;
        int _fontHeight;

        unsigned int _texID;
    };

}

#endif // SpriteFont_h__

SpriteFont.cpp

#include "SpriteFont.h"

#include "SpriteBatch.h"

#include <SDL/SDL.h>
#include <iostream>


// Returns the smallest power of two that is >= i
int closestPow2(int i) {
	i--;
	int pi = 1;
	while (i > 0) {
		i >>= 1;
		pi <<= 1;
	}
	return pi;
}

#define MAX_TEXTURE_RES 4096

namespace Bengine {

	SpriteFont::SpriteFont(const char* font, int size, char cs, char ce) {
		init(font, size, cs, ce);
	}

	void SpriteFont::init(const char* font, int size) {
		init(font, size, FIRST_PRINTABLE_CHAR, LAST_PRINTABLE_CHAR);
	}

	void SpriteFont::init(const char* font, int size, char cs, char ce) {
		//std::cout << cs << " " << ce << "\n";
		
		// Initialize SDL_ttf
		if (!TTF_WasInit()) {
			TTF_Init();
		}
		TTF_Font* f = TTF_OpenFont(font, size);
		if (f == nullptr) {
			fprintf(stderr, "Failed to open TTF font %s\n", font);
			fflush(stderr);
			throw 281;
		}
		_fontHeight = TTF_FontHeight(f);
		_regStart = cs;
		_regLength = ce - cs + 1;
		int padding = size / 8;

		std::cout << ce << " " << cs << " " << _regLength << "\n";
		// First measure all the regions
		glm::ivec4* glyphRects = new glm::ivec4[_regLength];
		int i = 0, advance;
		for (char c = cs; c <= ce; c++) {
			TTF_GlyphMetrics(f, c, &glyphRects[i].x, &glyphRects[i].z, &glyphRects[i].y, &glyphRects[i].w, &advance);
			glyphRects[i].z -= glyphRects[i].x;
			glyphRects[i].x = 0;
			glyphRects[i].w -= glyphRects[i].y;
			glyphRects[i].y = 0;
			i++;
		}

		// Find best partitioning of glyphs
		int rows = 1, w, h, bestWidth = 0, bestHeight = 0, area = MAX_TEXTURE_RES * MAX_TEXTURE_RES, bestRows = 0;
		std::vector<int>* bestPartition = nullptr;
		while (rows <= _regLength) {
			h = rows * (padding + _fontHeight) + padding;
			auto gr = createRows(glyphRects, _regLength, rows, padding, w);

			// Desire a power of 2 texture
			w = closestPow2(w);
			h = closestPow2(h);

			// A texture must be feasible
			if (w > MAX_TEXTURE_RES || h > MAX_TEXTURE_RES) {
				rows++;
				delete[] gr;
				continue;
			}

			// Check for minimal area
			if (area >= w * h) {
				if (bestPartition) delete[] bestPartition;
				bestPartition = gr;
				bestWidth = w;
				bestHeight = h;
				bestRows = rows;
				area = bestWidth * bestHeight;
				rows++;
			}
			else {
				delete[] gr;
				break;
			}
		}

		// Can a bitmap font be made?
		if (!bestPartition) {
			fprintf(stderr, "Failed to Map TTF font %s to texture. Try lowering resolution.\n", font);
			fflush(stderr);
			throw 282;
		}
		// Create the texture
		glGenTextures(1, &_texID);
		glBindTexture(GL_TEXTURE_2D, _texID);
		glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bestWidth, bestHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

		// Now draw all the glyphs
		SDL_Color fg = { 255, 255, 255, 255 };
		int ly = padding;
		for (int ri = 0; ri < bestRows; ri++) {
			int lx = padding;
			for (size_t ci = 0; ci < bestPartition[ri].size(); ci++) {
				int gi = bestPartition[ri][ci];

				SDL_Surface* glyphSurface = TTF_RenderGlyph_Blended(f, (char)(cs + gi), fg);

				// Pre-multiply alpha here; copying the premultiplied value into the other
				// colour channels works because the glyph was rendered in pure white,
				// so all colour channels are equal
				unsigned char* sp = (unsigned char*)glyphSurface->pixels;
				int cp = glyphSurface->w * glyphSurface->h * 4;
				for (int i = 0; i < cp; i += 4) {
					float a = sp[i + 3] / 255.0f;
					sp[i] = (unsigned char)((float)sp[i] * a);
					sp[i + 1] = sp[i];
					sp[i + 2] = sp[i];
				}

				// Save glyph image and update coordinates
				glTexSubImage2D(GL_TEXTURE_2D, 0, lx, bestHeight - ly - 1 - glyphSurface->h, glyphSurface->w, glyphSurface->h, GL_BGRA, GL_UNSIGNED_BYTE, glyphSurface->pixels);
				glyphRects[gi].x = lx;
				glyphRects[gi].y = ly;
				glyphRects[gi].z = glyphSurface->w;
				glyphRects[gi].w = glyphSurface->h;

				SDL_FreeSurface(glyphSurface);
				glyphSurface = nullptr;

				lx += glyphRects[gi].z + padding;
			}
			ly += _fontHeight + padding;
		}

		// Draw the unsupported glyph
		int rs = padding - 1;
		int* pureWhiteSquare = new int[rs * rs];
		memset(pureWhiteSquare, 0xff, rs * rs * sizeof(int)); // memset writes single bytes, so 0xff fills every byte with 255
		glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, rs, rs, GL_RGBA, GL_UNSIGNED_BYTE, pureWhiteSquare);
		delete[] pureWhiteSquare;
		pureWhiteSquare = nullptr;

		// Set some texture parameters
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

		// Create spriteBatch glyphs
		_glyphs = new CharGlyph[_regLength + 1];
		for (i = 0; i < _regLength; i++) {
			_glyphs[i].character = (char)(cs + i);
			_glyphs[i].size = glm::vec2(glyphRects[i].z, glyphRects[i].w);
			_glyphs[i].uvRect = glm::vec4(
				(float)glyphRects[i].x / (float)bestWidth,
				(float)glyphRects[i].y / (float)bestHeight,
				(float)glyphRects[i].z / (float)bestWidth,
				(float)glyphRects[i].w / (float)bestHeight
				);
		}
		_glyphs[_regLength].character = ' ';
		_glyphs[_regLength].size = _glyphs[0].size;
		_glyphs[_regLength].uvRect = glm::vec4(0, 0, (float)rs / (float)bestWidth, (float)rs / (float)bestHeight);

		glBindTexture(GL_TEXTURE_2D, 0);
		delete[] glyphRects;
		delete[] bestPartition;
		TTF_CloseFont(f);
	}

	void SpriteFont::dispose() {
		if (_texID != 0) {
			glDeleteTextures(1, &_texID);
			_texID = 0;
		}
		if (_glyphs) {
			delete[] _glyphs;
			_glyphs = nullptr;
		}
	}

	std::vector<int>* SpriteFont::createRows(glm::ivec4* rects, int rectsLength, int r, int padding, int& w) {
		// Blank initialize
		std::vector<int>* l = new std::vector<int>[r]();
		int* cw = new int[r]();
		for (int i = 0; i < r; i++) {
			cw[i] = padding;
		}

		// Loop through all glyphs
		for (int i = 0; i < rectsLength; i++) {
			// Find row for placement
			int ri = 0;
			for (int rii = 1; rii < r; rii++)
				if (cw[rii] < cw[ri]) ri = rii;

			// Add width to that row
			cw[ri] += rects[i].z + padding;

			// Add glyph to the row list
			l[ri].push_back(i);
		}

		// Find the max width
		w = 0;
		for (int i = 0; i < r; i++) {
			if (cw[i] > w) w = cw[i];
		}

		return l;
	}

	glm::vec2 SpriteFont::measure(const char* s) {
		glm::vec2 size(0, _fontHeight);
		float cw = 0;
		for (int si = 0; s[si] != 0; si++) {
			char c = s[si];
			if (s[si] == '\n') {
				size.y += _fontHeight;
				if (size.x < cw)
					size.x = cw;
				cw = 0;
			}
			else {
				// Check for correct glyph
				int gi = c - _regStart;
				if (gi < 0 || gi >= _regLength)
					gi = _regLength;
				cw += _glyphs[gi].size.x;
			}
		}
		if (size.x < cw)
			size.x = cw;
		return size;
	}

	void SpriteFont::draw(SpriteBatch& batch, const char* s, glm::vec2 position, glm::vec2 scaling,
		float depth, ColorRGBA8 tint, Justification just /* = Justification::LEFT */) {
		glm::vec2 tp = position;
		// Apply justification
		if (just == Justification::MIDDLE) {
			tp.x -= measure(s).x * scaling.x / 2;
		}
		else if (just == Justification::RIGHT) {
			tp.x -= measure(s).x * scaling.x;
		}
		for (int si = 0; s[si] != 0; si++) {
			char c = s[si];
			if (s[si] == '\n') {
				tp.y += _fontHeight * scaling.y;
				tp.x = position.x;
			}
			else {
				// Check for correct glyph
				int gi = c - _regStart;
				if (gi < 0 || gi >= _regLength) gi = _regLength;
				glm::vec4 destRect(tp, _glyphs[gi].size * scaling);
				batch.draw(destRect, _glyphs[gi].uvRect, _texID, depth, tint);
				tp.x += _glyphs[gi].size.x * scaling.x;
			}
		}
	}

}

Problem

My problem, as I said earlier, is that the program can't render umlauts. I know that I can set the character range (ASCII) with FIRST_PRINTABLE_CHAR and LAST_PRINTABLE_CHAR, but if I set LAST_PRINTABLE_CHAR to 154 so that all the umlauts are included, my program crashes...

It also crashes if I just change the value from 126 to 127...

 

 

Appendix

Because it's my first game, I coded the first part of it with the help of a YouTube tutorial playlist; the SpriteFont class is also from that series. Here is a link to the source code of the whole program (if you need it): https://github.com/Barnold1953/GraphicsTutorials (the relevant game is "Ninja Platformer", but the SpriteFont class is in the "Bengine" folder).

 

I hope someone can help me.

Thanks!


Are you certain that the font you're loading has glyphs for the characters you're trying to render? Check the return value of TTF_GlyphMetrics.
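For example (just a sketch; checkGlyphRange is a made-up helper, not part of the tutorial code), something like this reports every character in the range that the font can't provide metrics for:

#include <TTF/SDL_ttf.h>
#include <cstdio>

// Reports every character in [cs, ce] for which the font has no metrics.
static void checkGlyphRange(TTF_Font* f, int cs, int ce) {
    int minx, maxx, miny, maxy, advance;
    for (int c = cs; c <= ce; c++) {   // using int avoids any signed-char wraparound
        if (TTF_GlyphMetrics(f, (Uint16)c, &minx, &maxx, &miny, &maxy, &advance) != 0) {
            std::fprintf(stderr, "No metrics for character %d: %s\n", c, TTF_GetError());
        }
    }
}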


Yes, the font has the glyphs, 100% sure.

 

The return value of TTF_GlyphMetrics is always 0 (until it crashes)


Oh yes, that's true...

Can you also tell me what I would have to change if I switch FIRST_PRINTABLE_CHAR and LAST_PRINTABLE_CHAR to uint8_t?

 

And how would I call all of the draw methods then?

Until now I have built a std::string and converted it with string.c_str().
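(Illustrative only, with made-up variable names and an assumed ColorRGBA8 constructor, but roughly matching the draw() signature above:)

std::string text = "Hello Wörld";
spriteFont.draw(spriteBatch, text.c_str(),
                glm::vec2(10.0f, 10.0f),             // position
                glm::vec2(1.0f),                     // scaling
                0.0f,                                // depth
                Bengine::ColorRGBA8(255, 255, 255, 255));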

 

What I have tried:

I changed the chars to uint8_t and then pass a vector into the function like this:

std::vector<uint8_t> myVector(heading.begin(), heading.end());
uint8_t *p = &myVector[0];
// Spritefont draw with p

The problem is that the higher characters still don't come out right anyway...

E.g. a "ü" comes out as a 3...

Edited by Gykonik


You can just replace the line

  char c = s[si];

in draw() and measure() by

  uint8_t c = s[si];

 

The problem with the characters still being wrong is probably an encoding issue.

Using one unsigned char like that for accented Latin characters would work if your input string were encoded in ISO 8859-1, but nowadays it is probably UTF-8. It really depends on where your string comes from; if it comes directly from the C++ source itself, it may be encoded as pretty much anything depending on the compiler.

But it's likely to be UTF-8, which encodes every character with a code above 127 as two or more bytes.

https://en.wikipedia.org/wiki/UTF-8

 

So (if your text is indeed UTF-8) you have to decode it to get the proper character codes. It's pretty easy to do yourself, but there are also libraries around (such as http://utfcpp.sourceforge.net/) that can do it for you.
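A hand-rolled decoder only has to handle the one- and two-byte sequences to cover ASCII plus Latin-1, which is enough for the German umlauts. This is just a minimal sketch (the function name and the '?' fallback are my own choices, not the tutorial's code):

#include <cstdint>
#include <string>
#include <vector>

// Decode a UTF-8 string into Unicode code points, handling only 1- and 2-byte
// sequences (U+0000..U+07FF); longer or malformed sequences become '?'.
std::vector<uint32_t> decodeUtf8Latin1(const std::string& s) {
    std::vector<uint32_t> out;
    for (size_t i = 0; i < s.size(); ) {
        uint8_t b0 = (uint8_t)s[i];
        if (b0 < 0x80) {                                  // 1 byte: plain ASCII
            out.push_back(b0);
            i += 1;
        } else if ((b0 & 0xE0) == 0xC0 && i + 1 < s.size()) {
            uint8_t b1 = (uint8_t)s[i + 1];               // 2 bytes: U+0080..U+07FF
            out.push_back(((b0 & 0x1F) << 6) | (b1 & 0x3F));
            i += 2;
        } else {                                          // 3/4-byte or broken input
            out.push_back('?');
            i += 1;
        }
    }
    return out;
}

Code points below 256 are identical to their ISO-8859-1 values, so "ü" (bytes 0xC3 0xBC in UTF-8) decodes to 252, which is where an 8-bit glyph table would expect it.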


Is there a way I can check which encoding it is?

I pass in the string like this:

std::string test = "This is a test";
// pass in to SpriteFont.draw with test.c_str()


Try this:

 

std::string test( "é" );

std::cout << test.size() << std::endl;

 

If it displays 2, then the string is very likely encoded in UTF-8. If it displays 1, then it's probably ISO-8859-1.
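Another quick check (my own variant, inside main with <cstdio> included) is to dump the raw bytes, which makes the encoding visible directly:

std::string test = "é";
for (unsigned char b : test)
    std::printf("%02X ", b);   // UTF-8 prints "C3 A9", ISO-8859-1 prints "E9"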

 

But if you're wondering whether there's a 100% sure-fire way to programmatically determine the encoding of a string you're given, there is none. If it's a string literal, you have to make an assumption about what the compiler will give you. If you read it from a text file, you have to know which encoding the file was saved in.
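One way to take the compiler out of the equation for literals (my own suggestion, not something mentioned above) is the C++11 u8 prefix, which guarantees the literal is stored as UTF-8 regardless of the source or execution character set:

std::string test = u8"äöü";   // always UTF-8 bytes (C++11 through C++17; C++20 changes u8 literals to char8_t)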

 

This is why some text formats such as XML have a header that explicitly indicates the encoding (for instance <?xml version="1.0" encoding="ISO-8859-1"?>).

Edited by Zlodo
