# How can I display Unicode characters like "ä" or "?" in my game?

## Recommended Posts

Hi,

I want to release my game in different languages and I don't know how to display all the characters of those languages, including English, German, Chinese, Swedish and so on. I have figured out how to display ASCII characters by putting them into a texture, but it is too much work to manually add all of the characters I need to one (or more) texture(s).

Is there a solution for this problem? Can I do it automatically by rendering the characters' SVG files into a texture and loading this texture into the game? But how can I do this? Or is there a simpler solution?

Greetings,

Magogan

PS: Sorry for my bad English :D

##### Share on other sites

A quick Google search gave me this:

http://sourceforge.net/projects/ftgl/

Edit: My bad, it turns out to be for OpenGL, not sure if it works with DX

Edit2: Another quick search gave me this:

DirectXTK (check the SpriteFont class)

http://directxtk.codeplex.com/

Edited by burhanloey

##### Share on other sites

Yes, but I use DirectX 11, not OpenGL. Additionally, that project was last updated in 2008, while Unicode was last updated in June 2014 (see http://en.wikipedia.org/wiki/Unicode#Versions)...

##### Share on other sites

The link is in my previous post.

##### Share on other sites
There are plenty of tools that can render any arbitrary glyphs supported by FreeType or other font engines into a texture. Or you can just use FreeType directly, if you need to support more characters than is reasonable for a bitmapped font.

AngelCode BMFont is a popular such tool.
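For what it's worth, BMFont's text-format `.fnt` output is just lines of `key=value` fields per glyph, so loading the glyph table yourself is easy. A minimal sketch (the function is illustrative, not BMFont's own code, and it ignores the XML and binary variants of the format):

```cpp
#include <map>
#include <sstream>
#include <string>

// Reads one "char" line of BMFont's text .fnt output into key/value pairs.
// Example line: char id=65 x=0 y=16 width=10 height=12 xadvance=11
std::map<std::string, int> parse_fnt_char_line(const std::string& line) {
    std::map<std::string, int> fields;
    std::istringstream ss(line);
    std::string token;
    ss >> token;                        // skip the leading "char" tag
    while (ss >> token) {
        auto eq = token.find('=');
        if (eq == std::string::npos) continue;
        fields[token.substr(0, eq)] = std::stoi(token.substr(eq + 1));
    }
    return fields;
}
```

Each parsed line then gives you the clip rectangle (`x`, `y`, `width`, `height`) and the horizontal advance (`xadvance`) for one character id.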

##### Share on other sites

> I want to release my game in different languages and I don't know how to display all the characters of those languages, including English, German, Chinese, Swedish and so on. I have figured out how to display ASCII characters by putting them into a texture, but it is too much work to manually add all of the characters I need to one (or more) texture(s).

> Is there a solution for this problem? Can I do it automatically by rendering the characters' SVG files into a texture and loading this texture into the game? But how can I do this? Or is there a simpler solution?

In general you need to specify ranges of characters, use something like FreeType to render their glyphs, use a bin packing algorithm to place the rendered glyphs into a texture, and store the texture along with a map of character codes (in Unicode, of course) to image clip regions. All this can be done programmatically with the exception of range specification. Tools like the one mentioned by Sean do this for you.
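The packing step in particular is simpler than it sounds: for glyph-sized rectangles, a basic shelf (row) packer is usually good enough. A rough sketch, with made-up names, of how rendered glyphs could be placed into a fixed-size atlas:

```cpp
#include <vector>

// Hypothetical glyph rectangle: input size (w, h), output position (x, y).
struct GlyphRect { int w, h, x, y; };

// Packs rectangles left-to-right into rows ("shelves") of a fixed-width
// atlas; opens a new shelf when a glyph no longer fits on the current row.
// Returns false if the atlas is too small.
bool shelf_pack(std::vector<GlyphRect>& glyphs, int atlasW, int atlasH) {
    int penX = 0, penY = 0, shelfH = 0;
    for (GlyphRect& g : glyphs) {
        if (penX + g.w > atlasW) {      // shelf full: start a new row
            penX = 0;
            penY += shelfH;
            shelfH = 0;
        }
        if (penY + g.h > atlasH || g.w > atlasW)
            return false;               // atlas too small
        g.x = penX;
        g.y = penY;
        penX += g.w;
        if (g.h > shelfH) shelfH = g.h; // shelf grows to its tallest glyph
    }
    return true;
}
```

Sorting the glyphs by height before packing reduces wasted space per shelf; tools like BMFont use more elaborate packers, but the idea is the same.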

BTW: Font files and SVG are different things, although both may use paths based on the same (or at least similar) primitives.

##### Share on other sites

DirectWrite uses a 2D render target. I don't know if and how I can use this for a 3D game with a 3D render target...

SpriteFont uses the Windows 8.1 SDK and DirectX 11.1 math. Will it work with Windows Vista, Windows 7 and Windows 8(.1) and on graphics cards that only support DirectX 11?

And is there a way to get the size of the text before it is rendered? I need to know the width of each word (or letter) to implement scrolling, so how can I do this with DirectWrite or SpriteFont?

Edited by Magogan

##### Share on other sites

> DirectWrite uses a 2D render target. I don't know if and how I can use this for a 3D game with a 3D render target...

Combining DirectWrite/Direct2D with D3D11 is definitely possible. I used them both for a project a while ago (it used SharpDX, but that doesn't really make a difference). Unfortunately, I don't have access to that project right now.

> SpriteFont uses the Windows 8.1 SDK and DirectX 11.1 math. Will it work with Windows Vista, Windows 7 and Windows 8(.1) and on graphics cards that only support DirectX 11?

DirectXTK should be usable on all platforms that support DX11 (you can still use a lower feature level). According to the Codeplex page, Vista SP2 and higher are supported.

> And is there a way to get the size of the text before it is rendered? I need to know the width of each word (or letter) to implement scrolling, so how can I do this with DirectWrite or SpriteFont?

SpriteFont provides a MeasureString method that returns the size of a given string in pixels.

DirectWrite lets you get all kinds of information about the text to be rendered and implements many formatting options.
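For a bitmap font you can compute the same thing yourself: once you have per-character advances from the atlas, measuring a string is just summing them. A simplified sketch (the function name and the fallback-advance idea are illustrative, not SpriteFont's actual implementation):

```cpp
#include <string>
#include <unordered_map>

// Sums the horizontal advance of each codepoint in pixels. Characters
// missing from the table get a fallback advance (e.g. the width of a
// placeholder glyph).
int measure_width(const std::u32string& text,
                  const std::unordered_map<char32_t, int>& advances,
                  int fallback) {
    int w = 0;
    for (char32_t c : text) {
        auto it = advances.find(c);
        w += (it != advances.end()) ? it->second : fallback;
    }
    return w;
}
```

Measuring each word this way is all you need for the scrolling case: compare accumulated widths against the visible region before issuing any draw calls.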

##### Share on other sites
I found a solution for rendering DirectWrite to a D3D 10.1 texture: http://msdn.microsoft.com/en-us/library/windows/desktop/dd370966(v=vs.85).aspx#example__use_direct2d_content_as_a_texture

But I don't know how to access this texture with D3D 11. Is there a way to convert a D3D 10.1 texture into a D3D 11 texture? I cannot "simply" draw directly to the surface because I want to use text on signs located somewhere in the 3D world.

##### Share on other sites

A user also recently posted a TTF library:

http://www.gamedev.net/topic/659230-font-rendering/

Perhaps you could take a look at that as well. I think he would appreciate it.

Otherwise, DirectWrite.

Note that I don't suggest using FreeType: it renders glyphs very well, but you're still on your own when it comes to Unicode text layout (ouch!). The OS typography functions are needed here.

##### Share on other sites
```cpp
#include "D2DClass.h"

ID2D1Factory* D2DClass::m_ID2D1Factory;
ID2D1HwndRenderTarget* D2DClass::m_pRenderTarget;
ID3D10Device1* D2DClass::m_D3D10Device1;

bool D2DClass::Initialize(ID3D11Device* pDevice11){
	HRESULT hr = D2D1CreateFactory(D2D1_FACTORY_TYPE_SINGLE_THREADED, __uuidof(ID2D1Factory), NULL, (void**)(&m_ID2D1Factory));
	if (FAILED(hr)){
		return false;
	}

	// Get the DXGI adapter of the D3D11 device, so the D3D10.1 device is
	// created on the same adapter (required for sharing resources).
	IDXGIDevice* pDXGIDevice;
	pDevice11->QueryInterface<IDXGIDevice>(&pDXGIDevice);

	IDXGIAdapter* pAdapter;
	pDXGIDevice->GetAdapter(&pAdapter);
	SAFE_RELEASE(pDXGIDevice);

	hr = D3D10CreateDevice1(pAdapter, D3D10_DRIVER_TYPE_HARDWARE, NULL, D3D10_CREATE_DEVICE_BGRA_SUPPORT | D3D10_CREATE_DEVICE_DEBUG, D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &m_D3D10Device1);
	SAFE_RELEASE(pAdapter);

	return SUCCEEDED(hr);
}

bool D2DClass::CreateTextureRenderTarget(ID3D11Device* pDevice11, int TextureWidth, int TextureHeight, ID2D1RenderTarget **RenderTarget, ID3D11Texture2D **D3D11Texture2D){
	// Create a shareable texture on the D3D10.1 device.
	D3D10_TEXTURE2D_DESC tDesc;
	tDesc.Width = TextureWidth;
	tDesc.Height = TextureHeight;
	tDesc.MipLevels = 1;
	tDesc.ArraySize = 1;
	tDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
	tDesc.SampleDesc.Count = 1;
	tDesc.SampleDesc.Quality = 0;
	tDesc.Usage = D3D10_USAGE_DEFAULT;
	tDesc.BindFlags = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;
	tDesc.CPUAccessFlags = 0;
	tDesc.MiscFlags = D3D10_RESOURCE_MISC_SHARED;

	ID3D10Texture2D *D3D10Texture2D;
	HRESULT hr = m_D3D10Device1->CreateTexture2D(&tDesc, NULL, &D3D10Texture2D);

	// Get the DXGI resource and retrieve the sharing handle.
	IDXGISurface *pDXGISurf;
	IDXGIResource *pDXGIRes;
	HANDLE ShareHandle;

	hr = D3D10Texture2D->QueryInterface<IDXGISurface>(&pDXGISurf);
	hr = pDXGISurf->QueryInterface<IDXGIResource>(&pDXGIRes);
	hr = pDXGIRes->GetSharedHandle(&ShareHandle);

	SAFE_RELEASE(pDXGIRes);
	SAFE_RELEASE(pDXGISurf);

	// Open the shared texture on the D3D11 side.
	ID3D11Resource *pD3D11Res;
	hr = pDevice11->OpenSharedResource(ShareHandle, __uuidof(ID3D11Resource), (void**)&pD3D11Res);
	hr = pD3D11Res->QueryInterface<ID3D11Texture2D>(D3D11Texture2D);
	SAFE_RELEASE(pD3D11Res);

	// Get a DXGI surface from the D3D11 texture for D2D to draw into.
	IDXGISurface1* pRT10;
	hr = (*D3D11Texture2D)->QueryInterface<IDXGISurface1>(&pRT10);

	FLOAT dpiX;
	FLOAT dpiY;
	m_ID2D1Factory->GetDesktopDpi(&dpiX, &dpiY);

	// Create a D2D render target that renders into the DXGI surface.
	D2D1_RENDER_TARGET_PROPERTIES props = D2D1::RenderTargetProperties(
		D2D1_RENDER_TARGET_TYPE_DEFAULT,
		D2D1::PixelFormat(DXGI_FORMAT_UNKNOWN, D2D1_ALPHA_MODE_IGNORE),
		dpiX,
		dpiY);

	hr = m_ID2D1Factory->CreateDxgiSurfaceRenderTarget(pRT10, &props, RenderTarget);
	SAFE_RELEASE(pRT10);
	return SUCCEEDED(hr);
}
```


CreateDxgiSurfaceRenderTarget returns E_INVALIDARG, but I can't figure out why. Does anyone have an idea where the error in this code could be? All the other functions return S_OK...

I found the code at http://xboxforums.create.msdn.com/forums/p/103939/615166.aspx and modified it a little bit.

(I first call Initialize and then CreateTextureRenderTarget in my application.)

Edit: Even with the code from http://www.gamedev.net/topic/639578-d2d-d3d11-help-me-to-make-them-work-together/#entry5039276 I get E_INVALIDARG

What am I doing wrong?

Edit: IT WORKS!!! I just had to set D3D11_CREATE_DEVICE_BGRA_SUPPORT when creating the D3D11 device.

Edited by Magogan

##### Share on other sites

Keep in mind that the Platform Update might not be installed everywhere... it must be installed manually, or it is installed automatically with IE 11 (but who uses IE anymore?). It is not compatible with any of the debugging tools from the DirectX SDK (June 2010), only with the Windows 8 SDK, and there is also a list of known issues on its MSDN download page. I had some issues with it breaking PIXRun, so I decided to remove it - IIRC that was because it removed the DX10 debug layer DLL, and when I added that back it still didn't work. But now I'm considering using it again, because DirectWrite is way better than bitmap fonts.

Here's hoping DirectX 12 will not have the same problems.

> `hr = D3D10CreateDevice1`

Sorry, I thought you were going to try rendering directly to the DX11 device with the Platform Update... Maybe that was a different thread. :)

Edited by tonemgub

##### Share on other sites

I strongly advise against bitmap fonts for Unicode, as the files can and do get huge. Also, if you do end up using a bitmap font, you must either use a monospace font or embed kerning information along with the clips. Otherwise, your text will look terribly ugly. IIRC, some of the libraries you were pointed at do use kerning, but it's been some time since I poked around in their code bases.
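To illustrate what embedding kerning means in practice, here is a rough sketch of pair kerning applied during layout. The table, the single fixed advance, and the kern values are made up for illustration; real fonts have per-glyph advances and get the pair table from the font file:

```cpp
#include <map>
#include <string>
#include <utility>

// Sums advances and adds a per-pair kerning adjustment for each adjacent
// pair of codepoints found in the kerning table.
int layout_width(const std::u32string& text, int advance,
                 const std::map<std::pair<char32_t, char32_t>, int>& kern) {
    int w = 0;
    for (size_t i = 0; i < text.size(); ++i) {
        w += advance;
        if (i + 1 < text.size()) {
            auto it = kern.find({text[i], text[i + 1]});
            if (it != kern.end())
                w += it->second;  // negative values pull glyphs together
        }
    }
    return w;
}
```

Without the adjustment, pairs like "AV" or "To" leave a visible gap; that is exactly the ugliness described above.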

I've been using DirectWrite for quite a while, and it can achieve amazing results (you can probably duplicate everything Photoshop can do with fonts). Font rendering is superb, and it can render to a transparent background without graphical artifacts (something most font rendering APIs can't do, or can do only in software).

Regardless of whether you use FreeType (I haven't used it, but from looking at the API it doesn't seem capable of producing outlines, gradients or other effects like that - just shadows, which any rendering API can do), DirectWrite or something else, I advise you to render text to a texture. Then you can transform it any way you like, or apply it wherever you need it. If you cache all text during level loading, rendering may end up faster than blitting individual glyphs. You can do transforms with bitmap fonts as well, but the code is more complicated than just passing a texture to the GPU.

DirectXTK is a nice little library with an added bonus that it works on Xbox One. I *think* it uses kerning (may be called advances or other names), you need to look at the code to be sure. It also supports loading textures and doing some simple 2D, which is a nice bonus for game UI.
