Font rendering using GDI & D3D10

5 comments, last by Michael Anthony Wion 11 years, 11 months ago
I've heard from multiple sources that the best way to get good performance when rendering 2D text to the screen is to roll your own font class. The font class should keep a cache of the text being drawn, and only create new resources when the specified text doesn't already exist in the cache. This makes perfect sense to me.
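The caching idea described above can be sketched with a plain hash map. This is a hypothetical illustration, not anyone's actual class: the int id just stands in for whatever resource (e.g. an ID3D10ShaderResourceView*) you'd create per string.

```cpp
#include <string>
#include <unordered_map>

// Hypothetical text cache: create a resource only on a cache miss,
// reuse it on every subsequent draw of the same string.
struct TextCache
{
    std::unordered_map<std::wstring, int> entries;
    int nextId = 0; // stand-in for real resource creation

    int Get(const std::wstring& text)
    {
        auto it = entries.find(text);
        if (it != entries.end())
            return it->second;      // cache hit -- no new resources
        int id = nextId++;          // cache miss -- "create" the resource
        entries.emplace(text, id);
        return id;
    }
};
```

Drawing the same string twice then only pays the GDI/texture-creation cost once.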

As for the implementation, I figured I would create the font & text using GDI (or possibly GDI+), then slap it onto a texture in D3D, freeing GDI resources as I go along. I've also read that this is basically how ID3DXFont does the job. My question, however, is how would you get the image from a GDI device context onto an ID3D10ShaderResourceView texture? And would it be better to use GDI+ instead of GDI?

Thank you for your thoughts.
So just as a general test of ideas, I tried this:


bool TextClass::Initialize(ID3D10Device* device)
{
    const int bmpWidth  = 64;
    const int bmpHeight = 16;
    LPCWSTR strText = TEXT("Test");
    RECT rcText = { 0, 0, bmpWidth, bmpHeight };

    // Build a 32-bit DIB section we can draw into with GDI.
    // Note: a positive biHeight means a bottom-up DIB.
    HDC hDC = CreateCompatibleDC(NULL);
    DWORD* pSrcData = NULL;
    BITMAPINFO bmi = { { sizeof(BITMAPINFOHEADER), bmpWidth, bmpHeight, 1, 32, BI_RGB, 0, 0, 0, 0, 0 } };
    HBITMAP hTempBmp = CreateDIBSection(hDC, &bmi, DIB_RGB_COLORS, (void**)&pSrcData, NULL, 0);
    HFONT NewFont = CreateFont(0, 0, 0, 0, FW_DONTCARE, FALSE, FALSE, FALSE,
                               DEFAULT_CHARSET, OUT_OUTLINE_PRECIS, CLIP_DEFAULT_PRECIS,
                               CLEARTYPE_QUALITY, VARIABLE_PITCH, TEXT("Impact"));

    // The bitmap must be selected into the DC, or DrawText has nothing to draw on.
    HGDIOBJ hOldBmp  = SelectObject(hDC, hTempBmp);
    HGDIOBJ hOldFont = SelectObject(hDC, NewFont);
    SetTextColor(hDC, RGB(255, 0, 0)); // DrawText uses the DC's text color, not a brush
    SetBkMode(hDC, TRANSPARENT);
    DrawText(hDC, strText, -1, &rcText, DT_LEFT | DT_WORDBREAK);
    GdiFlush(); // make sure GDI has finished writing to pSrcData

    SelectObject(hDC, hOldFont);
    SelectObject(hDC, hOldBmp);
    DeleteObject(NewFont);
    DeleteDC(hDC); // a DC from CreateCompatibleDC is freed with DeleteDC, not ReleaseDC
    // hTempBmp is not deleted yet -- pSrcData points into its memory.

    D3DX10_IMAGE_LOAD_INFO info;
    info.Width          = rcText.right - rcText.left;
    info.Height         = rcText.bottom - rcText.top;
    info.Depth          = 1;
    info.FirstMipLevel  = 0;
    info.MipLevels      = 1;
    info.Usage          = D3D10_USAGE_DEFAULT;
    info.BindFlags      = D3D10_BIND_SHADER_RESOURCE;
    info.CpuAccessFlags = 0;
    info.MiscFlags      = 0;
    info.Format         = DXGI_FORMAT_R8G8B8A8_UNORM;
    info.Filter         = D3DX10_FILTER_NONE;
    info.MipFilter      = D3DX10_FILTER_NONE;
    info.pSrcInfo       = NULL;
    if (FAILED(D3DX10CreateTextureFromMemory(device, pSrcData,
               (4 * info.Width) * info.Height, &info, NULL, &m_pTexture, NULL)))
    {
        DeleteObject(hTempBmp);
        return false;
    }
    DeleteObject(hTempBmp);

    D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc;
    SecureZeroMemory(&srvDesc, sizeof(srvDesc));
    m_pTexture2D = (ID3D10Texture2D*)m_pTexture;
    srvDesc.Format = info.Format;
    srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2D;
    srvDesc.Texture2D.MostDetailedMip = 0;
    srvDesc.Texture2D.MipLevels = info.MipLevels;
    if (FAILED(device->CreateShaderResourceView(m_pTexture, &srvDesc, &m_pSRView)))
    {
        return false;
    }
    return true;
}

... But it always fails on D3DX10CreateTextureFromMemory. I don't know if this is a compatibility issue or if I'm just going about this improperly, but AFAIK it looks like it should work.
I use http://fw1.codeplex.com/ (the source code is included, in case you really want to roll your own).

Easy-to-use library (make sure you enable the "restore state" option). Unless you're purely doing 2D font rendering, performance doesn't really matter here.

... But it always fails on D3DX10CreateTextureFromMemory. I don't know if this is a compatibility issue or if I'm just going about this improperly, but AFAIK it looks like it should work.


D3DX10CreateTextureFromMemory expects an image file that has been loaded into memory, not a pointer to the raw texture data. For that, just create a Texture2D and pass the image data through the pInitialData parameter of CreateTexture2D.
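A minimal sketch of that suggestion, assuming the pSrcData/bmpWidth/bmpHeight from the code earlier in the thread (and a valid ID3D10Device* device). One caveat worth hedging: GDI's 32-bit DIBs store pixels as BGRA, so with DXGI_FORMAT_R8G8B8A8_UNORM the red and blue channels may appear swapped unless you swizzle the data (or use a BGRA format where the hardware supports it).

```cpp
// Describe a 1-mip, shader-visible 2D texture matching the DIB.
D3D10_TEXTURE2D_DESC desc;
ZeroMemory(&desc, sizeof(desc));
desc.Width            = bmpWidth;
desc.Height           = bmpHeight;
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM; // note: GDI actually writes BGRA
desc.SampleDesc.Count = 1;
desc.Usage            = D3D10_USAGE_DEFAULT;
desc.BindFlags        = D3D10_BIND_SHADER_RESOURCE;

// Hand the raw DIB pixels to the texture at creation time.
D3D10_SUBRESOURCE_DATA initData;
initData.pSysMem          = pSrcData;
initData.SysMemPitch      = bmpWidth * 4; // bytes per row of the 32-bit DIB
initData.SysMemSlicePitch = 0;

ID3D10Texture2D* pTex = NULL;
HRESULT hr = device->CreateTexture2D(&desc, &initData, &pTex);
```

No D3DX involved at all: the data goes straight from the DIB section into the texture.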

I have a really basic bitmap font generator that I use for my samples, which I only wrote because at the time there was nothing else available for D3D11. It's really really basic and frankly not very great, but you can download it from one of the samples on my blog if you want a rough idea of how to generate a bitmap font texture using GDI+ and how to render it with a sprite renderer. For an actually good reference you can check FW1 as the above poster suggests, or you can check out the new D3D11 SpriteFont class created by Shawn Hargreaves.
I'm using D3D10, so if it were possible to use those suggestions, I'd imagine porting them would be a rather painful process.


For that, just create a Texture2D and pass the image data through the pInitialData parameter of CreateTexture2D.

Thanks! I was able to get the fonts generated, cached, and even added into a resource view. When I save the bitmap data out to a file, it comes out precisely as expected.
But when I try to render the resource view, I see nothing being rendered. Is there a simple way to output the texture data to a file from a resource?
Just trying to make sure the data is actually copying over properly before continuing on. If it saves as expected, then it's probably just my matrix/shader setup.
You can use D3DX10SaveTextureToFile, or you can just use PIX to capture a frame and inspect the texture.
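For the file route, the call is a one-liner; the filename here is just an example.

```cpp
// Dump the texture to disk to verify the GDI data actually made it across.
if (FAILED(D3DX10SaveTextureToFile(m_pTexture, D3DX10_IFF_PNG, L"font_debug.png")))
{
    // texture couldn't be saved -- inspect hr / debug layer output
}
```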
Thanks! It appears to have copied over properly, except the image is rendered upside-down, so I assume I'll need to reverse the pixel order in the GDI image before copying, unless there's a more efficient way to handle this.

EDIT: In the BITMAPINFOHEADER, setting biHeight to a negative value gives you a top-down DIB, which reverses the image along the y-axis and fixes the flip.
All is good, thanks again!
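The negative biHeight is the cheap fix; for completeness, the manual row reversal the poster first considered can be sketched portably like this (plain C++, not tied to GDI):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Flip a 32-bit image vertically in place -- what you'd do if you
// couldn't get a top-down DIB via a negative biHeight.
void FlipRows(uint32_t* pixels, int width, int height)
{
    std::vector<uint32_t> row(width);
    const size_t rowBytes = width * sizeof(uint32_t);
    for (int y = 0; y < height / 2; ++y)
    {
        uint32_t* top    = pixels + y * width;
        uint32_t* bottom = pixels + (height - 1 - y) * width;
        std::memcpy(row.data(), top, rowBytes);    // save top row
        std::memcpy(top, bottom, rowBytes);        // bottom -> top
        std::memcpy(bottom, row.data(), rowBytes); // saved top -> bottom
    }
}
```

It's an extra O(width x height) pass per upload, which is why the negative-biHeight trick is the better answer when GDI is producing the pixels anyway.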

This topic is closed to new replies.
