Render overlays using GDI+ on a Direct3D9 app

Started by dario_ramos. 21 comments, last by dario_ramos 14 years, 4 months ago.
Hi everyone, I have a Direct3D9 application which loads images onto textures, and my goal is to be able to draw overlay shapes (rectangles, circles, polygons, and so on). At first, I tried to do it using D3D, but I was new to D3D, and the legacy code was very nasty. And so, in the D3D render cycle, after rendering the image, I did this:

// Inside render loop (i.e. between BeginScene and EndScene)
while ( !m_RectangleBuffer.empty() ){
   t_DrawRectangleCall drc = m_RectangleBuffer.front();
   Gdiplus::Color penColor;
   Gdiplus::Pen pen(Gdiplus::Color::Black);
   HDC hdc;
   m_pBackBuffer->GetDC(&hdc);
   Gdiplus::Graphics g(hdc);
   g.SetSmoothingMode( Gdiplus::SmoothingModeAntiAlias );
   Gdiplus::Matrix transform;
   transform.SetElements( m_WorldTransform.eM11, m_WorldTransform.eM12, 
                          m_WorldTransform.eM21, m_WorldTransform.eM22, 
                          m_WorldTransform.eDx, m_WorldTransform.eDy );
   g.SetTransform( &transform );
   // Render rectangle
   penColor.SetFromCOLORREF( RGB(drc.color.r, drc.color.g, drc.color.b) );
   pen.SetColor( penColor );
   g.DrawRectangle(&pen, (INT)drc.x, (INT)drc.y,
                   (INT)drc.dwWidth, (INT)drc.dwHeight);
   m_pBackBuffer->ReleaseDC(hdc);
   m_RectangleBuffer.pop();
}

I know it's nasty, but it worked. However, it's too slow, since I had to make the back buffer lockable and call IDirect3DSurface9::GetDC() for each shape. Moreover, now that I'm trying to program a collimator with a similar approach (using Gdiplus::GraphicsPath), the performance becomes unacceptable. From what I gathered from the web, a better approach would be to render all the shapes onto a Gdiplus::Bitmap instead of directly onto the back buffer, and later, somehow, attach that bitmap to the back buffer all at once (this might require alpha blending, and IDirect3DSurface9::GetDC() has problems with that...). Problem is... how can I do that? I've been fooling around with a D3D sample for hours and googled everywhere, but found only vague descriptions and documentation.
This is why Direct2D was invented; unfortunately it's only on Vista+ (or Windows 7+, I'm not sure offhand).

If you need to alpha blend against 3D, then it's going to be slow - you need to render 3D, serialize the CPU and GPU, then lock the backbuffer, transfer the contents to the CPU, do the rendering/blending, transfer the data back to the GPU, and unlock the backbuffer.

It's easier if you don't need the alpha blending: you can create a D3DPOOL_MANAGED texture with one mip level, render into that with GDI+ (either with GetDC() or by memcpy()ing in from a DIB section), and then render a textured quad using that texture. You can get better performance (at the cost of some latency in the 2D) by using two or more textures in round-robin fashion, maximizing the time the display driver has to transfer the contents to the GPU.
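
A minimal sketch of that single-texture version (g_pDevice and g_pOverlayTex are placeholder names, not from any particular sample):

// Once, at startup: one mip level, a GetDC()-friendly format
IDirect3DTexture9* g_pOverlayTex = NULL;
g_pDevice->CreateTexture( 512, 512, 1, 0, D3DFMT_X8R8G8B8,
                          D3DPOOL_MANAGED, &g_pOverlayTex, NULL );

// Whenever the overlay changes: draw into the texture with GDI+
IDirect3DSurface9* pSurf = NULL;
g_pOverlayTex->GetSurfaceLevel( 0, &pSurf );
HDC hdc;
if( SUCCEEDED( pSurf->GetDC( &hdc ) ) )
{
    Gdiplus::Graphics g( hdc );
    g.Clear( Gdiplus::Color::Black );
    Gdiplus::Pen pen( Gdiplus::Color::Red );
    g.DrawRectangle( &pen, 10, 10, 100, 50 );
    pSurf->ReleaseDC( hdc );
}
pSurf->Release(); // GetSurfaceLevel() AddRef'd the surface

// Then, inside the render loop, draw a textured quad using g_pOverlayTex.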
Hi Steve,
Thanks for your reply!
Direct2D is cool, but alas, my target will be XP for a long time.
I'm not sure I followed you very well, since I'm still a rookie in Direct3D. Any pointers to good literature/references would be appreciated (the SDK documentation is lacking at times).
If I understood correctly, you're suggesting that if alpha blending is unavoidable, I should render using GetDC(). But isn't that what I'm doing, and without alpha blending? The problem with that approach is that it's too slow. Please correct me if I'm wrong.
If I'm not going to use GetDC(), I need to render "offline" to a texture or something similar. But that brings up the problem of alpha blending. What some guy suggested on another forum is this:

(excerpted from http://www.eggheadcafe.com/software/aspnet/29542140/idirect3dsurface9getdc.aspx)

- render to a color bitmap with GDI+
- get the DIB section for the bitmap
- get the surface for a level of an A8R8G8B8 texture
- use D3DXLoadSurfaceFromMemory with an appropriate colorkey parameter
to load the surface
- generate mipmaps if necessary
- draw with the texture

But I couldn't find a way to do all this. Maybe I'm just burnt out; gonna take a rest and try again tomorrow.
Quote:Original post by dario_ramos
If I understood correctly, you're suggesting that if alpha blending is unavoidable, I should render using GetDC(). But isn't that what I'm doing, and without alpha blending? The problem with that approach is that it's too slow. Please correct me if I'm wrong.
If you need alpha support from your GDI+ stuff, then it's going to be more difficult. GetDC() doesn't support alpha, so you'd have to create a DIB section to get access to the raw pixels coming out of GDI+, then Lock() a D3D surface, copy the data across (possibly doing some conversion), and then Unlock() the surface.
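
A minimal sketch of that lock-copy-unlock step, assuming a 128x128 32bpp top-down DIB section and a 128x128 D3DFMT_A8R8G8B8 texture (pTexture and pDibBits are placeholder names):

// Copy the DIB section's pixels into mip level 0 of the texture
IDirect3DSurface9* pSurf = NULL;
pTexture->GetSurfaceLevel( 0, &pSurf );
D3DLOCKED_RECT lr;
if( SUCCEEDED( pSurf->LockRect( &lr, NULL, 0 ) ) )
{
    const BYTE* src = (const BYTE*)pDibBits;
    BYTE* dst = (BYTE*)lr.pBits;
    for( int y = 0; y < 128; ++y )
    {
        memcpy( dst, src, 128 * 4 ); // both sides are 32bpp BGRA rows here
        src += 128 * 4;              // DIB rows are tightly packed at this width
        dst += lr.Pitch;             // D3D rows may be padded, so step by Pitch
    }
    pSurf->UnlockRect();
}
pSurf->Release();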

If your GDI+ stuff doesn't need to use alpha, then you can use GetDC() to get access to a D3D surface directly, and avoid the lock, copy and unlock step. Note that even colour keying is still alpha (1 bit alpha).

Quote:Original post by dario_ramos
If I'm not going to use GetDC(), I need to render "offline" to a texture or something similar. But that brings up the problem of alpha blending. What some guy suggested on another forum is this:

(excerpted from http://www.eggheadcafe.com/software/aspnet/29542140/idirect3dsurface9getdc.aspx)

- render to a color bitmap with GDI+
- get the DIB section for the bitmap
- get the surface for a level of an A8R8G8B8 texture
- use D3DXLoadSurfaceFromMemory with an appropriate colorkey parameter
to load the surface
- generate mipmaps if necessary
- draw with the texture

But I couldn't find a way to do all this. Maybe I'm just burnt out; gonna take a rest and try again tomorrow.
That looks good to me (The D3DXLoadSurfaceFromMemory call will internally do the lock, copy, unlock, and colour keying).

What part are you having difficulty with?
To create a DIB section, you'll want to use CreateDIBSection. That gives you a handle to a bitmap, plus raw access to its pixels. You can then select that HBITMAP into a device context, render as usual, and call GdiFlush. After that, you can access the pixels through the pointer filled in by the CreateDIBSection() call.
To get the surface level for a texture, you want to call GetSurfaceLevel(0, ...) on that texture. That gives you an IDirect3DSurface9* interface back. Make sure you Release() the surface when you're done with it, since it increments the reference count on the parent texture.
The docs for D3DXLoadSurfaceFromMemory should be sufficient for that step.
When you create the texture in the first place, you only want one mip level, unless you're going to apply it to 3D geometry, so you can probably skip the mipmap step.
To draw with the texture, set the texture in stage 0 (SetTexture(0, pTexture)), then draw a screen space quad with the texture coordinates set appropriately.
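
Putting those steps together, a condensed sketch (all names are placeholders):

// 1 - Create a 32bpp DIB section; negative height = top-down rows
// (a bottom-up DIB would come out vertically flipped when loaded)
void* pBits = NULL;
BITMAPINFO bmi;
ZeroMemory( &bmi, sizeof(BITMAPINFO) );
bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
bmi.bmiHeader.biWidth       = 256;
bmi.bmiHeader.biHeight      = -256;
bmi.bmiHeader.biPlanes      = 1;
bmi.bmiHeader.biBitCount    = 32;
bmi.bmiHeader.biCompression = BI_RGB;
HDC hdc = CreateCompatibleDC( NULL );
HBITMAP hbm = CreateDIBSection( hdc, &bmi, DIB_RGB_COLORS, &pBits, NULL, 0 );

// 2 - Select it into the DC and render with GDI+
HGDIOBJ hOld = SelectObject( hdc, hbm );
{
    Gdiplus::Graphics g( hdc );
    g.Clear( Gdiplus::Color::Black );
    Gdiplus::Pen pen( Gdiplus::Color::Red );
    g.DrawLine( &pen, 0, 0, 255, 255 );
}
GdiFlush(); // make sure GDI is done before reading pBits

// 3 - Upload into mip level 0 of an A8R8G8B8 texture, colour-keying black
IDirect3DSurface9* pSurf = NULL;
pTexture->GetSurfaceLevel( 0, &pSurf );
RECT src = { 0, 0, 256, 256 };
D3DXLoadSurfaceFromMemory( pSurf, NULL, NULL, pBits, D3DFMT_A8R8G8B8,
                           256 * 4, NULL, &src, D3DX_FILTER_NONE,
                           D3DCOLOR_ARGB(255, 0, 0, 0) );
pSurf->Release();

// 4 - At draw time, SetTexture(0, pTexture) and draw the screen space quad
SelectObject( hdc, hOld ); // restore the DC when done with the bitmap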
Hi Steve,
I've done all those steps, but it still doesn't do what I want (at least it doesn't crash [grin]).
I took one of the D3D samples (the 2nd one from the D3D tutorial, I believe): the one in which a lit triangle is drawn over a blue background.
I expected to see a red line drawn across the whole screen (keeping the triangle, of course), but instead I get a black rectangle in the top left corner, the triangle turns black, and there's no line. Here's the code for the six steps mentioned before:


1. Create bitmap
2. Select bitmap into a compatible DC
3. Render as usual
4. Update the texture's surface
5. Set the texture on stage 0
6. Render a screen-space textured quad

Steps 1-2: After initializing the device...
// TODO: GDI+ stuff
Gdiplus::GdiplusStartupInput gsi;
ULONG_PTR gdiplusToken;
Gdiplus::GdiplusStartup( &gdiplusToken, &gsi, NULL );

// 1 - Create bitmap
HDC g_OfflineHDC = CreateCompatibleDC( ::GetDC(::GetActiveWindow()) );
BITMAPINFO bmi;
ZeroMemory( &bmi, sizeof(BITMAPINFO) );
bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
bmi.bmiHeader.biWidth       = 128;
bmi.bmiHeader.biHeight      = 128;
bmi.bmiHeader.biPlanes      = 1;
bmi.bmiHeader.biBitCount    = 32; // Bytes per pixel: 4
bmi.bmiHeader.biCompression = BI_RGB;
bmi.bmiHeader.biSizeImage   = bmi.bmiHeader.biWidth * bmi.bmiHeader.biHeight * 4; // Bytes per pixel: 4
HBITMAP hBitmap = CreateDIBSection( g_OfflineHDC, &bmi, DIB_RGB_COLORS, &g_pDIBData, NULL, 0x0 );

// 2 - Select the bitmap into the DC
SelectObject( g_OfflineHDC, hBitmap );


Steps 3-4: Anywhere outside the render loop (I wanna render offline)
// 3 - Render as usual
Gdiplus::Graphics g( g_OfflineHDC );
Gdiplus::Pen pen( Gdiplus::Color::Red );
g.Clear( Gdiplus::Color::Black );
g.DrawLine( &pen, 0, 0, 128, 128 );

// 4 - Update the texture's surface
LPDIRECT3DSURFACE9 pOORTSurface = NULL;
g_pOORT->GetSurfaceLevel( 0, &pOORTSurface );
RECT rect;
rect.left = 0;    rect.top = 0;
rect.right = 128; rect.bottom = 128;
D3DXLoadSurfaceFromMemory(
    pOORTSurface,
    NULL,
    NULL,
    (LPCVOID)g_pDIBData,
    D3DFMT_A8R8G8B8,
    128 * 4, // Bytes per pixel: 4
    NULL,
    &rect,
    D3DX_DEFAULT,
    D3DCOLOR_ARGB(255, 0, 0, 0) ); // Replace opaque black with transparent black
pOORTSurface->Release();


Steps 5-6: Inside render loop, between BeginScene() and EndScene():
// 5 - Set the texture on stage 0
g_pd3dDevice->SetTexture( 0, g_pOORT );

// 6 - Render a screen space quad
g_pd3dDevice->SetStreamSource( 0, g_pVBTQ, 0, sizeof(CUSTOMVERTEX2) );
g_pd3dDevice->SetFVF( D3DFVF_CUSTOMVERTEX2 );
g_pd3dDevice->DrawPrimitive( D3DPT_TRIANGLEFAN, 0, 2 );


I suspect it has something to do with my vertex setup, so here it is (this is also done in the initialization stage):
CUSTOMVERTEX2 Vertices2[] =
{
    // x, y, z, rhw, colour, u, v
    {      0,      0, 0, 1.0f, D3DCOLOR_ARGB(0,0,0,0),    0,    0 },
    { 128.0f,      0, 0, 1.0f, D3DCOLOR_ARGB(0,0,0,0), 1.0f,    0 },
    { 128.0f, 128.0f, 0, 1.0f, D3DCOLOR_ARGB(0,0,0,0), 1.0f, 1.0f },
    {      0, 128.0f, 0, 1.0f, D3DCOLOR_ARGB(0,0,0,0),    0, 1.0f }
};
if( FAILED( g_pd3dDevice->CreateVertexBuffer( 4 * sizeof( CUSTOMVERTEX2 ),
                                              D3DUSAGE_WRITEONLY, // TODO: Fixes D3D warning
                                              D3DFVF_CUSTOMVERTEX2,
                                              D3DPOOL_DEFAULT,
                                              &g_pVBTQ,
                                              NULL ) ) )
{
    return E_FAIL;
}
VOID* pVertices2;
if( FAILED( g_pVBTQ->Lock( 0, sizeof( Vertices2 ), ( void** )&pVertices2, 0 ) ) )
    return E_FAIL;
memcpy( pVertices2, Vertices2, sizeof( Vertices2 ) );
g_pVBTQ->Unlock();


Damn! That was a long post [lol]
I corrected a few things:
- I called GdiFlush() right after calling DrawLine
- I set the textured quad's size and all related values to 300x300 (the window size; I had wrongly assumed it was 128x128). As expected, I now get a black screen (the textured quad covers the whole window).

I think the problem is that whatever I draw on the Gdiplus::Graphics doesn't affect the pixel data. I thought associating it with the HDC was enough?

Edit: Maybe I wasn't clear... It still doesn't work! I don't wanna see a black texture covering the screen; I wanna see a red line drawn across the triangle (that's why I Clear()'ed the Graphics to black, drew a red line, and told D3DXLoadSurfaceFromMemory to replace opaque black with transparent black).

[Edited by - dario_ramos on November 20, 2009 11:06:57 AM]
Bump! I still can't get it to work... Any ideas?
You'll need to narrow down what part is failing. After rendering, you could save the bitmap bits out to a file, and see if the image is there. If it's not, then you know something is wrong in the GDI rendering part.

If that works, you can use D3DXSaveSurfaceToFile() to check that D3DXLoadSurfaceFromMemory() is working as expected, and/or D3DXSaveTextureToFile() to check that the texture is in the state you expect.
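
For example, using the names from the earlier posts:

// After D3DXLoadSurfaceFromMemory(), dump the surface...
D3DXSaveSurfaceToFile( L"surface.png", D3DXIFF_PNG, pOORTSurface, NULL, NULL );
// ...or dump the whole texture
D3DXSaveTextureToFile( L"texture.png", D3DXIFF_PNG, g_pOORT, NULL );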

If you get that far and the texture seems valid, then there must be a problem with the rendering; texture coordinates, lighting, etc.

Also, the Debug Runtimes are useful for checking that D3D functions are all succeeding as expected.
Alright, so now I'm failing to save the bitmap to a file:

void RenderOverlays()
{
    // 3 - Render as usual
    Gdiplus::Graphics g( g_OfflineHDC );
    g.Clear( Gdiplus::Color::Yellow );
    Gdiplus::Pen pen( Gdiplus::Color::Red );
    g.DrawLine( &pen, 10, 10, 120, 120 );
    ::GdiFlush();

    // 3' - Send bitmap to file to see if it came out right
    Gdiplus::Bitmap b( 300, 300, &g );
    CLSID pngClsid;
    GetEncoderClsid( L"image/png", &pngClsid );
    Gdiplus::Status s = b.Save( L"tmp.png", &pngClsid, NULL );
    if( s != Gdiplus::Ok )
        throw std::logic_error( "Couldn't save bitmap" );

    // 4 - Update the texture's surface
    LPDIRECT3DSURFACE9 pOORTSurface = NULL;
    g_pOORT->GetSurfaceLevel( 0, &pOORTSurface );
    RECT rect;
    rect.left = 0;    rect.top = 0;
    rect.right = 300; rect.bottom = 300;
    D3DXLoadSurfaceFromMemory(
        pOORTSurface,
        NULL,
        NULL,
        (LPCVOID)g_pDIBData,
        D3DFMT_A8R8G8B8,
        300 * 4, // Bytes per pixel: 4
        NULL,
        &rect,
        D3DX_DEFAULT,
        D3DCOLOR_ARGB(255, 0, 0, 0) ); // Replace opaque black with transparent black
    pOORTSurface->Release();
}


The call to Bitmap::Save returns "Invalid parameter". The class ID is apparently OK, since it isn't -1 and really looks like a class ID. What could be wrong with the filename?

PS: Sorry for the delay, I had my hands full with urgent stuff
That code runs fine for me - although it gives an empty image (300x300px, transparent), and changing it to save a BMP gives a 300x300 black image.

EDIT: Actually, I don't think that code does what you think - you create a bitmap and then save it, but that bitmap is completely unrelated to anything being drawn on the Graphics.
I've not really used GDI+ much unfortunately, so I don't know the correct way to get the image the Graphics class is drawing onto.
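
For what it's worth, one approach that should work is to construct the Gdiplus::Bitmap directly over the DIB section's pixels, reusing bmi and g_pDIBData from the CreateDIBSection step and the GetEncoderClsid helper from the post above (a sketch, untested):

::GdiFlush(); // make sure GDI is done writing to the DIB bits
// Wrap the DIB section's pixel data in a GDI+ Bitmap (no extra copy)
Gdiplus::Bitmap b( &bmi, g_pDIBData );
CLSID pngClsid;
GetEncoderClsid( L"image/png", &pngClsid );
Gdiplus::Status s = b.Save( L"tmp.png", &pngClsid, NULL );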
