Direct3D 11: Creating a GDI-compatible texture for the backbuffer


I'm relatively new to D3D11, and I'm trying to take advantage of IDXGISurface1::GetDC to do some GDI drawing on the surface before Present. So far, I've been unsuccessful.

Can someone set me straight?

Looking at the docs for GDI compatible textures and some old posts here on gamedev, I think I'm meeting the requirements. However, when I call g_pSurface1->GetDC, I get an error:


DXGI ERROR: IDXGISurface1::GetDC: GetDC can only be called for textures that were created
with the D3D10_RESOURCE_MISC_GDI_COMPATIBLE flag. [ MISCELLANEOUS ERROR #89: ]

For one thing, I don't understand the reference to the D3D10 flag (?). My understanding is that D3D11CreateDeviceAndSwapChain automatically creates the texture for the backbuffer.

Here's the code I use to create the device/swapchain:


    IDXGIFactory1 *pDXGIFactory;
    HRESULT hResult = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&pDXGIFactory);
    if (FAILED(hResult)) {
        MessageBox(NULL, DXGetErrorDescription(hResult), TEXT("CreateDXGIFactory1"), MB_OK);
        return 0;
    }

    // Use the first adapter
    IDXGIAdapter1 *pAdapter;

    hResult = pDXGIFactory->EnumAdapters1(0, &pAdapter);
    if (FAILED(hResult)) {
        MessageBox(NULL, DXGetErrorDescription(hResult), "EnumAdapters1", MB_OK);
        return 0;
    }

    pDXGIFactory->Release();

    DXGI_SWAP_CHAIN_DESC scd;
    ZeroMemory(&scd, sizeof(scd));
    scd.BufferDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; // BGRA format, as the GDI-interop docs require
    scd.SampleDesc.Count = 1;
    scd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    scd.BufferCount = 1;
    scd.OutputWindow = g_hWnd;
    scd.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;
    scd.Windowed = TRUE;
    scd.Flags = DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE;

    hResult = D3D11CreateDeviceAndSwapChain(
        pAdapter,
        D3D_DRIVER_TYPE_UNKNOWN,
        NULL,
        D3D11_CREATE_DEVICE_DEBUG |
        D3D11_CREATE_DEVICE_BGRA_SUPPORT |
        D3D11_CREATE_DEVICE_SINGLETHREADED,
        NULL,
        0,
        D3D11_SDK_VERSION,
        &scd,
        &g_pSwapChain,
        &g_pd3dDevice,
        NULL,
        &g_pImmediateContext
        );

    pAdapter->Release(); // the device holds its own reference to the adapter; release ours

I create the rendertargetview from the backbuffer texture:


    // Create a render target view
    ID3D11Texture2D* pBackBuffer = NULL;
    hr = g_pSwapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&pBackBuffer);
    if (FAILED(hr))
        return hr;
    hr = g_pd3dDevice->CreateRenderTargetView(pBackBuffer, NULL, &g_pRenderTargetView);
    pBackBuffer->Release();
    if (FAILED(hr))
    {
        MessageBox(NULL, DXGetErrorDescription(hr), TEXT("Failed to create render target view"), MB_OK);
        return hr;
    }

Here's the render routine:


    HDC hdc;
    IDXGISurface1* g_pSurface1 = NULL;
    HRESULT hr = g_pSwapChain->GetBuffer(0, __uuidof(IDXGISurface1), (void**)&g_pSurface1);

    if (SUCCEEDED(hr))
    {
        hr = g_pSurface1->GetDC(FALSE, &hdc); // <------------- Error occurs here -----------
        if (SUCCEEDED(hr))
        {
            TextOut(hdc, 5, 5, TEXT("Hello"), 5);
            g_pSurface1->ReleaseDC(NULL); // NULL dirty rect = whole surface is dirty
        }
        SafeRelease(g_pSurface1);
    }
    // Rebind the render target; ReleaseDC must precede any new D3D commands.
    g_pImmediateContext->OMSetRenderTargets(1, &g_pRenderTargetView, g_pDepthStencilView);

Thanks for any help.

EDIT: Just to clarify, if I comment out the render routine above, the app works fine (but without the TextOut, of course).

Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.

You don't forget how to play when you grow old; you grow old when you forget how to play.



Looking through MSDN yields this page, which states that the enum is part of D3D10_RESOURCE_MISC_FLAG and that it's set through the MiscFlags member of a resource description (the D3D11 equivalent is D3D11_RESOURCE_MISC_GDI_COMPATIBLE in D3D11_TEXTURE2D_DESC). DXGI_SWAP_CHAIN_DESC itself has no MiscFlags member, so for the swap chain the DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE flag you already set should be the right one. Setting the misc flag on a texture you create yourself should do the trick, according to the docs, but I haven't tried it out.
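
A minimal, untested sketch of that idea in D3D11 terms (the size, usage, and bind flags here are placeholders I picked for illustration):

    // Untested sketch: create a GDI-compatible texture directly.
    // GetDC also requires a BGRA format on the resource.
    D3D11_TEXTURE2D_DESC td;
    ZeroMemory(&td, sizeof(td));
    td.Width = 800;                          // placeholder size
    td.Height = 600;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_B8G8R8A8_UNORM;  // BGRA, per the GDI-interop docs
    td.SampleDesc.Count = 1;
    td.Usage = D3D11_USAGE_DEFAULT;
    td.BindFlags = D3D11_BIND_RENDER_TARGET;
    td.MiscFlags = D3D11_RESOURCE_MISC_GDI_COMPATIBLE;

    ID3D11Texture2D* pGdiTexture = NULL;
    HRESULT hrTex = g_pd3dDevice->CreateTexture2D(&td, NULL, &pGdiTexture);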

I suspect that the creation of the surface will fail with that flag active, though. Try it by all means, but last I recall, resources meant for shader binding don't play nice with GDI+.

I appreciate the comments. Thanks all.

After much angst, I finally fixed my code. Although I created the swapchain with DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE, I then called**:


hr = g_pSwapChain->ResizeBuffers(0, 0, 0, DXGI_FORMAT_UNKNOWN, 0); //<------- DID NOT INCLUDE PROPER SWAPCHAIN FLAGS !

instead of:


hr = g_pSwapChain->ResizeBuffers(0, 0, 0, DXGI_FORMAT_UNKNOWN, DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE);

** Actually, the ResizeBuffers call is in an UpdateContext() routine which takes care of the stuff that needs to be done on window resizing (the parallel to the DX9 OnDeviceReset() sort of thing). I use that routine as well to reset the render target view and depth/stencil buffers, update the projection and viewport, etc. The error occurred because that call was in a routine outside of my InitDevice(), and I didn't notice the "0" in the ResizeBuffers flags argument while I was debugging.
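
Roughly, the relevant part of that routine looks like this (a simplified sketch, not the actual UpdateContext() code; error handling trimmed):

    // All views onto the backbuffer must be released before ResizeBuffers,
    // and the GDI-compatible flag has to be passed again here.
    if (g_pRenderTargetView) { g_pRenderTargetView->Release(); g_pRenderTargetView = NULL; }

    HRESULT hr = g_pSwapChain->ResizeBuffers(0, 0, 0, DXGI_FORMAT_UNKNOWN,
                                             DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE);
    if (SUCCEEDED(hr))
    {
        // Recreate the render target view from the resized backbuffer.
        ID3D11Texture2D* pBackBuffer = NULL;
        hr = g_pSwapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&pBackBuffer);
        if (SUCCEEDED(hr))
        {
            hr = g_pd3dDevice->CreateRenderTargetView(pBackBuffer, NULL, &g_pRenderTargetView);
            pBackBuffer->Release();
        }
    }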

With my Render() routine using:


    HDC hdc;
    IDXGISurface1* g_pSurface1 = NULL;
    HRESULT hr = g_pSwapChain->GetBuffer(0, __uuidof(IDXGISurface1), (void**)&g_pSurface1);

    if (SUCCEEDED(hr))
    {
        hr = g_pSurface1->GetDC(FALSE, &hdc);
        if (SUCCEEDED(hr))
        {
            TextOut(hdc, 50, 50, TEXT("GDI Rendering Works!"), 20);
            g_pSurface1->ReleaseDC(NULL);
        }
        SafeRelease(g_pSurface1);
    }
    g_pImmediateContext->OMSetRenderTargets(1, &g_pRenderTargetView, g_pDepthStencilView);

[attachment=21406:D11 engine 14.png]


> I suspect that the creation of the surface will fail with that flag active, though. Try it by all means, but last I recall, resources meant for shader binding don't play nice with GDI+.

I would assume that DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE would actually force the backbuffer (and the other buffers, if there's more than one) to be created with the D3D10_RESOURCE_MISC_GDI_COMPATIBLE flag in the buffer description, internally.

@BuckEye: glad you found the bug; how did you come by it? That section wasn't part of the original code you posted. Did the GDI text work before resizing the window/buffer, or was this something that was called before you had actually presented the first frame?


> I would assume that DXGI_SWAP_CHAIN_FLAG_GDI_COMPATIBLE would actually force the backbuffer (and the other buffers, if there's more than one) to be created with the D3D10_RESOURCE_MISC_GDI_COMPATIBLE flag in the buffer description, internally.

Dunno for sure. The backbuffer description has the swapchain flag, and not the D3D10 flag, and they're not the same enumerated values. Also, at one point in my trials, I managed to get the error changed to something complaining that I wasn't using the swapchain flag (rather than the D3D10 flag).
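
Checking that is straightforward, by the way; something like this (a quick sketch) reports whether the misc flag is present on the backbuffer:

    // Sketch: inspect the backbuffer's description to see which misc flags
    // the runtime actually set on it.
    ID3D11Texture2D* pBackBuffer = NULL;
    if (SUCCEEDED(g_pSwapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&pBackBuffer)))
    {
        D3D11_TEXTURE2D_DESC desc;
        pBackBuffer->GetDesc(&desc);
        if (desc.MiscFlags & D3D11_RESOURCE_MISC_GDI_COMPATIBLE)
            OutputDebugString(TEXT("Backbuffer has the GDI-compatible misc flag\n"));
        pBackBuffer->Release();
    }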


> Glad you found the bug; how did you come by it? That section wasn't part of the original code you posted. Did the GDI text work before resizing the window/buffer, or was this something that was called before you had actually presented the first frame?

I'm glad, too! Thanks.

That section, indeed, wasn't in my posted code. Might've helped if I had done that, huh? I had ass-u-me-d my resizing was okay. Found the problem by paying attention to the debug output, which kept insisting my swapchain wasn't GDI compatible. I finally checked the buffer description in my UpdateContext routine, and there it was, bigger than life: the flag wasn't set.

The GDI text didn't work because that resize call is also used during my initialization. Most of the calls needed for resizing the buffers are the same as for initialization (target views, depth/stencil buffers, etc.), so I have just a single routine.

That actually works well for me, as I'm reusing a bunch of GDI code. I can now set the viewport to reserve areas for GDI drawing, as in the sketch below. Without that, I'd have to move D3D11 to a child window and do bunches of bookkeeping. As it is now, I can mix D3D11 and GDI in the backbuffer!
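
For example, something along these lines reserves a strip at the bottom of the backbuffer for GDI (a sketch; clientWidth/clientHeight and the 100-pixel strip are placeholders):

    // Sketch: restrict D3D rendering to the upper part of the backbuffer,
    // leaving a strip at the bottom free for GDI drawing via GetDC.
    D3D11_VIEWPORT vp;
    vp.TopLeftX = 0.0f;
    vp.TopLeftY = 0.0f;
    vp.Width    = (FLOAT)clientWidth;            // assumed client-area size
    vp.Height   = (FLOAT)(clientHeight - 100);   // reserve 100 px at the bottom
    vp.MinDepth = 0.0f;
    vp.MaxDepth = 1.0f;
    g_pImmediateContext->RSSetViewports(1, &vp);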

That pic is a frame capture, by the way. While I was mucking about, I managed to convert the render texture to a BMP file (converted to PNG for uploading). The frame in the lower part of the buffer is a quick addition for testing: the GDI call DrawEdge(...).


