Direct2D fails when drawing a single-channel bitmap


I'm an experienced programmer specializing in computer graphics, mainly using Direct3D 9.0c, OpenGL and general algorithms. Currently, I am evaluating Direct2D as the rendering technology for a professional application dealing with medical image data. As far as rendering is concerned, it is an x64 desktop application running in windowed mode (not fullscreen).

 

Already in my very first steps I'm struggling with a task I thought would be a no-brainer: rendering a single-channel bitmap on screen.

 

Running on a Windows 8.1 machine, I create an ID2D1DeviceContext with a Direct3D swap chain buffer surface as its render target. The swap chain is created from an HWND with buffer format DXGI_FORMAT_B8G8R8A8_UNORM. Note: see also the code snippets at the end.

 

Afterwards, I create a bitmap with pixel format DXGI_FORMAT_R8_UNORM and alpha mode D2D1_ALPHA_MODE_IGNORE. When calling DrawBitmap(...) on the device context, a debug breakpoint is triggered with the debug message "D2D DEBUG ERROR - This operation is not compatible with the pixel format of the bitmap".

 

I know that this output is quite clear. And indeed, when changing the pixel format to DXGI_FORMAT_R8G8B8A8_UNORM with D2D1_ALPHA_MODE_IGNORE, everything works well and I see the bitmap rendered. However, I simply cannot believe that! Graphics cards have supported single-channel textures practically forever - every 3D graphics application can use them without thinking twice. This goes without saying.
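For reference, this is the variant that does work for me. Apart from allocating four channels of data, only the pixel format differs from the failing R8 creation code at the end (names here are illustrative):

// Works: same creation path, but with a four-channel format.  
D2D1_BITMAP_PROPERTIES1 workingProperties = {};  
workingProperties.pixelFormat.format = DXGI_FORMAT_R8G8B8A8_UNORM;  
workingProperties.pixelFormat.alphaMode = D2D1_ALPHA_MODE_IGNORE;  

// Four bytes per pixel now, so the pitch grows accordingly.  
char* rgbaData = new char[512 * 512 * 4];  
ID2D1Bitmap1* workingBitmap = nullptr;  
deviceContext->CreateBitmap(D2D1::SizeU(512, 512), rgbaData, 512 * 4, workingProperties, &workingBitmap);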

 

I tried to find anything here and on Google, without success. The only hint I could find was the MSDN Direct2D page listing the supported pixel formats. The documentation suggests - by not mentioning it - that DXGI_FORMAT_R8_UNORM is indeed not supported as a bitmap format. I also found posts talking about alpha masks (using DXGI_FORMAT_A8_UNORM), but that's not what I'm after.
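Those alpha-mask posts boil down to something like the sketch below: the A8 bitmap only modulates a brush color via FillOpacityMask, rather than displaying the gray values themselves (maskData here is hypothetical, standing in for my 512x512 gray buffer):

// Alpha-mask approach (not what I need): the gray data acts as a mask for a brush.  
D2D1_BITMAP_PROPERTIES1 maskProperties = {};  
maskProperties.pixelFormat.format = DXGI_FORMAT_A8_UNORM;  
maskProperties.pixelFormat.alphaMode = D2D1_ALPHA_MODE_PREMULTIPLIED;  

ID2D1Bitmap1* maskBitmap = nullptr;  
deviceContext->CreateBitmap(D2D1::SizeU(512, 512), maskData, 512, maskProperties, &maskBitmap);  

ID2D1SolidColorBrush* whiteBrush = nullptr;  
deviceContext->CreateSolidColorBrush(D2D1::ColorF(D2D1::ColorF::White), &whiteBrush);  

// As far as I understand, FillOpacityMask requires aliased rendering (inside BeginDraw/EndDraw).  
deviceContext->SetAntialiasMode(D2D1_ANTIALIAS_MODE_ALIASED);  
deviceContext->FillOpacityMask(maskBitmap, whiteBrush);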


What am I missing? Why can't I convince Direct2D to create and draw a grayscale bitmap? Or is it really true that Direct2D doesn't support drawing R8 or R16 bitmaps?

 

Any help is really appreciated, as I don't know how to solve this. If I can't get these trivial basics to work, I think I'll have to stop digging deeper into Direct2D :-(.

 

And here are the relevant code snippets. Please note that they might not compile, since I ported them on the fly from my C++/CLI code to plain C++. I also stripped out all error checking and other noise:

 

Device, Device Context and Swap Chain Creation (D3D and Direct2D):

// Direct2D factory creation  
D2D1_FACTORY_OPTIONS options = {};  
options.debugLevel = D2D1_DEBUG_LEVEL_INFORMATION;  
ID2D1Factory1* d2dFactory;  
D2D1CreateFactory(D2D1_FACTORY_TYPE_MULTI_THREADED, options, &d2dFactory);  
 
// Direct3D device creation  
const auto type = D3D_DRIVER_TYPE_HARDWARE;  
const auto flags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;  
ID3D11Device* d3dDevice;  
D3D11CreateDevice(nullptr, type, nullptr, flags, nullptr, 0, D3D11_SDK_VERSION, &d3dDevice, nullptr, nullptr);  
 
// Direct2D device creation  
IDXGIDevice* dxgiDevice;  
d3dDevice->QueryInterface(__uuidof(IDXGIDevice), reinterpret_cast<void**>(&dxgiDevice));  
ID2D1Device* d2dDevice;  
d2dFactory->CreateDevice(dxgiDevice, &d2dDevice);  
 
// Swap chain creation  
DXGI_SWAP_CHAIN_DESC1 desc = {};  
desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;  
desc.SampleDesc.Count = 1;  
desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;  
desc.BufferCount = 2;  
 
IDXGIAdapter* dxgiAdapter;  
dxgiDevice->GetAdapter(&dxgiAdapter);  
IDXGIFactory2* dxgiFactory;  
dxgiAdapter->GetParent(__uuidof(IDXGIFactory2), reinterpret_cast<void **>(&dxgiFactory));  
 
IDXGISwapChain1* swapChain;  
dxgiFactory->CreateSwapChainForHwnd(d3dDevice, hwnd, &desc, nullptr, nullptr, &swapChain);  
 
// Direct2D device context creation  
const auto deviceContextOptions = D2D1_DEVICE_CONTEXT_OPTIONS_NONE;  
ID2D1DeviceContext* deviceContext;  
d2dDevice->CreateDeviceContext(deviceContextOptions, &deviceContext);  
 
// create render target bitmap from swap chain  
IDXGISurface* swapChainSurface;  
swapChain->GetBuffer(0, __uuidof(swapChainSurface), reinterpret_cast<void **>(&swapChainSurface));  
D2D1_BITMAP_PROPERTIES1 bitmapProperties;  
bitmapProperties.dpiX = 0.0f;  
bitmapProperties.dpiY = 0.0f;  
bitmapProperties.bitmapOptions = D2D1_BITMAP_OPTIONS_TARGET | D2D1_BITMAP_OPTIONS_CANNOT_DRAW;  
bitmapProperties.pixelFormat.format = DXGI_FORMAT_B8G8R8A8_UNORM;  
bitmapProperties.pixelFormat.alphaMode = D2D1_ALPHA_MODE_IGNORE;  
bitmapProperties.colorContext = nullptr;  
ID2D1Bitmap1* swapChainBitmap = nullptr;  
deviceContext->CreateBitmapFromDxgiSurface(swapChainSurface, &bitmapProperties, &swapChainBitmap);  
 
 
// set swap chain bitmap as render target of D2D device context  
deviceContext->SetTarget(swapChainBitmap);  

 

D2D single-channel Bitmap Creation:

const D2D1_SIZE_U size = { 512, 512 };  
const UINT32 pitch = 512;  
D2D1_BITMAP_PROPERTIES1 d2dProperties;  
ZeroMemory(&d2dProperties, sizeof(D2D1_BITMAP_PROPERTIES1));  
d2dProperties.pixelFormat.alphaMode = D2D1_ALPHA_MODE_IGNORE;  
d2dProperties.pixelFormat.format = DXGI_FORMAT_R8_UNORM;  
char* sourceData = new char[512*512];  
 
ID2D1Bitmap1* d2dBitmap;  
deviceContext->CreateBitmap(size, sourceData, pitch, d2dProperties, &d2dBitmap); 

 

Bitmap drawing (FAILING):

deviceContext->BeginDraw();  
D2D1_COLOR_F d2dColor = {};  
deviceContext->Clear(d2dColor);  
 
// THIS LINE FAILS WITH THE DEBUG BREAKPOINT IF SINGLE CHANNELED  
deviceContext->DrawBitmap(d2dBitmap, nullptr, 1.0f, D2D1_INTERPOLATION_MODE_LINEAR, nullptr);    
 
deviceContext->EndDraw();  
swapChain->Present(1, 0);  
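For completeness: the only stopgap I can think of right now is to expand the gray data to four channels on the CPU and use the format that is known to work - wasteful for large medical data sets, but at least it draws. A rough, untested sketch:

// Stopgap: replicate the gray value into R, G and B so the bitmap can use  
// the four-channel format that DrawBitmap accepts.  
char* expandedData = new char[512 * 512 * 4];  
for (UINT32 i = 0; i < 512 * 512; ++i)  
{  
    expandedData[i * 4 + 0] = sourceData[i];   // R  
    expandedData[i * 4 + 1] = sourceData[i];   // G  
    expandedData[i * 4 + 2] = sourceData[i];   // B  
    expandedData[i * 4 + 3] = char(255);       // A (ignored)  
}  

D2D1_BITMAP_PROPERTIES1 expandedProperties = {};  
expandedProperties.pixelFormat.format = DXGI_FORMAT_R8G8B8A8_UNORM;  
expandedProperties.pixelFormat.alphaMode = D2D1_ALPHA_MODE_IGNORE;  

ID2D1Bitmap1* expandedBitmap = nullptr;  
deviceContext->CreateBitmap(size, expandedData, 512 * 4, expandedProperties, &expandedBitmap);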

 
