Strange Direct2D issue (UNSUPPORTED_PIXEL_FORMAT)


I'm experiencing a strange issue with Direct2D. I figured someone here might be able to help, since you guys have always helped with my Direct3D questions.

I'm trying to draw some ID2D1Bitmaps to my window using SharpDX and a WindowRenderTarget object.

The bitmaps are first loaded into a SharpDX.Direct2D1.Bitmap object, then drawn to the screen using myRenderTarget.DrawBitmap.

The code runs fine on my system and other Nvidia systems, but I recently tested it on an ATI system and got a D2DERR_UNSUPPORTED_PIXEL_FORMAT error.

I found this post on another site that was very helpful:

"In order to solve the issue do this:

1) For Windows 7 Direct2D 1.0 (Direct2D before Platform Update) put m_pRenderTarget->BeginDraw() before creating any bitmaps;

2) For Windows 8 Direct2D 1.1 (or for Windows 7 with Platform Update) put m_pRenderTarget->SetTarget (in case you have ID2D1DeviceContext as a type for m_pRenderTarget), or the same m_pRenderTarget->BeginDraw() (in case you have ID2D1HwndRenderTarget as a type for m_pRenderTarget);

The problem is that right after you created m_pRenderTarget in fact this render target does not have a DXGI surface to draw at. And because of it the render target unable to find a pixel format for new bitmaps. That's why you got WINCODEC_ERR_UNSUPPORTEDPIXELFORMAT."
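In SharpDX terms, that workaround would look something like this (a rough sketch; the no-data Bitmap constructor overload and the variable names are my placeholders, not code from the quoted post):

[source]
// Sketch: force the render target to bind its DXGI surface before any
// device-dependent bitmaps are created, per the quoted advice.
myRenderTarget.BeginDraw();
var bitmap = new SharpDX.Direct2D1.Bitmap(myRenderTarget, new DrawingSize(width, height), bitmapProperties);
myRenderTarget.EndDraw();
[/source]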

My issue is that I can't call BeginDraw before creating my bitmaps, because they are created before the window is ever drawn. So instead of using a WindowRenderTarget I tried using Direct3D 10 as the backend, like the samples included with SharpDX, and calling myDevice.OutputMerger.SetTargets right after the device is created.

This seems to work, except for one problem: the render target and back buffer must be recreated every time the window is resized, and my bitmaps must be recreated along with them. This causes very noticeable lag while resizing the window. With a WindowRenderTarget I can just call myRenderTarget.Resize without recreating anything, so I would prefer to use a WindowRenderTarget.

Does anyone know of a solution that allows me to use a WindowRenderTarget and create my bitmaps without calling BeginDraw? If not, is it possible to resize the D3D swap chain without having to reset the render target? Any help would be greatly appreciated.


Here's some code to reproduce the error...

Option #1 (WindowRenderTarget):

[source]
// Creating the render target
RenderTargetProperties properties = new RenderTargetProperties(new PixelFormat(Format.R8G8B8A8_UNorm, AlphaMode.Premultiplied));
HwndRenderTargetProperties hWndProperties = new HwndRenderTargetProperties();
hWndProperties.Hwnd = p_Owner.Handle;
hWndProperties.PixelSize = new DrawingSize(p_Owner.Width, p_Owner.Height);
hWndProperties.PresentOptions = PresentOptions.None;

p_FactoryD2D = new Factory(FactoryType.SingleThreaded);
p_FactoryDWrite = new DirectWrite.Factory(DirectWrite.FactoryType.Isolated);
p_RenderTarget = new WindowRenderTarget(p_FactoryD2D, properties, hWndProperties);

// Resizing the render target
p_RenderTarget.Resize(new DrawingSize(p_Owner.Width, p_Owner.Height));

// Drawing the bitmap
p_RenderTarget.DrawBitmap(myBitmap, new RectangleF(r.Left, r.Top, r.Right, r.Bottom), alpha, BitmapInterpolationMode.Linear);
[/source]

Option #2 (Direct3D 10):

[source]
// Creating the render target
SwapChainDescription desc = new SwapChainDescription();
desc.BufferCount = 2;
desc.Flags = SwapChainFlags.None;
desc.IsWindowed = true;
desc.ModeDescription = new ModeDescription(0, 0, new Rational(60, 1), Format.R8G8B8A8_UNorm);
desc.ModeDescription.Scaling = DisplayModeScaling.Unspecified;
desc.OutputHandle = p_Owner.Handle;
desc.SampleDescription = new SampleDescription(1, 0);
desc.SwapEffect = SwapEffect.Sequential;
desc.Usage = Usage.RenderTargetOutput;
Device1.CreateWithSwapChain(DriverType.Hardware, DeviceCreationFlags.BgraSupport, desc, FeatureLevel.Level_10_0, out p_Device, out p_SwapChain);

DXGI.Factory factory = p_SwapChain.GetParent<DXGI.Factory>();
factory.MakeWindowAssociation(p_Owner.Handle, WindowAssociationFlags.IgnoreAll);

p_BackBuffer = Texture2D.FromSwapChain<Texture2D>(p_SwapChain, 0);
p_BackBufferView = new RenderTargetView(p_Device, p_BackBuffer);
p_FactoryD2D = new Factory(FactoryType.SingleThreaded);
p_FactoryDWrite = new DirectWrite.Factory(DirectWrite.FactoryType.Isolated);
using (Surface surface = p_BackBuffer.QueryInterface<Surface>())
{
    p_RenderTarget = new RenderTarget(p_FactoryD2D, surface, new RenderTargetProperties(new PixelFormat(Format.Unknown, AlphaMode.Premultiplied)));
}
p_Device.OutputMerger.SetTargets(p_BackBufferView);

// Resizing the render target (requires a reset: every reference to the
// old back buffer must be released before ResizeBuffers will succeed)
p_RenderTarget.Dispose();
p_BackBufferView.Dispose();
p_BackBuffer.Dispose();
p_SwapChain.ResizeBuffers(p_SwapChain.Description.BufferCount, p_Owner.Width, p_Owner.Height, p_SwapChain.Description.ModeDescription.Format, p_SwapChain.Description.Flags);

p_BackBuffer = Texture2D.FromSwapChain<Texture2D>(p_SwapChain, 0);
p_BackBufferView = new RenderTargetView(p_Device, p_BackBuffer);
using (Surface surface = p_BackBuffer.QueryInterface<Surface>())
{
    p_RenderTarget = new RenderTarget(p_FactoryD2D, surface, new RenderTargetProperties(new PixelFormat(Format.Unknown, AlphaMode.Premultiplied)));
}
p_Device.OutputMerger.SetTargets(p_BackBufferView);

// Drawing the bitmap
p_RenderTarget.DrawBitmap(myBitmap, new RectangleF(r.Left, r.Top, r.Right, r.Bottom), alpha, BitmapInterpolationMode.Linear);
[/source]

Option 1 obviously uses less code and doesn't require resetting the render target when the window is resized, which makes it the preferred option. I just can't figure out how to fix this UNSUPPORTED_PIXEL_FORMAT error; it only happens on a single PC, and I can't reproduce it on my own machine or anywhere else.

What I eventually did is create a MemoryBitmap wrapper class that stores a System.Drawing.Bitmap, a SharpDX.Direct2D1.Bitmap, and a reference to my render target. The SharpDX bitmap remains null until the first time it is drawn or until the render target is reset (ensuring that BeginDraw has been called first), at which point it is converted from the managed bitmap.

Modified code:

[source]
// Drawing the bitmap
if (myBitmap.RenderTarget == null || !p_RenderTarget.Equals(myBitmap.RenderTarget)) myBitmap.Reset(p_RenderTarget);
if (myBitmap.NativeBitmap != null) p_RenderTarget.DrawBitmap(myBitmap.NativeBitmap, new RectangleF(r.Left, r.Top, r.Right, r.Bottom), alpha, BitmapInterpolationMode.Linear);

// Reset function: drop the old device bitmap and rebuild it from the
// managed bitmap against the current render target
void Reset(RenderTarget renderTarget)
{
    if (p_NativeBitmap != null)
    {
        p_NativeBitmap.Dispose();
        p_NativeBitmap = null;
    }
    p_RenderTarget = renderTarget;
    p_NativeBitmap = ConvertBitmap(p_RenderTarget, p_ManagedBitmap);
}
[/source]
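For reference, the whole wrapper looks roughly like this (a simplified sketch of the class described above; the exact member names are approximations):

[source]
// Simplified sketch of the MemoryBitmap wrapper: the managed bitmap is
// always available, while the device bitmap is created lazily so it is
// only built after the render target has a surface (i.e. after BeginDraw).
class MemoryBitmap : IDisposable
{
    private System.Drawing.Bitmap p_ManagedBitmap;    // CPU-side copy, always valid
    private SharpDX.Direct2D1.Bitmap p_NativeBitmap;  // device copy, created lazily
    private RenderTarget p_RenderTarget;              // target the device copy belongs to

    public MemoryBitmap(System.Drawing.Bitmap managed) { p_ManagedBitmap = managed; }

    public RenderTarget RenderTarget { get { return p_RenderTarget; } }
    public SharpDX.Direct2D1.Bitmap NativeBitmap { get { return p_NativeBitmap; } }

    public void Reset(RenderTarget renderTarget)
    {
        if (p_NativeBitmap != null) { p_NativeBitmap.Dispose(); p_NativeBitmap = null; }
        p_RenderTarget = renderTarget;
        p_NativeBitmap = ConvertBitmap(p_RenderTarget, p_ManagedBitmap);
    }

    public void Dispose()
    {
        if (p_NativeBitmap != null) p_NativeBitmap.Dispose();
        p_ManagedBitmap.Dispose();
    }
}
[/source]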

Again, this runs fine on my system, but on the ATI system I'm now getting UNSUPPORTED_PIXEL_FORMAT when creating my WindowRenderTarget. I'm so confused; what am I doing wrong?

WINCODEC_ERR_UNSUPPORTEDPIXELFORMAT is different from D2DERR_UNSUPPORTED_PIXEL_FORMAT.

My guess is that SlimDX is giving your Bitmap the unsupported pixel format. If you can, you should request a specific format from it - most of the D2D samples from MSDN use RGBA.

I'm using SharpDX; I guess I should've mentioned that in the title, sorry. I figured it was the same error because the post was in C++ but dealt with the same issue I'm having.

Here's the post: http://social.msdn.microsoft.com/Forums/windowsdesktop/en-US/416004c5-6b73-45cd-a820-92b6608e9bfd/id2d1rendertargetcreatebitmap-returns-0x88982f80?forum=windowssdk

I'm creating my bitmaps with the same pixel format as my render target (R8G8B8A8_UNorm, Premultiplied), and I'm using the same code from the SharpDX samples to convert a System.Drawing.Bitmap:

[source]
public static SharpDX.Direct2D1.Bitmap ConvertBitmap(RenderTarget renderTarget, Bitmap b)
{
    BitmapProperties bitmapProperties = new BitmapProperties(new PixelFormat(Format.R8G8B8A8_UNorm, AlphaMode.Premultiplied));

    // Transform pixels from BGRA to RGBA
    int stride = b.Width * sizeof(int);
    using (SharpDX.DataStream tempStream = new SharpDX.DataStream(b.Height * stride, false, true))
    {
        BitmapData bitmapData = b.LockBits(new Rectangle(0, 0, b.Width, b.Height), ImageLockMode.ReadOnly, PixelFormat.Format32bppPArgb);

        // Convert all pixels
        for (int y = 0; y < b.Height; y++)
        {
            int offset = bitmapData.Stride * y;
            for (int x = 0; x < b.Width; x++)
            {
                // Not optimized
                byte B = Marshal.ReadByte(bitmapData.Scan0, offset++);
                byte G = Marshal.ReadByte(bitmapData.Scan0, offset++);
                byte R = Marshal.ReadByte(bitmapData.Scan0, offset++);
                byte A = Marshal.ReadByte(bitmapData.Scan0, offset++);
                int rgba = R | (G << 8) | (B << 16) | (A << 24);
                tempStream.Write(rgba);
            }
        }
        b.UnlockBits(bitmapData);
        tempStream.Position = 0;

        return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(b.Width, b.Height), tempStream, stride, bitmapProperties);
    }
}
[/source]

I'm only stumped because the code runs perfectly in most situations. I can only reproduce the error on a single PC with ATI hardware (a 6970) installed, and that card runs D2D/D3D fine, so it's not a compatibility issue as far as I can tell. I only get the error when I try to draw bitmaps to a WindowRenderTarget.

Well, I figured out the cause of the problem, but I'm unsure how to fix it entirely. It seems WindowRenderTarget just isn't compatible with the R8G8B8A8_UNorm pixel format on all systems. Using Format.B8G8R8A8_UNorm works without errors, but now I'm having trouble converting the alpha component of my System.Drawing.Bitmap over to my SharpDX bitmap. The RGB values transfer over, but the alpha is always 100% even when my bitmap has transparency. I guess that's a different topic.

[source]
BitmapProperties bitmapProperties = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));

int stride = bmp.Width * sizeof(int);
using (SharpDX.DataStream tempStream = new SharpDX.DataStream(bmp.Height * stride, false, true))
{
    // Note: this locks with bmp.PixelFormat and copies the raw bits as-is,
    // so if the source isn't 32bppPArgb the byte layout and alpha mode won't
    // match the premultiplied BGRA format declared above. It also assumes
    // bmpData.Stride == bmp.Width * 4 (no row padding).
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);
    int numBytes = bmpData.Stride * bmp.Height;
    byte[] byteData = new byte[numBytes];
    IntPtr ptr = bmpData.Scan0;

    Marshal.Copy(ptr, byteData, 0, numBytes);
    bmp.UnlockBits(bmpData);

    tempStream.Write(byteData, 0, numBytes);
    tempStream.Position = 0;
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bitmapProperties);
}
[/source]

I mistook SlimDX for SharpDX. Sorry. (i've really never used SharpDX or SlimDX, to be fair).

Anyway, in the first WindowRenderTarget example you were drawing the myBitmap object directly to the WindowRenderTarget. In this last example you're showing some extra loading code that does a direct memory copy of bitmap bits from one bitmap to another. Also, the B8G8R8A8_UNorm format is required only for the WindowRenderTarget, but in this last code sample you're also using it for the Bitmap. You should make sure that your source Bitmap (bmp) also has the B8G8R8A8_UNorm, premultiplied-alpha format, since you are simply copying its bits to a new bitmap of that format (see the sketch at the end of this post).

Have you tried using B8G8R8A8_UNorm for the WindowRenderTarget and drawing the source bitmap (bmp) directly, with whatever format it originally has, or is that not a SharpDX Bitmap?

...Or am I missing something else here?

Also, check out this MSDN article on supported pixel formats: http://msdn.microsoft.com/en-us/library/windows/desktop/dd756766%28v=vs.85%29.aspx
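If the source format is in doubt, you could normalize it before copying the bits; something like this (just a sketch; EnsurePArgb is a made-up helper name, not part of SharpDX or GDI+):

[source]
// Sketch: redraw an arbitrary GDI+ bitmap into a 32bppPArgb surface so
// its bits are premultiplied BGRA before they are copied into Direct2D.
static System.Drawing.Bitmap EnsurePArgb(System.Drawing.Bitmap src)
{
    if (src.PixelFormat == System.Drawing.Imaging.PixelFormat.Format32bppPArgb)
        return src;

    var copy = new System.Drawing.Bitmap(src.Width, src.Height, System.Drawing.Imaging.PixelFormat.Format32bppPArgb);
    using (var g = System.Drawing.Graphics.FromImage(copy))
        g.DrawImage(src, 0, 0, src.Width, src.Height);
    return copy;
}
[/source]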

Thanks for the response. They're both almost identical, just wrappers around the DirectX APIs, so I don't think there's a difference regarding bitmaps.

After doing my MSDN homework I learned that WindowRenderTargets created with RenderTargetType.Software only support B8G8R8A8_UNorm or Format.Unknown (DXGI_FORMAT_UNKNOWN). If the target is created with RenderTargetType.Hardware it will also support R8G8B8A8_UNorm. That was the problem all along, but now I've come across another issue.
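So the fix for Option #1 is just a different pixel format when creating the render target; something like this (same variable names as my earlier snippet):

[source]
// Use a format that both hardware and software render targets support,
// so the same code still works when Direct2D falls back to software.
RenderTargetProperties properties = new RenderTargetProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
// Or let Direct2D choose: new PixelFormat(Format.Unknown, AlphaMode.Premultiplied)
p_RenderTarget = new WindowRenderTarget(p_FactoryD2D, properties, hWndProperties);
[/source]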

I'm storing my bitmaps in a System.Drawing.Bitmap first and then converting them to D2D bitmaps for two reasons. One is that the D2D bitmap must remain null until BeginDraw() has been called, so I convert the bitmaps right before they are drawn for the first time. The other is that I'm also adding GDI+ support to my application for clients that don't support D2D.

To the best of my knowledge my System.Drawing.Bitmaps are PixelFormat.Format32bppPArgb, but apparently the bytes are stored in BGRA order. All of the SharpDX samples convert the BGRA values to RGBA before creating the D2D bitmaps. To keep the source byte order I need to use B8G8R8A8_UNORM and ALPHA_MODE_PREMULTIPLIED. The problem is the alpha values are getting lost somewhere during conversion: the RGB channels convert fine, but the bitmap always ends up with 100% alpha, even when the original image is transparent.

I don't have this problem using RGBA; the alpha channel converts perfectly. But when I use BGRA it doesn't work. My guess is that it's because the alpha values are premultiplied, but I'm not sure how to convert them correctly. I wrote three different conversion functions, and they all behave the same way: the alpha values do not transfer over. The first function is the most efficient because it doesn't require locking the System.Drawing.Bitmap. The last one confuses me the most because it copies the bits directly and still doesn't work.

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);

using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    for (int y = 0; y < bmp.Height; y++)
    {
        for (int x = 0; x < bmp.Width; x++)
        {
            // Note: GetPixel returns straight (non-premultiplied) color values,
            // while the format above declares premultiplied alpha
            Color c = bmp.GetPixel(x, y);
            int bgra = c.B | (c.G << 8) | (c.R << 16) | (c.A << 24);
            tempStream.Write(bgra);
        }
    }

    tempStream.Position = 0;
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);

using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    // Locks with the bitmap's own format; assumes 4 bytes per pixel
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);

    for (int y = 0; y < bmp.Height; y++)
    {
        int offset = bmpData.Stride * y;

        for (int x = 0; x < bmp.Width; x++)
        {
            byte b = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte g = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte r = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte a = Marshal.ReadByte(bmpData.Scan0, offset++);
            int bgra = b | (g << 8) | (r << 16) | (a << 24);

            tempStream.Write(bgra);
        }
    }

    bmp.UnlockBits(bmpData);
    tempStream.Position = 0;
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);

using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);
    // Assumes bmpData.Stride == bmp.Width * 4 (no row padding)
    int numBytes = bmpData.Stride * bmp.Height;
    byte[] byteData = new byte[numBytes];
    IntPtr ptr = bmpData.Scan0;

    Marshal.Copy(ptr, byteData, 0, numBytes);
    bmp.UnlockBits(bmpData);
    tempStream.Write(byteData, 0, numBytes);
    tempStream.Position = 0;

    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]
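One more variant worth trying (a sketch, untested): lock explicitly as Format32bppPArgb, so GDI+ hands back premultiplied BGRA that already matches the declared format, and copy row by row so any padding in the GDI+ stride doesn't end up in the packed stream:

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);

using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    // Locking as 32bppPArgb makes GDI+ convert the source to premultiplied
    // BGRA, which is exactly B8G8R8A8_UNorm + AlphaMode.Premultiplied
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, System.Drawing.Imaging.PixelFormat.Format32bppPArgb);
    byte[] row = new byte[stride];

    for (int y = 0; y < bmp.Height; y++)
    {
        // Copy one packed row at a time; bmpData.Stride may be padded
        Marshal.Copy(IntPtr.Add(bmpData.Scan0, bmpData.Stride * y), row, 0, stride);
        tempStream.Write(row, 0, stride);
    }

    bmp.UnlockBits(bmpData);
    tempStream.Position = 0;
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]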

I created a separate thread regarding this issue because it's a different topic and it is not really related to the OP title.

http://www.gamedev.net/topic/654164-converting-systemdrawingbitmap-to-idirect2d1bitmap/

