ta0soft
  1. Light position/rotation

So I'm a complete noob when it comes to OpenGL and GLSL, but I have experience with DirectX. I'm developing a simple 3D application with a rotating sphere. The problem I'm facing is that the rotation affects the position of the light; I'd like the light position to remain fixed while the sphere rotates. I know this is a shader issue because the view matrix and light position are stored in uniform values. I tried creating a new uniform and multiplying by that instead of viewMatrix, and I also tried resetting the view matrix to the identity state before/after drawing the sphere, but neither solution worked. I'm using the Juce framework to develop the application, but I don't think that makes a difference. Can someone point me in the right direction? Tom

Uniform setup (C++/Juce):
[source]
if (p_Uniforms->projectionMatrix != nullptr) p_Uniforms->projectionMatrix->setMatrix4(CreateProjectionMatrix().mat, 1, false);
if (p_Uniforms->texture != nullptr) p_Uniforms->texture->set((GLint)0);
if (p_Uniforms->lightPosition != nullptr) p_Uniforms->lightPosition->set(-15.0f, 10.0f, 15.0f, 0.0f);
if (p_Uniforms->viewMatrix != nullptr) p_Uniforms->viewMatrix->setMatrix4(CreateViewMatrix().mat, 1, false);
[/source]

Vertex shader:
[source]
attribute vec4 position;
attribute vec4 normal;
attribute vec4 sourceColour;
attribute vec2 texureCoordIn;

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec4 lightPosition;

varying vec4 destinationColour;
varying vec2 textureCoordOut;
varying float lightIntensity;

void main()
{
    destinationColour = sourceColour;
    textureCoordOut = texureCoordIn;
    vec4 light = viewMatrix * lightPosition;
    lightIntensity = dot(light, normal);
    gl_Position = projectionMatrix * viewMatrix * position;
}
[/source]

Fragment shader:
[source]
varying vec4 destinationColour;
varying vec2 textureCoordOut;
varying float lightIntensity;

uniform sampler2D demoTexture;

void main()
{
    float l = max(0.3, lightIntensity * 0.3);
    vec4 colour = vec4(l, l, l, 1.0);
    gl_FragColor = colour * texture2D(demoTexture, textureCoordOut);
}
[/source]
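One idea along the lines of the "new uniform" attempt above: if viewMatrix actually contains the sphere's rotation fused together with the camera transform, then viewMatrix * lightPosition drags the light along with the model. Here is a sketch of a possible fix, assuming the rotation can be passed separately through a hypothetical modelMatrix uniform (only the lighting-related lines are shown; untested):

[source]
// Sketch, not tested: modelMatrix is a new, hypothetical uniform holding only
// the sphere rotation, while viewMatrix holds only the camera transform.
attribute vec4 position;
attribute vec4 normal;

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;    // camera only
uniform mat4 modelMatrix;   // sphere rotation only, set from the C++ side
uniform vec4 lightPosition; // w == 0.0, so this is a direction

varying float lightIntensity;

void main()
{
    // Rotate the normal with the model, but leave the light direction alone;
    // the light then stays fixed in world space while the sphere rotates.
    vec3 worldNormal = normalize((modelMatrix * normal).xyz);
    lightIntensity = dot(normalize(lightPosition.xyz), worldNormal);
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * position;
}
[/source]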
2. There are a few ways you can accomplish this.

You can use the Control.PointToClient() function, which converts screen coordinates into a control's client coordinates. For example, if your PictureBox's top-left corner sits at X=10, Y=10 in screen coordinates and the mouse is at X=20, Y=20 in screen coordinates, myPictureBox.PointToClient(new Point(20, 20)) will return X=10, Y=10.

Another option is computing the coordinates yourself; this is basically what PointToClient() does behind the scenes. If mousePosition is already in the form's client coordinates:

[source]
Point clientLocation = new Point(mousePosition.X - myPictureBox.Left, mousePosition.Y - myPictureBox.Top);
[/source]

You can also use the built-in mouse event handlers for controls (MouseDown, MouseMove, MouseUp, MouseLeave, MouseWheel). These events report the mouse position in client coordinates automatically, so you don't have to convert it yourself.
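Here's a minimal sketch putting the options together (control names and layout are made up for the example):

[source]
using System;
using System.Drawing;
using System.Windows.Forms;

public class DemoForm : Form
{
    private readonly PictureBox myPictureBox = new PictureBox { Left = 10, Top = 10, Width = 100, Height = 100 };

    public DemoForm()
    {
        Controls.Add(myPictureBox);

        // Option 1: the form receives the event, so go form-client -> screen -> PictureBox-client.
        this.MouseMove += (s, e) =>
        {
            Point screenPoint = PointToScreen(e.Location);               // form client -> screen
            Point clientPoint = myPictureBox.PointToClient(screenPoint); // screen -> PictureBox client
            Text = "Converted: " + clientPoint.ToString();
        };

        // Option 2: the control's own event already reports client coordinates.
        myPictureBox.MouseMove += (s, e) => Text = "Direct: " + e.Location.ToString();
    }
}
[/source]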
3. I finally figured it out thanks to good old MSDN. I guess I should do more homework before asking these types of questions.

"To make a composite image in the standard RGBA format, the alpha value of the foreground image must be multiplied by each of the red, green, and blue channels before adding it to the color of the background image. In a pre-multiplied alpha RGB pixel format, each color channel has already been multiplied by the alpha value. This provides a more efficient method of image composition with alpha-channel data. To retrieve the true color values of each channel in a PRGBA/PBGRA pixel format, the alpha-channel multiplication must be reversed by dividing color values by the alpha value."

Here's the working code for anyone who might be having the same issue:

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (SharpDX.DataStream tempStream = new SharpDX.DataStream(bmp.Height * stride, false, true))
{
    for (int y = 0; y < bmp.Height; y++)
    {
        for (int x = 0; x < bmp.Width; x++)
        {
            Color c = bmp.GetPixel(x, y);
            int a = c.A;
            // Pre-multiply each color channel by the alpha value
            int r = (c.R * a) / 255;
            int g = (c.G * a) / 255;
            int b = (c.B * a) / 255;
            int bgra = b | (g << 8) | (r << 16) | (a << 24);
            tempStream.Write(bgra);
        }
    }
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]
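The last sentence of the quote describes the reverse operation too. A quick sketch of un-premultiplying (my own helper, not part of the code above), in case anyone needs to read the true color values back out:

[source]
// Reverse the pre-multiplication: divide each color channel by alpha to
// recover the straight (non-premultiplied) values. Alpha of 0 is a special
// case, since the color information is gone.
static System.Drawing.Color UnPremultiply(byte b, byte g, byte r, byte a)
{
    if (a == 0)
        return System.Drawing.Color.FromArgb(0, 0, 0, 0);

    return System.Drawing.Color.FromArgb(a,
        System.Math.Min(255, (r * 255) / a),
        System.Math.Min(255, (g * 255) / a),
        System.Math.Min(255, (b * 255) / a));
}
[/source]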
4. Any advice? I've tried everything and I still can't figure it out. Looks like I'm stuck using RGBA for the time being :(
5. I created a separate thread for this issue, since it's a different topic and not really related to the OP title: http://www.gamedev.net/topic/654164-converting-systemdrawingbitmap-to-idirect2d1bitmap/
6. Like the title says, I'm trying to convert my System.Drawing.Bitmaps (with alpha) to ID2D1Bitmaps in C#, specifically SharpDX.Direct2D1.Bitmap, but I don't think that matters.

To the best of my knowledge GDI+ bitmaps are stored in a pre-multiplied BGRA format, and I need to keep this same format when converting them to D2D bitmaps, that being DXGI_FORMAT_B8G8R8A8_UNORM with D2D1_ALPHA_MODE_PREMULTIPLIED. All of the samples included with SharpDX convert the BGRA values to ARGB before creating the bitmaps, so I need a different solution.

The reason is that I'm trying to render them inside an Aero glass window using Direct2D. The only way to accomplish this is with a pre-multiplied BGRA pixel format, and according to Microsoft pre-multiplied BGRA is the most efficient way to draw in Direct2D. I normally just use ARGB format, but alpha blending doesn't work correctly on glass windows.

I came up with a few different conversion functions, but none of them seem to work properly. The alpha values are getting lost somewhere during conversion. The color channels seem to convert fine, but the resulting bitmap always has 100% alpha, even when the original image has transparency.

I don't have this problem using RGBA, where the alpha channels are converted perfectly, but when I use BGRA it doesn't work. My best guess is that it's because the values are already pre-multiplied, but I'm not sure how to convert them correctly.

The first function is the most efficient because it doesn't require locking the System.Drawing.Bitmap, but they all do the same thing. The last function confuses me the most because it copies the bits directly and it still doesn't work.

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    for (int y = 0; y < bmp.Height; y++)
    {
        for (int x = 0; x < bmp.Width; x++)
        {
            Color c = bmp.GetPixel(x, y);
            int bgra = c.B | (c.G << 8) | (c.R << 16) | (c.A << 24);
            tempStream.Write(bgra);
        }
    }
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);

    for (int y = 0; y < bmp.Height; y++)
    {
        int offset = bmpData.Stride * y;
        for (int x = 0; x < bmp.Width; x++)
        {
            byte b = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte g = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte r = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte a = Marshal.ReadByte(bmpData.Scan0, offset++);
            int bgra = b | (g << 8) | (r << 16) | (a << 24);
            tempStream.Write(bgra);
        }
    }
    bmp.UnlockBits(bmpData);
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);
    int numBytes = bmpData.Stride * bmp.Height;
    byte[] byteData = new byte[numBytes];
    IntPtr ptr = bmpData.Scan0;
    Marshal.Copy(ptr, byteData, 0, numBytes);
    bmp.UnlockBits(bmpData);
    tempStream.Write(byteData, 0, numBytes);
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]
7. Well, Unity is probably your best option, although one should never forget about p/invoke. Most C++ code can be accessed from .NET with a little extra work, though there will also be a slight performance loss. SharpDX uses p/invoke to access the DirectX APIs from .NET, but the performance loss is very minor because it's generated directly from the DirectX SDK header files. C++/CLI might be another option for you; it would allow you to develop your game in C++ and still have the power of the .NET Framework and the Common Language Infrastructure.

I've asked myself this same question many times. Eventually I ended up developing my own 3D engine for the programming experience. Every 3rd-party game engine has its pros and cons; it's really all about preference. I couldn't find a game engine that suited my needs, so I had to write my own.
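To make the p/invoke point concrete, here's a minimal example (mine, unrelated to SharpDX) of calling a native Win32 function from C#; the marshaling declared in the signature is where the small per-call overhead comes from:

[source]
using System;
using System.Runtime.InteropServices;

static class PInvokeDemo
{
    // MessageBoxW is a real export from user32.dll; the DllImport attribute
    // tells the runtime how to locate the native entry point and marshal the call.
    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern int MessageBoxW(IntPtr hWnd, string text, string caption, uint type);

    static void Main()
    {
        MessageBoxW(IntPtr.Zero, "Hello from native code", "P/Invoke", 0);
    }
}
[/source]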
8. Thanks for the response. They're both almost identical, just wrappers around the DirectX APIs, so I don't think there's a difference regarding bitmaps.

After doing my MSDN homework I learned that WindowRenderTargets created with RenderTargetType.Software only support B8G8R8A8_UNORM or DXGI_FORMAT_UNKNOWN. If the target is created with RenderTargetType.Hardware it will also support R8G8B8A8_UNORM. That was the problem all along, but now I've come across another issue.

I'm storing my bitmaps in a System.Drawing.Bitmap first and then converting them to D2D bitmaps, for two reasons. One is that the D2D bitmap must remain null until BeginDraw() is called, so I convert the bitmaps right before they are drawn for the first time. The other is that I'm also adding GDI+ support to my application for clients that don't support D2D.

To the best of my knowledge my System.Drawing.Bitmaps are PixelFormat.Format32bppPArgb, but apparently they are stored in BGRA format. All of the SharpDX samples convert the BGRA values to RGBA before creating the D2D bitmaps. To keep the original format I need to use B8G8R8A8_UNORM and ALPHA_MODE_PREMULTIPLIED. The problem is the alpha values are getting lost somewhere during conversion. The RGB channels seem to convert fine, but the bitmap always has 100% alpha even when the original image is transparent.

I don't have this problem using RGBA, where the alpha channels are converted perfectly, but when I use BGRA it doesn't work. My guess is that it's because the alpha values are pre-multiplied, but I'm not sure how to convert them correctly. I wrote three different conversion functions, but they all behave the same: the alpha values are not transferring over. The first function is the most efficient because it doesn't require locking the System.Drawing.Bitmap. The last function confuses me the most because it copies the bits directly and it still doesn't work.
[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    for (int y = 0; y < bmp.Height; y++)
    {
        for (int x = 0; x < bmp.Width; x++)
        {
            Color c = bmp.GetPixel(x, y);
            int bgra = c.B | (c.G << 8) | (c.R << 16) | (c.A << 24);
            tempStream.Write(bgra);
        }
    }
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);

    for (int y = 0; y < bmp.Height; y++)
    {
        int offset = bmpData.Stride * y;
        for (int x = 0; x < bmp.Width; x++)
        {
            byte b = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte g = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte r = Marshal.ReadByte(bmpData.Scan0, offset++);
            byte a = Marshal.ReadByte(bmpData.Scan0, offset++);
            int bgra = b | (g << 8) | (r << 16) | (a << 24);
            tempStream.Write(bgra);
        }
    }
    bmp.UnlockBits(bmpData);
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]

[source]
BitmapProperties bp = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (DataStream tempStream = new DataStream(bmp.Height * stride, false, true))
{
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);
    int numBytes = bmpData.Stride * bmp.Height;
    byte[] byteData = new byte[numBytes];
    IntPtr ptr = bmpData.Scan0;
    Marshal.Copy(ptr, byteData, 0, numBytes);
    bmp.UnlockBits(bmpData);
    tempStream.Write(byteData, 0, numBytes);
    return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bp);
}
[/source]
9. Well, I figured out the cause of the problem, but I'm unsure how to fix it entirely. It seems WindowRenderTarget just isn't compatible with the R8G8B8A8_UNorm pixel format on all systems. Using Format.B8G8R8A8_UNorm works without errors, but now I'm having problems converting the alpha component of my System.Drawing.Bitmap over to my SharpDX bitmap. The RGB values transfer over, but the alpha value is always 100% even if my bitmap has transparency. I guess that's a different topic, though.

[source]
BitmapProperties bitmapProperties = new BitmapProperties(new PixelFormat(Format.B8G8R8A8_UNorm, AlphaMode.Premultiplied));
int stride = bmp.Width * sizeof(int);
using (SharpDX.DataStream tempStream = new SharpDX.DataStream(bmp.Height * stride, false, true))
{
    BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, bmp.PixelFormat);
    int numBytes = bmpData.Stride * bmp.Height;
    byte[] byteData = new byte[numBytes];
    IntPtr ptr = bmpData.Scan0;
    Marshal.Copy(ptr, byteData, 0, numBytes);
    bmp.UnlockBits(bmpData);
    tempStream.Write(byteData, 0, numBytes);
    tempStream.Position = 0;
    return new Bitmap(renderTarget, new DrawingSize(bmp.Width, bmp.Height), tempStream, stride, bitmapProperties);
}
[/source]
10. I'm using SharpDX; I guess I should've mentioned that in the OP title, sorry. I just figured it was the same error, because although the post was in C++ it dealt with the same issue I'm having. Here's the post: http://social.msdn.microsoft.com/Forums/windowsdesktop/en-US/416004c5-6b73-45cd-a820-92b6608e9bfd/id2d1rendertargetcreatebitmap-returns-0x88982f80?forum=windowssdk

I'm creating my bitmaps with the same pixel format as my render target (R8G8B8A8_UNorm, Premultiplied), and I'm using the same code from the SharpDX samples to convert a System.Drawing.Bitmap:

[source]
public static SharpDX.Direct2D1.Bitmap ConvertBitmap(RenderTarget renderTarget, Bitmap b)
{
    BitmapProperties bitmapProperties = new BitmapProperties(new PixelFormat(Format.R8G8B8A8_UNorm, AlphaMode.Premultiplied));

    // Transform pixels from BGRA to RGBA
    int stride = b.Width * sizeof(int);
    using (SharpDX.DataStream tempStream = new SharpDX.DataStream(b.Height * stride, false, true))
    {
        BitmapData bitmapData = b.LockBits(new Rectangle(0, 0, b.Width, b.Height), ImageLockMode.ReadOnly, PixelFormat.Format32bppPArgb);

        // Convert all pixels
        for (int y = 0; y < b.Height; y++)
        {
            int offset = bitmapData.Stride * y;
            for (int x = 0; x < b.Width; x++)
            {
                // Not optimized
                byte B = Marshal.ReadByte(bitmapData.Scan0, offset++);
                byte G = Marshal.ReadByte(bitmapData.Scan0, offset++);
                byte R = Marshal.ReadByte(bitmapData.Scan0, offset++);
                byte A = Marshal.ReadByte(bitmapData.Scan0, offset++);
                int rgba = R | (G << 8) | (B << 16) | (A << 24);
                tempStream.Write(rgba);
            }
        }
        b.UnlockBits(bitmapData);
        tempStream.Position = 0;
        return new SharpDX.Direct2D1.Bitmap(renderTarget, new DrawingSize(b.Width, b.Height), tempStream, stride, bitmapProperties);
    }
}
[/source]

I'm only stumped because the code runs perfectly in most situations. I can only reproduce the error on a single PC with ATI hardware (a 6970) installed, and that card runs D2D/D3D fine, so it's not a compatibility issue as far as I can tell. I only get the error when I try to draw bitmaps to a WindowRenderTarget.
11. What I eventually did is create a MemoryBitmap wrapper class that stores both a System.Drawing.Bitmap and a SharpDX.Direct2D1.Bitmap, plus a reference to my render target. The SharpDX bitmap remains null until the first time it is drawn or until the render target is reset (ensuring that BeginDraw is called first); then it is converted from the managed bitmap.

Modified code:
[source]
// Drawing the bitmap
if (myBitmap.RenderTarget == null || !p_RenderTarget.Equals(myBitmap.RenderTarget))
    myBitmap.Reset(p_RenderTarget);
if (myBitmap.NativeBitmap != null)
    p_RenderTarget.DrawBitmap(myBitmap.NativeBitmap, new RectangleF(r.Left, r.Top, r.Right, r.Bottom), alpha, BitmapInterpolationMode.Linear);

// Reset function
void Reset(RenderTarget renderTarget)
{
    if (p_NativeBitmap != null)
    {
        p_NativeBitmap.Dispose();
        p_NativeBitmap = null;
    }

    p_RenderTarget = renderTarget;
    p_NativeBitmap = ConvertBitmap(p_RenderTarget, p_ManagedBitmap);
}
[/source]

Again, this runs fine on my system, but on the ATI system I'm now getting UNSUPPORTED_PIXEL_FORMAT when creating my WindowRenderTarget. I'm so confused; what am I doing wrong?
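For context, here's roughly the shape of the wrapper class (a reconstruction from the fragments above; anything not shown in those snippets is illustrative, and ConvertBitmap is the conversion function from the earlier post):

[source]
// Sketch of the MemoryBitmap wrapper described above.
class MemoryBitmap : System.IDisposable
{
    private System.Drawing.Bitmap p_ManagedBitmap;          // always available (GDI+ fallback)
    private SharpDX.Direct2D1.Bitmap p_NativeBitmap;        // stays null until first draw
    private SharpDX.Direct2D1.RenderTarget p_RenderTarget;  // target the native bitmap was created on

    public MemoryBitmap(System.Drawing.Bitmap managed) { p_ManagedBitmap = managed; }

    public SharpDX.Direct2D1.RenderTarget RenderTarget { get { return p_RenderTarget; } }
    public SharpDX.Direct2D1.Bitmap NativeBitmap { get { return p_NativeBitmap; } }

    // Re-create the native bitmap against a (possibly new) render target.
    public void Reset(SharpDX.Direct2D1.RenderTarget renderTarget)
    {
        if (p_NativeBitmap != null)
        {
            p_NativeBitmap.Dispose();
            p_NativeBitmap = null;
        }
        p_RenderTarget = renderTarget;
        p_NativeBitmap = ConvertBitmap(p_RenderTarget, p_ManagedBitmap); // from the earlier post
    }

    public void Dispose()
    {
        if (p_NativeBitmap != null) p_NativeBitmap.Dispose();
        if (p_ManagedBitmap != null) p_ManagedBitmap.Dispose();
    }
}
[/source]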
12. Here's some code to reproduce the error...

Option #1 (WindowRenderTarget):
[source]
// Creating the render target
RenderTargetProperties properties = new RenderTargetProperties(new PixelFormat(Format.R8G8B8A8_UNorm, AlphaMode.Premultiplied));

HwndRenderTargetProperties hWndProperties = new HwndRenderTargetProperties();
hWndProperties.Hwnd = p_Owner.Handle;
hWndProperties.PixelSize = new DrawingSize(p_Owner.Width, p_Owner.Height);
hWndProperties.PresentOptions = PresentOptions.None;

p_FactoryD2D = new Factory(FactoryType.SingleThreaded);
p_FactoryDWrite = new DirectWrite.Factory(DirectWrite.FactoryType.Isolated);
p_RenderTarget = new WindowRenderTarget(p_FactoryD2D, properties, hWndProperties);

// Resizing the render target
p_RenderTarget.Resize(new DrawingSize(p_Owner.Width, p_Owner.Height));

// Drawing the bitmap
p_RenderTarget.DrawBitmap(myBitmap, new RectangleF(r.Left, r.Top, r.Right, r.Bottom), alpha, BitmapInterpolationMode.Linear);
[/source]

Option #2 (Direct3D 10):
[source]
// Creating the render target
SwapChainDescription desc = new SwapChainDescription();
desc.BufferCount = 2;
desc.Flags = SwapChainFlags.None;
desc.IsWindowed = true;
desc.ModeDescription = new ModeDescription(0, 0, new Rational(60, 1), Format.R8G8B8A8_UNorm);
desc.ModeDescription.Scaling = DisplayModeScaling.Unspecified;
desc.OutputHandle = p_Owner.Handle;
desc.SampleDescription = new SampleDescription(1, 0);
desc.SwapEffect = SwapEffect.Sequential;
desc.Usage = Usage.RenderTargetOutput;

Device1.CreateWithSwapChain(DriverType.Hardware, DeviceCreationFlags.BgraSupport, desc, FeatureLevel.Level_10_0, out p_Device, out p_SwapChain);

DXGI.Factory factory = p_SwapChain.GetParent<DXGI.Factory>();
factory.MakeWindowAssociation(p_Owner.Handle, WindowAssociationFlags.IgnoreAll);

p_BackBuffer = Texture2D.FromSwapChain<Texture2D>(p_SwapChain, 0);
p_BackBufferView = new RenderTargetView(p_Device, p_BackBuffer);
p_FactoryD2D = new Factory(FactoryType.SingleThreaded);
p_FactoryDWrite = new DirectWrite.Factory(DirectWrite.FactoryType.Isolated);

using (Surface surface = p_BackBuffer.QueryInterface<Surface>())
{
    p_RenderTarget = new RenderTarget(p_FactoryD2D, surface, new RenderTargetProperties(new PixelFormat(Format.Unknown, AlphaMode.Premultiplied)));
}
p_Device.OutputMerger.SetTargets(p_BackBufferView);

// Resizing the render target (requires a full reset)
p_SwapChain.ResizeBuffers(p_SwapChain.Description.BufferCount, p_Owner.Width, p_Owner.Height, p_SwapChain.Description.ModeDescription.Format, p_SwapChain.Description.Flags);

DXGI.Factory factory = p_SwapChain.GetParent<DXGI.Factory>();
factory.MakeWindowAssociation(p_Owner.Handle, WindowAssociationFlags.IgnoreAll);

p_BackBuffer = Texture2D.FromSwapChain<Texture2D>(p_SwapChain, 0);
p_BackBufferView = new RenderTargetView(p_Device, p_BackBuffer);
p_FactoryD2D = new Factory(FactoryType.SingleThreaded);
p_FactoryDWrite = new DirectWrite.Factory(DirectWrite.FactoryType.Isolated);

using (Surface surface = p_BackBuffer.QueryInterface<Surface>())
{
    p_RenderTarget = new RenderTarget(p_FactoryD2D, surface, new RenderTargetProperties(new PixelFormat(Format.Unknown, AlphaMode.Premultiplied)));
}
p_Device.OutputMerger.SetTargets(p_BackBufferView);

// Drawing the bitmap
p_RenderTarget.DrawBitmap(myBitmap, new RectangleF(r.Left, r.Top, r.Right, r.Bottom), alpha, BitmapInterpolationMode.Linear);
[/source]

Option 1 obviously uses less code and doesn't require resetting the render target when the window is resized, which makes it the preferred option. I just can't figure out how to fix this UNSUPPORTED_PIXEL_FORMAT error; it only happens on a single PC, and I can't reproduce it on my machine or anywhere else.
13. Awesome! You're so right, there is a lack of good SharpDX tutorials on the web. These tutorials are great; I'd like to see more!
14. I'm experiencing a strange issue with Direct2D. I figured someone here might be able to help, since you guys always help with my Direct3D questions.

I'm trying to draw some ID2D1Bitmaps to my window using SharpDX and a WindowRenderTarget object. The bitmaps are first loaded into a SharpDX.Direct2D1.Bitmap object, then drawn to the screen using myRenderTarget.DrawBitmap.

The code runs fine on my system and other Nvidia systems, but I recently tested it on an ATI device and I'm getting a D2DERR_UNSUPPORTED_PIXEL_FORMAT error.

I found this post on another site that was very helpful:

"In order to solve the issue do this: 1) For Windows 7 Direct2D 1.0 (Direct2D before the Platform Update), put m_pRenderTarget->BeginDraw() before creating any bitmaps; 2) For Windows 8 Direct2D 1.1 (or Windows 7 with the Platform Update), put m_pRenderTarget->SetTarget (in case you have ID2D1DeviceContext as the type of m_pRenderTarget), or the same m_pRenderTarget->BeginDraw() (in case you have ID2D1HwndRenderTarget as the type of m_pRenderTarget). The problem is that right after you create m_pRenderTarget, the render target does not in fact have a DXGI surface to draw to, and because of that the render target is unable to find a pixel format for new bitmaps. That's why you got WINCODEC_ERR_UNSUPPORTEDPIXELFORMAT."

My issue is that I can't call BeginDraw before I create my bitmaps, because they are created before the window is ever drawn. So instead of using a WindowRenderTarget I tried using Direct3D 10 for the backend, like the samples included with SharpDX, and calling myDevice.OutputMerger.SetTargets right after the device is created. This seems to work except for one problem: the render target and back buffer must be recreated every time the window is resized, and in turn my bitmaps must be recreated too. This results in very noticeable lag when resizing the window. With a WindowRenderTarget I can just call myRenderTarget.Resize without having to recreate anything, so I would prefer to use a WindowRenderTarget instead.

Does anyone know of a solution that allows me to use a WindowRenderTarget and create my bitmaps without calling BeginDraw? If not, is it possible to resize the D3D swap chain without having to reset the render target? Any help would be greatly appreciated.
  15. Mimicking ShowDialog

Thanks krippy2k8, you are exactly right, and I figured out why the app appears to freeze inside the loop. I am using the Application.Idle event for my main loop, so obviously the application is not idle during the while loop, even when DoEvents() is used. Here is my working code for future reference.

ZWindow:
[source]
public bool Idle
{
    get
    {
        NativeMethods.Message msg;
        return !NativeMethods.PeekMessage(out msg, IntPtr.Zero, 0, 0, 0);
    }
}

private void Application_Idle(object sender, EventArgs e)
{
    while (this.Idle)
        Render();
}

internal void Render()
{
    if (!p_Direct3D.Valid)
        p_Direct3D.ResetDevice();

    p_Direct3D.BeginRender(this.BackColor);
    foreach (ZControl control in p_Controls)
    {
        control.Render();
    }
    p_Direct3D.EndRender();
}
[/source]

ZDialog:
[source]
public DialogResult ShowDialog()
{
    Show();
    while (p_DialogResult == DialogResult.None)
    {
        Application.DoEvents();
        ParentWindow.Render();
    }
    return p_DialogResult;
}
[/source]

Thanks again!
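For completeness, here's a sketch of the NativeMethods declarations the Idle property relies on (the standard Win32 PeekMessage import; reconstructed here, since it wasn't shown in the snippet above):

[source]
using System;
using System.Runtime.InteropServices;

internal static class NativeMethods
{
    // Managed mirror of the Win32 MSG structure.
    [StructLayout(LayoutKind.Sequential)]
    public struct Message
    {
        public IntPtr hWnd;
        public uint msg;
        public IntPtr wParam;
        public IntPtr lParam;
        public uint time;
        public int pointX;
        public int pointY;
    }

    // With flags = 0 (PM_NOREMOVE), this only checks whether a message is
    // waiting without removing it, which is how Idle detects an empty queue.
    [DllImport("user32.dll")]
    [return: MarshalAs(UnmanagedType.Bool)]
    public static extern bool PeekMessage(out Message msg, IntPtr hWnd,
        uint messageFilterMin, uint messageFilterMax, uint flags);
}
[/source]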