LowCalorieSoftDrink

Memory Device Contexts


Is it possible to create and render to a memory device context in OpenGL? When I try, my computer restarts at the point where I set the pixel format. Below is the code (it's in OO-Pascal, but it should be pretty obvious what's going on to all you C++ people):
procedure TForm1.FormCreate(Sender: TObject);
var
  DC: HDC;
  RC: HGLRC;       // rendering context, to be created once the format is set
  Bitmap: HBITMAP;
  PixFormat: Integer;
  PixDesc: TPixelFormatDescriptor;
begin

  FillChar( PixDesc, SizeOf(PixDesc), 0 );

  with PixDesc do
  begin
    nSize        := SizeOf(TPixelFormatDescriptor);
    nVersion     := 1;
    dwFlags      := PFD_DRAW_TO_WINDOW or PFD_SUPPORT_OPENGL or PFD_DOUBLEBUFFER;
    iPixelType   := PFD_TYPE_RGBA;
    cColorBits   := 32;
    cDepthBits   := 16;
    cStencilBits := 0;
    iLayerType   := PFD_MAIN_PLANE;
  end;

  // Memory DC with a 1x1 compatible bitmap selected into it
  DC := CreateCompatibleDC(0);
  Bitmap := CreateCompatibleBitmap( DC, 1, 1 );
  SelectObject( DC, Bitmap );

  PixFormat := ChoosePixelFormat( DC, @PixDesc );

  // This is the call that takes the machine down
  SetPixelFormat( DC, PixFormat, @PixDesc );

end;
Thanks for any help you can offer

When you set up your pixel format descriptor, you can choose to render to a bitmap rather than a window (one of those unaccelerated paths that only the generic software implementation supports). That means PFD_DRAW_TO_BITMAP instead of PFD_DRAW_TO_WINDOW, no PFD_DOUBLEBUFFER (bitmap rendering is single-buffered), and a bitmap in the memory DC whose bit depth matches cColorBits. Note that CreateCompatibleBitmap on a fresh memory DC gives you a 1-bit monochrome bitmap, which can never match a 32-bit format.
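
Here is a minimal sketch of that bitmap route, in the same OO-Pascal style as your snippet; the 256x256 size, the DIB section, and the absence of error checking are my assumptions rather than tested code:

procedure RenderToMemoryBitmap;
var
  DC: HDC;
  RC: HGLRC;
  Bitmap: HBITMAP;
  Bits: Pointer;
  BmpInfo: TBitmapInfo;
  PixDesc: TPixelFormatDescriptor;
  PixFormat: Integer;
begin
  DC := CreateCompatibleDC(0);

  // Use a DIB section whose depth matches cColorBits, instead of the
  // 1-bit bitmap that CreateCompatibleBitmap would give on a memory DC
  FillChar( BmpInfo, SizeOf(BmpInfo), 0 );
  with BmpInfo.bmiHeader do
  begin
    biSize        := SizeOf(TBitmapInfoHeader);
    biWidth       := 256;
    biHeight      := 256;
    biPlanes      := 1;
    biBitCount    := 32;
    biCompression := BI_RGB;
  end;
  Bitmap := CreateDIBSection( DC, BmpInfo, DIB_RGB_COLORS, Bits, 0, 0 );
  SelectObject( DC, Bitmap );

  FillChar( PixDesc, SizeOf(PixDesc), 0 );
  with PixDesc do
  begin
    nSize      := SizeOf(TPixelFormatDescriptor);
    nVersion   := 1;
    // PFD_DRAW_TO_BITMAP, not PFD_DRAW_TO_WINDOW, and no PFD_DOUBLEBUFFER:
    // bitmap rendering is single-buffered
    dwFlags    := PFD_DRAW_TO_BITMAP or PFD_SUPPORT_OPENGL;
    iPixelType := PFD_TYPE_RGBA;
    cColorBits := 32;
    cDepthBits := 16;
    iLayerType := PFD_MAIN_PLANE;
  end;

  PixFormat := ChoosePixelFormat( DC, @PixDesc );
  SetPixelFormat( DC, PixFormat, @PixDesc );

  RC := wglCreateContext(DC);
  wglMakeCurrent(DC, RC);
  // ... draw here; the output lands in the DIB section's Bits pointer ...
  wglMakeCurrent(0, 0);
  wglDeleteContext(RC);
  DeleteObject(Bitmap);
  DeleteDC(DC);
end;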

A really cheap and dirty workaround is to make a panel, make it invisible, and use its handle for the HDC, something like:

GetDC(Panel1->Handle);
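
Fleshed out in Delphi, that might look something like the following; Panel1, the pixel-format values, and the lack of error checking are all placeholders:

procedure TForm1.SetupHiddenPanelGL;
var
  DC: HDC;
  RC: HGLRC;
  PixDesc: TPixelFormatDescriptor;
  PixFormat: Integer;
begin
  Panel1.Visible := False;       // hidden, but its window handle still exists

  DC := GetDC(Panel1.Handle);    // a real window DC, which drivers accept

  FillChar( PixDesc, SizeOf(PixDesc), 0 );
  with PixDesc do
  begin
    nSize      := SizeOf(TPixelFormatDescriptor);
    nVersion   := 1;
    dwFlags    := PFD_DRAW_TO_WINDOW or PFD_SUPPORT_OPENGL;  // single-buffered
    iPixelType := PFD_TYPE_RGBA;
    cColorBits := 32;
    cDepthBits := 16;
    iLayerType := PFD_MAIN_PLANE;
  end;

  PixFormat := ChoosePixelFormat( DC, @PixDesc );
  SetPixelFormat( DC, PixFormat, @PixDesc );

  RC := wglCreateContext(DC);
  wglMakeCurrent(DC, RC);
end;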

You can render to it pretty easily then. If you need to read the data back, I recommend using glReadPixels with either GL_BITMAP or GL_UNSIGNED_BYTE as the data type, rather than reading from the panel's bitmap.
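
A minimal read-back sketch with GL_UNSIGNED_BYTE, assuming a 256x256 viewport, a current rendering context, and the standard OpenGL unit (the sizes are placeholders):

procedure ReadBackPixels;
var
  Pixels: array of Byte;
begin
  SetLength(Pixels, 256 * 256 * 4);      // RGBA, one byte per channel
  glPixelStorei(GL_PACK_ALIGNMENT, 1);   // no row padding in the returned data
  glReadPixels(0, 0, 256, 256, GL_RGBA, GL_UNSIGNED_BYTE, @Pixels[0]);
end;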

Either way, you won't need to double- or triple-buffer it, since plain glClear calls and glReadPixels don't require it.
