how to flush a 3D buffer into a bitmap

Started by Gaspos
2 comments, last by Gaspos 15 years, 5 months ago
Hi, after trying some "hello world" OpenGL programs, I wonder how to flush a "3D buffer" into any kind of RGB bitmap. Let's say I drew some polygons with cute colors and nice lights, but instead of rendering them on screen I just want to get the generated image into a good old RGB bitmap. Do you have any idea what to do? Maybe something with "glBufferData"? Thanks in advance... Gaspos
Quote: Original post by Gaspos
Let's say I drew some polygons with cute colors and nice lights but instead of render them on screen I just want to get the generated image into a good old RGB bitmap.
You can use glReadPixels to read back some or all of the current render buffer into your own memory buffer.
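A rough sketch (untested here, and assuming a double-buffered RGB context whose size matches sizex/sizey) would be something like:

// Sketch: read the frame you just rendered into a tightly packed RGB array.
#include <vector>
#include <windows.h>
#include <GL/gl.h>

std::vector<unsigned char> readFrameRGB(int sizex, int sizey)
{
    std::vector<unsigned char> pixels(sizex * sizey * 3);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);   // rows tightly packed, no 4-byte row padding
    glReadBuffer(GL_BACK);                 // read the buffer you just rendered into
    glReadPixels(0, 0, sizex, sizey, GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);
    return pixels;                         // note: rows come back bottom-up
}

Call it after your drawing is finished but before you swap, then write the bytes out in whatever bitmap format you like (flip the rows if your format expects top-down order).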

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

You can take a look at the render-to-texture method in this NeHe lesson. It doesn't show how to write this to a file, but you can find that on the net pretty easily.
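The core of that lesson is just a copy from the framebuffer into a texture, roughly like this (sketch only; texId, texWidth and texHeight are assumed to come from a texture you already created with glGenTextures/glBindTexture, sized to fit inside the window):

// Sketch of copy-to-texture, the pre-FBO technique that lesson is built on.
glBindTexture(GL_TEXTURE_2D, texId);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, texWidth, texHeight, 0);
// texId now holds what was just rendered; clear the screen and draw the
// final pass with that texture bound.

That said, if you only need the pixels in system memory to write a bitmap file, glReadPixels as mentioned above is the simpler route.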
Thanks guys!
“glReadPixels” was exactly what I needed!

But now that this is working, I’ve run into a new tricky issue with the OpenGL context…
I also posted this message as a new thread.

Here is the whole story:
I’m developing (with Visual Studio C++ 2005) a screen saver (yet another one, sigh…) that is built on plugins:
The main part (.exe) manages all the windows/display/events stuff.
The plugins (.dll) produce the bitmap to be displayed (I already made some nice ones).
A plugin has a few entry points, such as “init” (called once, at the beginning) and “next” (called repeatedly until the mouse is moved); a rough sketch of that interface follows below.
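Roughly, the interface each plugin exposes looks like this (a simplified sketch with made-up names, just to show the structure, not my exact code):

// Hypothetical sketch of the plugin entry points the .exe resolves
// with LoadLibrary/GetProcAddress.
extern "C" __declspec(dllexport) int plugin_init(int sizex, int sizey);   // called once at startup
extern "C" __declspec(dllexport) int plugin_next(unsigned char* bitmap);  // called repeatedly, fills sizex*sizey*3 RGB bytes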

Now I’m trying to create a new plugin (i.e. DLL) that uses OpenGL to compute 3D images. (That’s why I needed to get the bitmap back from OpenGL.)
In the “init” function I initialize all the OpenGL stuff:
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
glutInitWindowSize(sizex, sizey);
glutCreateWindow("dummy");          // GLUT creates the OpenGL context here
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glShadeModel(GL_SMOOTH);
glFrontFace(GL_CW);
glCullFace(GL_BACK);
glEnable(GL_CULL_FACE);
blah blah…

In the “next” function I render the scene and get the bitmap:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
glBegin(GL_QUADS);
glVertex3d …
blah blah…
glEnd();
glFlush();
glutSwapBuffers();
glutPostRedisplay();
glReadPixels(0, 0, sizex, sizey, GL_RGB, GL_UNSIGNED_BYTE, bitmap);

So… here is the issue:
In the “next” function, nothing works… the whole OpenGL context (i.e. its state) seems to be lost.
If I hack my code and call “next” directly from “init”, it works fine.
It is as if returning from a DLL function makes OpenGL lose its context.
I don’t know much about calling a DLL (OpenGL32.dll) from another DLL (my plugin)…
Do you have any idea how OpenGL manages its own context?
I thought that since my DLL is loaded in memory, the DLLs it depends on would also be kept in memory with their own context… but I may be wrong…
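To illustrate what I mean, I guess one could check at the top of “next” whether any context is still current (just a sketch using the WGL calls; g_hdc and g_hglrc are hypothetical globals that “init” would have to save via wglGetCurrentDC()/wglGetCurrentContext()):

// Sketch only: detect whether an OpenGL context is current on this thread
// when "next" is entered, and rebind the one "init" created if not.
#include <windows.h>

static HDC   g_hdc;     // saved at the end of "init" with wglGetCurrentDC()
static HGLRC g_hglrc;   // saved at the end of "init" with wglGetCurrentContext()

void ensure_context_current()
{
    if (wglGetCurrentContext() == NULL)
        wglMakeCurrent(g_hdc, g_hglrc);   // make the saved context current again
}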

Thanks for any advice.

This topic is closed to new replies.
