shaade

OpenGL render to vertex array with EXT_pixel_buffer_object


Hello everyone! This is my first post, so please don't be too hard on me if I say something silly.

I'm trying to implement render-to-vertex-array functionality in a platform-independent way. The idea is to pass the geometry to the graphics card (an NVidia GeForce 6600GT) as a texture, render it to a quad with a fragment program (written in GLSL), read it back from the framebuffer into graphics memory, and then either alter it with a vertex program or render it directly as a vertex array through the standard OpenGL fixed-function pipeline.

After searching quite extensively I found that the EXT_pixel_buffer_object extension should let me do exactly what I want... but (there's always a but) I can't get anything except a black screen. With an OpenGL debugger I can only confirm that my geometry texture (GL_RGB) is properly loaded on the card, so the problem must be either in the readback into the buffer object or in the specification of the vertex array. Here is a fragment of my code illustrating the problem:
// FIRST PASS: Render texture to a quad
//---------------------------------------
glUseProgramObjectARB( myGLSLProgram );

// Create the pixel pack buffer that will receive the framebuffer readback.
// The size is in bytes: numOfVertices RGB texels read back as floats
// (this assumes numOfVertices == textureSize * textureSize).
GLuint myBufferIDs[1];
glGenBuffersARB( 1, myBufferIDs );
glBindBufferARB( GL_PIXEL_PACK_BUFFER_EXT, myBufferIDs[0] );
glBufferDataARB( GL_PIXEL_PACK_BUFFER_EXT, 3 * numOfVertices * sizeof(GLfloat),
                 NULL, GL_STREAM_COPY_ARB );  // GL fills it, GL draws from it, as in the spec example
glBindBufferARB( GL_PIXEL_PACK_BUFFER_EXT, 0 );

glDrawBuffer( GL_FRONT );
renderTextureToQuad();  // draws a screen-aligned quad (sketched below)

// SECOND PASS: Read back from framebuffer and render vertex array
//----------------------------------------------------------------
glUseProgramObjectARB( 0 );

// With a pack buffer bound, glReadPixels writes into the buffer object;
// the last argument is a byte offset into the buffer, not a client pointer.
glBindBufferARB( GL_PIXEL_PACK_BUFFER_EXT, myBufferIDs[0] );
glReadBuffer( GL_FRONT );
glReadPixels( 0, 0, textureSize, textureSize, GL_RGB, GL_FLOAT, (char *)NULL );
glBindBufferARB( GL_PIXEL_PACK_BUFFER_EXT, 0 );

// Rebind the same buffer object as a vertex array source and draw from it.
glBindBufferARB( GL_ARRAY_BUFFER_ARB, myBufferIDs[0] );
glEnableClientState( GL_VERTEX_ARRAY );
glVertexPointer( 3, GL_FLOAT, 0, (char *)NULL );
for( int stripNr = 0; stripNr < totalNumOfStrips; stripNr++ ) {
    // firstVertIndx / thisStripNumOfVerts come from my strip bookkeeping
    glDrawArrays( GL_TRIANGLE_STRIP, firstVertIndx, thisStripNumOfVerts );
}
glDisableClientState( GL_VERTEX_ARRAY );
glBindBufferARB( GL_ARRAY_BUFFER_ARB, 0 );
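For completeness, renderTextureToQuad() just draws a screen-aligned quad over a textureSize x textureSize viewport so that each texel of the geometry texture lands on exactly one pixel, and the fragment program is a plain pass-through. Roughly like this (simplified; myGeometryTexture and the matrix setup are placeholders, not my exact code):

// Sketch of renderTextureToQuad(): one texel -> one pixel
void renderTextureToQuad()
{
    glViewport( 0, 0, textureSize, textureSize );
    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    glOrtho( 0.0, 1.0, 0.0, 1.0, -1.0, 1.0 );
    glMatrixMode( GL_MODELVIEW );
    glLoadIdentity();

    glBindTexture( GL_TEXTURE_2D, myGeometryTexture );  // placeholder name
    glBegin( GL_QUADS );
        glTexCoord2f( 0.0f, 0.0f ); glVertex2f( 0.0f, 0.0f );
        glTexCoord2f( 1.0f, 0.0f ); glVertex2f( 1.0f, 0.0f );
        glTexCoord2f( 1.0f, 1.0f ); glVertex2f( 1.0f, 1.0f );
        glTexCoord2f( 0.0f, 1.0f ); glVertex2f( 0.0f, 1.0f );
    glEnd();
}

// The fragment program bound in myGLSLProgram is essentially a pass-through:
//   uniform sampler2D geometryTex;
//   void main() { gl_FragColor = texture2D( geometryTex, gl_TexCoord[0].st ); }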

So what I'm doing is pretty much a reproduction of the example from the extension specification, and I think it should work. Does anyone have experience with this extension and render-to-vertex-array? Supposing that I'm doing something wrong and the extension can actually be used for this purpose, are there any restrictions on the use of vertex arrays (indexed vertices, interleaved attributes, ...)? Thank you for any help/comments!
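By the way, one thing I still plan to try is mapping the pack buffer right after the glReadPixels() call to see whether any sensible data actually arrived. Something along these lines (just a sketch, using glMapBufferARB from ARB_vertex_buffer_object):

// Inspect the first few vertices that glReadPixels packed into the buffer
glBindBufferARB( GL_PIXEL_PACK_BUFFER_EXT, myBufferIDs[0] );
GLfloat *data = (GLfloat *)glMapBufferARB( GL_PIXEL_PACK_BUFFER_EXT, GL_READ_ONLY_ARB );
if( data != NULL ) {
    for( int i = 0; i < 4; i++ )
        printf( "vertex %d: %f %f %f\n", i, data[3*i], data[3*i+1], data[3*i+2] );
    glUnmapBufferARB( GL_PIXEL_PACK_BUFFER_EXT );
}
glBindBufferARB( GL_PIXEL_PACK_BUFFER_EXT, 0 );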
