SagoO

Member Since 17 Mar 2011
Offline Last Active Apr 16 2011 01:17 AM

Topics I've Started

Failing to read the offscreen buffer

07 April 2011 - 08:27 AM

Hi,

I have created an offscreen rendering buffer. I have a fragment program as simple as this:

static const char *fragment_source =
	"void main(void)"
	"{"
	"    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);"
	"}";

Then I create an offscreen rendering FBO:

	/* Create an FBO to perform offscreen rendering */
	glGenFramebuffersEXT(1, &fbo);
	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

	/* Create a renderbuffer to use as the color attachment */
	glGenRenderbuffersEXT(1, &renderbuffer);
	glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, renderbuffer);

	/* Arguments are (width, height); they must match the
	   glReadPixels call below */
	glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA, size, noOfpacket);
	glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, renderbuffer);

	status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);

	switch (status)
	{
		case GL_FRAMEBUFFER_COMPLETE_EXT:
			printf("***FBO complete***\n\n");
			break;
		default:
			/* Framebuffer status codes are not GL errors, so
			   gluErrorString() does not apply; print the raw value. */
			fprintf(stderr, "Error: framebuffer incomplete, status = 0x%x\n", status);
			break;
	}

	/* The FBO stays bound so subsequent rendering goes offscreen */
	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
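
Between the setup and the readback, the draw step that should fill the attachment looks roughly like this (a minimal sketch, assuming `program` is the linked shader program from above and identity modelview/projection matrices):

	/* Make the raster size match the renderbuffer */
	glViewport(0, 0, size, noOfpacket);
	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
	glClear(GL_COLOR_BUFFER_BIT);

	glUseProgram(program);

	/* Full-screen quad: with identity matrices, coordinates in
	   [-1, 1] cover every pixel, so the fragment shader runs
	   once per pixel of the attachment. */
	glBegin(GL_QUADS);
		glVertex2f(-1.0f, -1.0f);
		glVertex2f( 1.0f, -1.0f);
		glVertex2f( 1.0f,  1.0f);
		glVertex2f(-1.0f,  1.0f);
	glEnd();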

Then, I read the buffer:

	/* Read values back */
	/* Set the target framebuffer attachment to read */
	glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);

	/* Read pixels from the framebuffer into a CPU buffer.
	   GL_RGBA means 4 bytes per pixel, so `result` must hold
	   at least size * noOfpacket * 4 bytes. */
	glReadPixels(0, 0, size, noOfpacket, GL_RGBA, GL_UNSIGNED_BYTE, result);
	checkError("glReadPixels error!");

	printf("Data in frame buffer:\n");
	for (i = 0; i < noOfpacket * size * 4; i++)
		printf("%d ", result[i]);
	printf("\n\n");

	glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

Well, I get all zeros from the framebuffer every time. That should be impossible, since my fragment shader outputs something. I never unbound my FBO, so the shader should render to the offscreen FBO. Has anyone faced this problem before?

Help please... thanks for any reply.

Lack of understanding of fragment shaders?

24 March 2011 - 10:47 PM

I have a 2048x2048 texture.
I want to process the texture line by line in parallel, but I don't know how to write my shader program.
A shader can be used to process each vertex in parallel, but I don't understand how that works. Can anyone explain further?
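
From what I have read so far, the usual fit here is a fragment shader rather than a vertex shader: drawing a textured full-screen quad makes the fragment program run once per output pixel, so every texel of the 2048x2048 texture gets processed in parallel. Is something like this sketch the right direction (the sampler name and the per-texel operation are just placeholders)?

	static const char *process_source =
		"uniform sampler2D tex;"	/* the 2048x2048 input texture */
		"void main(void)"
		"{"
		/* Each fragment fetches exactly one texel; all fragment
		   invocations run in parallel on the GPU. */
		"    vec4 texel = texture2D(tex, gl_TexCoord[0].st);"
		"    gl_FragColor = texel;"	/* per-texel processing goes here */
		"}";

Each quad vertex would carry a texture coordinate from (0,0) to (1,1), so the interpolated gl_TexCoord[0] walks the whole texture, one row of fragments per texture line.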

How to read a vertex array in a vertex shader?

17 March 2011 - 11:44 AM

Hi,

I have a program which sends thousands of arrays, each 2048 bytes in size. I use VBOs here.

	for (p = 0; p < noOfpacket; p++) {

		/* Copy one 2048-byte packet into the staging array */
		for (q = 0; q < 2048; q++) {
			gpu_payload[q] = packet_payload[p][q];
		}

		/* One VBO per packet; sizeof(gpu_payload) is correct here
		   because gpu_payload is a fixed-size array, not a pointer. */
		glGenBuffersARB(1, &vboID[p]);
		glBindBufferARB(GL_ARRAY_BUFFER_ARB, vboID[p]);
		glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(gpu_payload), gpu_payload, GL_STATIC_DRAW_ARB);

	}

After I send the data into the buffers, I want to process each of the arrays using a vertex shader. How do I manipulate the vertex array at a lower level?
Is the vertex shader able to process each buffer object in parallel? That is what I want to achieve (see the sketch below for what I have in mind).
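
For reference, here is how I currently imagine pointing a vertex shader at one of these buffers (a minimal sketch; the 4-unsigned-bytes-per-vertex layout, attribute index 0, and GL_POINTS are assumptions for illustration only):

	/* Bind one packet's VBO and describe its layout: 2048 bytes
	   read as 512 vertices of 4 unsigned bytes each (assumed). */
	glBindBufferARB(GL_ARRAY_BUFFER_ARB, vboID[p]);
	glEnableVertexAttribArray(0);
	glVertexAttribPointer(0, 4, GL_UNSIGNED_BYTE, GL_FALSE, 0, (void *)0);

	/* One draw call: the vertex shader is invoked once per vertex,
	   and those invocations run in parallel on the GPU. */
	glDrawArrays(GL_POINTS, 0, 512);

	glDisableVertexAttribArray(0);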

Help please =.=
