# FBO: Render to Texture not working!

## Recommended Posts

Okay, I wish I had learned this before, but for my mobile game, I need to render my scene to a texture on devices with smaller resolutions such as the iPhone.  Since my game works best on mobile and touch-oriented devices, that's primarily what I'm aiming for.  Now, the biggest issue with certain mobile devices is screen resolution.  A 480x320 resolution is far too small, so my idea is to double the effective size by rendering to a texture (FBO) and scaling touch locations accordingly.  It's kind of hard to explain, but my game requires lots of rapid movements, which need a fairly large screen resolution to pull off, and there's no need for this on iPads and the like.  If there's a better solution, feel free to share it.

Once again, I hate dumping code on you all, but I really must get this working asap.  These are the steps I've taken:

1. Create the FBO, RBO, and a texture, and bind them together.

2. Clear the screen to white.

3. Bind the FBO.

4. Clear the screen and draw as normal.

5. Unbind the FBO.

6. Draw a screen sized quad with the results.

So far, whenever I try, nothing renders.

FBO creation and usage:

```c
bool create_render_target( int width, int height, rendertarget_t* rt )
{
    glGenFramebuffers(1, &rt->fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, rt->fbo);

    glGenRenderbuffers(1, &rt->rbo);
    glBindRenderbuffer(GL_RENDERBUFFER, rt->rbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rt->rbo);

    glGenTextures(1, &rt->tex);
    glBindTexture(GL_TEXTURE_2D, rt->tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, rt->tex, 0);

    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if(status != GL_FRAMEBUFFER_COMPLETE)
    {
        printf( "ogldrv_t::create_render_target(): failed to make complete framebuffer object 0x%X\n", status);
        return false;
    }

    glBindFramebuffer( GL_FRAMEBUFFER, 0 );

    return true;
}

void set_render_target( rendertarget_t* rt )
{
    GLenum error = glGetError();

    if( rt )
        glBindFramebuffer( GL_FRAMEBUFFER, rt->fbo );
    else
        glBindFramebuffer( GL_FRAMEBUFFER, 0 );

    if( (error = glGetError()) != GL_NO_ERROR )
        printf( "Error binding render target!\n" );
}
```



Example usage:

```cpp
This->m_ogldrv->set_render_target( &rt );
This->reshape_func( width, height );
This->display_func();
This->m_ogldrv->set_render_target( NULL );
```
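One thing worth double-checking around that usage (the reshape_func call may already cover it): the viewport must match the FBO size while it is bound, and be restored afterwards, or the off-screen pass draws into the wrong region. A sketch of the intended sequence, with hypothetical names standing in for the poster's own code:

```c
/* Hypothetical render-to-texture pass; rt, draw_scene() and
 * draw_fullscreen_quad() stand in for the poster's own code. */
glBindFramebuffer(GL_FRAMEBUFFER, rt.fbo);
glViewport(0, 0, rt_width, rt_height);         /* viewport must match the FBO   */
glClear(GL_COLOR_BUFFER_BIT);
draw_scene();

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, screen_width, screen_height); /* restore for the default target */
glBindTexture(GL_TEXTURE_2D, rt.tex);
draw_fullscreen_quad();
```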


Without the rendertarget stuff, it works fine.  But when I do use it, nothing renders afterwards, all I see is the colour of the screen I set it to be cleared to.  I've spent hours tweaking and googling, but no success so far.  Would it have to do with the size of my render target?  This sucks.  Any ideas?  Thanks.

Shogun.

##### Share on other sites

1.) In create_render_target you are creating a renderbuffer as well as a texture, and attaching both to the same attachment point, COLOR_ATTACHMENT0. That makes no sense: a given attachment point holds either a renderbuffer or a texture, and the second attachment simply replaces the first. Since you presumably want to use the result later for texture mapping, it should be the texture.

2.) Does rendering require depth buffer? Your FBO lacks a depth buffer.

EDIT: Not really related to the problem, but glBindFramebuffer( GL_FRAMEBUFFER, 0 ) should also be invoked when returning early due to an incomplete framebuffer.
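To make point 1 concrete, a texture-only version of the creation code might look like the sketch below. This is based on the code posted above, not a drop-in replacement; GL_RGBA as the internal format matches ES conventions (on desktop GL you would typically write GL_RGBA8), and error handling is trimmed for brevity:

```c
/* Sketch: colour attachment is a texture only; no renderbuffer is bound
 * to the same attachment point, since the result must be sampled later. */
bool create_render_target(int width, int height, rendertarget_t* rt)
{
    glGenFramebuffers(1, &rt->fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, rt->fbo);

    glGenTextures(1, &rt->tex);
    glBindTexture(GL_TEXTURE_2D, rt->tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, rt->tex, 0);

    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);  /* unbind on every path */

    if (status != GL_FRAMEBUFFER_COMPLETE) {
        printf("create_render_target(): incomplete framebuffer 0x%X\n", status);
        return false;
    }
    return true;
}
```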

Edited by haegarr

##### Share on other sites

1. So, don't use the render buffer?  I commented that out, and it still doesn't work.

2. Same thing with or without it.  My game is 2D and uses the painter's algorithm, so it's not needed.

3. Yeah, I forgot about that.

Shogun.

Edited by blueshogun96

##### Share on other sites

A guess: are width and height both powers of two (POT)? You use GL_TEXTURE_2D as the attachment. Maybe you need to use GL_TEXTURE_RECTANGLE.

I see that you speak of "doubling 320 x 480" ... that may hint that they are not POTs.

Edited by haegarr

##### Share on other sites

Be aware that this solution will probably look shit.

Sorry, but it's a fact of life that scaling algorithms produce artifacts that are not pleasing to the eye.

I have used this technique on several apps, but always when I knew the screen resolutions in advance and could adjust the graphics to be sized as factors of the supported screen sizes. So a 160 by 200 graphic will look fine at 320 by 400 and 640 by 800 and 320 by 200 but crap at 640 by 480.

I tend to use a layout system for the display now; it just looks better, even though it's more code.

##### Share on other sites

^ If it comes to that, I'll deal with it.  And if it's too unbearable, then I'll drop support for iPhone altogether.

> A guess: are width and height both POTs? You use GL_TEXTURE_2D as attachment. Maybe you need to use GL_TEXTURE_RECTANGLE.
>
> I see that you speak of "doubling 320 x 480" ... that may hint that they are not POTs.

It doesn't matter.  I tried POT and NPOT alike.  Haven't tried the texture rectangle; it doesn't sound like a bad idea.

This is the article I was basing my code off of: https://developer.apple.com/library/ios/documentation/3ddrawing/conceptual/opengles_programmingguide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html

It doesn't appear to mention anything about texture dimension requirements.  My game uses NPOT textures anyway.

Shogun.

##### Share on other sites

Also be aware that doubling the resolution will increase your fill-rate cost, which will in turn affect your performance. For your use case, rendering above the native resolution probably won't gain anything, since the result still has to be downsized to fit the screen anyway.

##### Share on other sites

If not, then I guess I'll just drop support for iPhones and only support iPads as far as iOS devices are concerned.  I'm aware that iOS devices tend to be fill-rate limited (especially older iPads), and my game has had issues with that in the past.  So yeah.

Either way, I still need to find a solution to this FBO problem, because for Android devices like the Galaxy phones and Nexus tablets, the resolution is too high, which detracts from the game experience.  For Android devices, I'm aiming to limit the resolution to 720p.

Shogun.

##### Share on other sites

> Without the rendertarget stuff, it works fine.  But when I do use it, nothing renders afterwards, all I see is the colour of the screen I set it to be cleared to.  I've spent hours tweaking and googling, but no success so far.  Would it have to do with the size of my render target?  This sucks.  Any ideas?  Thanks.

According to your code, you bound a temporary renderbuffer here:

```c
glBindRenderbuffer(GL_RENDERBUFFER, rt->rbo);
```

But you never bound back the renderbuffer that is used to present to the screen (the one you created with `renderbufferStorage:fromDrawable:`; that is the only renderbuffer capable of being presented to the UIView/screen).

That aside, why aren’t you just using UIView.contentScaleFactor?

Set it to a low number if you want the device to automatically scale contents upward.  Setting it to 2.0 means using double resolution (retina mode).  It can be set to any value from epsilon to 2.0f.

This is the common way around fill-rate issues, and you should never have any fill-rate issues on any iOS device if used properly.

L. Spiro

Edited by L. Spiro

##### Share on other sites

I didn't know about contentScaleFactor, so I'll attempt to use that instead.

So far, it doesn't appear to be working.  It's as if no matter what value I set it to, it just stays the same.  The results returned from [[UIScreen mainScreen] bounds] are the same as usual (320x480).  Am I doing something wrong?

Shogun.

EDIT: Had to do a bit of research on this.  I had to modify the layer's `contentsScale` (CAEAGLLayer inherits it from CALayer) as well, then manually scale the results of [[UIScreen mainScreen] bounds] by 2, as well as the touch results.  Works fine in the simulator so far.  Thanks.

Edited by blueshogun96

##### Share on other sites

[[UIScreen mainScreen] bounds] should not change; it is reported in points.  [[UIScreen mainScreen] scale] tells you the maximum contentScaleFactor, which will be 2.0f on retina devices and 1.0f otherwise. The UIView into which you draw (the UIView you use to call `renderbufferStorage:fromDrawable:`) is what changes, and it will also affect the sizes you pass to `renderbufferStorage:fromDrawable:`, so be aware.

L. Spiro

##### Share on other sites

Oops, didn't catch your last response (should have reloaded).  But it's working so far, in the simulator at least.

Shogun.

##### Share on other sites

BTW, I think these calls:

```c
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```

should use glTexParameteri, not glTexParameterf, as far as I know (the values being passed are enums, not floats).

Then this call:

```c
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
```

should be:

```c
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
```

The "internal format" parameter specifies how the texture is stored (8 bits per channel in this case).

You might also need to call glDrawBuffers with the colour attachment you're writing to (i.e., GL_COLOR_ATTACHMENT0), then with GL_BACK_LEFT when you're writing to the default framebuffer (I made the mistake before of calling it with GL_FRONT_LEFT and writing to the front buffer, messing up double buffering along the way).

Also, you attach renderbuffers (RBOs) when you'll never need to sample from the attachment; they're sort of like the non-readable cousin of immutable textures, and are normally used for the depth-stencil attachment. If you'll need to sample from the FBO, attach regular textures, then just bind the texture to sample from it.

Edited by TheChubu
