dujays

OpenGL FBO + MRT + Float textures


Recommended Posts

In OpenGL, can you render to multiple float textures from a fragment shader? I have successfully rendered to multiple 8-bit textures, but once I change my internal format from GL_RGBA8 to GL_FLOAT_RGBA32_NV (using a target of GL_TEXTURE_RECTANGLE_NV for both), weird things start to happen: nothing appears, and the program becomes unresponsive until it is closed. I am checking for GL errors (I see none) and verifying that my FBO is complete. On Linux, latest NVIDIA drivers, tested on a 7800 GTX. Thanks
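For reference, a minimal sketch of the kind of setup being described here (texture size, variable names, and the assertion style are illustrative assumptions, not the poster's actual code; assumes a current GL context with EXT_framebuffer_object and NV_float_buffer support):

```c
/* Sketch: FBO with two float color attachments for MRT. */
GLuint fbo, tex[2];
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

glGenTextures(2, tex);
for (int i = 0; i < 2; ++i) {
    glBindTexture(GL_TEXTURE_RECTANGLE_NV, tex[i]);
    glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_FLOAT_RGBA32_NV,
                 512, 512, 0, GL_RGBA, GL_FLOAT, NULL);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,
                              GL_COLOR_ATTACHMENT0_EXT + i,
                              GL_TEXTURE_RECTANGLE_NV, tex[i], 0);
}

GLenum bufs[2] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT };
glDrawBuffers(2, bufs);

/* Completeness check the poster mentions doing. */
assert(glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
       == GL_FRAMEBUFFER_COMPLETE_EXT);
```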

Can you render to a single floating point texture (fpt)?

If you are using a float texture, then besides changing the internal format you should also make sure your type is GL_FLOAT. You might also want to try GL_RGBA32F_ARB instead of GL_FLOAT_RGBA32_NV.
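In code, the suggested change would look something like this (the rectangle target and 512x512 size are arbitrary examples; requires ARB_texture_float):

```c
/* ARB-style float allocation: internal format GL_RGBA32F_ARB,
 * external type GL_FLOAT. */
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA32F_ARB,
             512, 512, 0, GL_RGBA, GL_FLOAT, NULL);
```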

GL_RGBA32F_ARB (or other float formats) does not make a difference. I can write to one float texture successfully. I can write to two float textures as long as I write the same value to both (using gl_FragColor). But if I write to both gl_FragData[0] and gl_FragData[1], it fails, and it only fails when writing to a float texture.
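The failing case described above corresponds to a fragment shader along these lines (the actual values written are illustrative assumptions):

```glsl
/* Writes a different value to each MRT attachment --
 * the case reported to fail with float textures. */
void main()
{
    vec4 c = vec4(gl_TexCoord[0].xy, 0.0, 1.0);
    gl_FragData[0] = c;
    gl_FragData[1] = vec4(1.0) - c;  /* differs from gl_FragData[0] */
}
```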

Oh well...

IIRC, when using MRT, the formats/types of the render targets must add up to the same number of bytes as the framebuffer. For example, if your program is using a 32-bit framebuffer (say 8 bits per channel, RGBA), your render targets could be (1) R8G8B8 and (2) Luminance8. That's something I read in an NVIDIA paper, I think.

Can someone confirm this?

NVIDIA's paper on deferred shading (http://download.nvidia.com/developer/presentations/2004/6800_Leagues/6800_Leagues_Deferred_Shading.pdf#search=%22nvidia%20deferred%20shader%22) states that all render targets attached to an FBO must have the same number of bits. I don't see why it would need to match the framebuffer: rendering to one float texture (128 bits) works correctly, and the framebuffer is only 24-bit.

I believe I had the same issue as you. Find the post here -> http://www.gamedev.net/community/forums/topic.asp?topic_id=412830

I filed a bug with NVIDIA, but they required a test case and my company refused to give out the source code; we moved to being Windows-only...

It's a shame, really...

I figured out my problem. I don't know if this is documented anywhere (and, if so, a GL error would be better than the current behavior), but calling glClear(GL_COLOR_BUFFER_BIT) after binding the FBO and setting the draw buffers was causing my problem. Of course, this is not a problem if one is not doing MRT with float textures.
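Based on that description, the problematic call order looks like the sketch below. The post only says this sequence triggered the failure; whether skipping the clear or clearing before setting the draw buffers is the right workaround is an assumption on the reader's part, not something the poster states:

```c
/* Sequence reported to hang with float MRT targets: */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
GLenum bufs[2] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT };
glDrawBuffers(2, bufs);
glClear(GL_COLOR_BUFFER_BIT);  /* <- this clear, at this point in the */
                               /*    sequence, caused the failure     */
/* ... draw ... */
```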
