FBO + GL_RGBA_FLOAT32_ATI?
Some effects I need to implement seem to require the use of a texture in GL_RGBA_FLOAT32_ATI format. Trouble is, I want to be able to modify this texture using FBOs. When I try to do so, my FPS drops from 100+ to maybe 0.05.
Are FBOs incompatible with GL_RGBA_FLOAT32_ATI, or is there something particular that needs to be done so that this format doesn't hurt performance?
GL_RGBA_FLOAT32_ATI works perfectly fine with FBOs; I use it myself. Make sure you create your floating-point textures correctly, using format GL_RGBA, type GL_FLOAT, and GL_NEAREST filtering.
You should get an error status from the FBO if something is wrong; make sure you check for this using glCheckFramebufferStatusEXT.
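For reference, a minimal sketch of that setup might look like the following (fpWidth and fpHeight are just placeholder dimensions):

GLuint fbo = 0, fpTexture = 0;

// Create a 32-bit float RGBA texture with GL_NEAREST filtering.
glGenTextures(1, &fpTexture);
glBindTexture(GL_TEXTURE_2D, fpTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI, fpWidth, fpHeight, 0, GL_RGBA, GL_FLOAT, NULL);

// Attach the texture to an FBO and verify the attachment is complete.
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, fpTexture, 0);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
    printf("FBO incomplete!\n");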
I'm using a GF 6800 with the latest drivers. I don't think I was using the GL_RGBA format - IIRC I was using GL_RGB. Maybe that's what's wrong. I'll make sure to check up on it tonight. I'll reply here again if it doesn't work.
Thanks.
Make sure you only use the GL_NEAREST filter on 32-bit floating-point textures; otherwise it will run in software.
Yep, Expandable is right.
If you really need GL_LINEAR, you should check whether FLOAT16 is sufficient for you.
With FLOAT16, GL_LINEAR and mipmapping are supported on NVIDIA cards (GF6 upwards). Currently ATI doesn't support it, but it will with the next generation.
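As a sketch, a 16-bit float texture that can use GL_LINEAR on GF6-class hardware might be created like this (GL_RGBA_FLOAT16_ATI comes from the same ATI_texture_float extension; width and height are placeholders):

GLuint fp16Texture = 0;

// 16-bit float RGBA texture; GL_LINEAR filtering works on GeForce 6 and up.
glGenTextures(1, &fp16Texture);
glBindTexture(GL_TEXTURE_2D, fp16Texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT16_ATI, width, height, 0, GL_RGBA, GL_FLOAT, NULL);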
This is the code I use, which produces horrible performance:
displacementbuffer = 0;
glGenFramebuffersEXT(1, &displacementbuffer);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, displacementbuffer);

glGenTextures(1, &displacementmap);
glBindTexture(GL_TEXTURE_2D, displacementmap);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

//glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, width, height, 0, GL_LUMINANCE, GL_FLOAT, 0);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI, width, height, 0, GL_RGBA, GL_FLOAT, 0);
//glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGB, GL_FLOAT, 0);

glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, displacementmap, 0);
If I use the code as it is above, the buffer is written to when I enable it and I can see the results if I display the texture on a mesh, but performance is abysmal. If I instead use the commented line with GL_RGBA8, performance is great, but I can't use that because apparently, textures in that format can't be used as textures in the vertex shader (which is what I need the texture for). If I use the commented line with GL_LUMINANCE, nothing is written to the buffer (at least nothing gets displayed), although apparently that texture format can be passed to the vertex shader so if I could get that to work it would be just as good as getting GL_RGBA_FLOAT32_ATI to work.
glCheckFramebufferStatusEXT() claims the framebuffer is ok.
Note: I do use another framebuffer in the program with GL_RGBA8 format, using GL_COLOR_ATTACHMENT0_EXT. Could that be an issue? I tried using another color attachment, but if I do, nothing gets rendered to the buffer.
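For context, when I then use the displacement map in the vertex shader, the binding code is roughly the following sketch (the program handle and the "displacementMap" sampler name are hypothetical; this uses the ARB_shader_objects entry points):

// Bind the displacement texture to unit 0 and point the vertex shader's
// sampler uniform at that unit.
glActiveTextureARB(GL_TEXTURE0_ARB);
glBindTexture(GL_TEXTURE_2D, displacementmap);
GLint loc = glGetUniformLocationARB(program, "displacementMap");
glUniform1iARB(loc, 0);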
Your texture code looks OK to me. Also, using many FBOs, each with GL_COLOR_ATTACHMENT0_EXT, should not cause any problem.
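A minimal sketch of what I mean, assuming your other FBO handle is called rgba8Buffer (hypothetical name):

// Each FBO keeps its own attachments, so reusing GL_COLOR_ATTACHMENT0_EXT
// across FBOs is fine -- just rebind the FBO you want to render into.
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, rgba8Buffer);        // render into the GL_RGBA8 target
// ... draw ...
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, displacementbuffer); // render into the float target
// ... draw ...
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);                  // back to the window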
You could try this program, which creates a bunch of floating-point texture formats, attaches them to FBOs, and saves the results to a file, so you'll know whether there's something wrong with your card/driver.
http://www.mathematik.uni-dortmund.de/~goeddeke/gpgpu/basic_math_tutorial_cg.zip