madmax46

OpenGL Render to texture not working with nvidia 7600 GT


I have an application that renders to texture using the framebuffer object extension (EXT_framebuffer_object). It binds a texture to the color attachment using GL_LUMINANCE with byte data, and it uses a single renderbuffer for both the depth and stencil attachments. I've tested the code on three different cards: it works great on an NVIDIA GeForce 9400 and a Quadro FX 580, but fails on the 7600 GT. On the 7600 I get back GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT from glCheckFramebufferStatusEXT(GL.GL_FRAMEBUFFER_EXT), and GL_FRAMEBUFFER_COMPLETE_EXT on the other two.
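For reference, here's roughly what the setup looks like. This is a minimal sketch, assuming JOGL 1.x bindings (matching the GL.GL_* constants above) and EXT_framebuffer_object plus EXT_packed_depth_stencil; the class, method, and variable names are mine, not the actual code:

```java
import javax.media.opengl.GL;

class FboSetupSketch {
    // Builds the FBO described above: a GL_LUMINANCE color texture plus a
    // single packed depth/stencil renderbuffer. Returns the result of
    // glCheckFramebufferStatusEXT so the caller can check completeness.
    static int createLuminanceFbo(GL gl, int width, int height) {
        int[] ids = new int[3];
        gl.glGenTextures(1, ids, 0);
        gl.glGenFramebuffersEXT(1, ids, 1);
        gl.glGenRenderbuffersEXT(1, ids, 2);
        int texId = ids[0], fboId = ids[1], rbId = ids[2];

        // Color attachment: GL_LUMINANCE with byte data (null = uninitialized).
        gl.glBindTexture(GL.GL_TEXTURE_2D, texId);
        gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
        gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
        gl.glTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_LUMINANCE, width, height,
                        0, GL.GL_LUMINANCE, GL.GL_UNSIGNED_BYTE, null);

        // One renderbuffer serving as both the depth and stencil attachment.
        gl.glBindRenderbufferEXT(GL.GL_RENDERBUFFER_EXT, rbId);
        gl.glRenderbufferStorageEXT(GL.GL_RENDERBUFFER_EXT,
                                    GL.GL_DEPTH24_STENCIL8_EXT, width, height);

        gl.glBindFramebufferEXT(GL.GL_FRAMEBUFFER_EXT, fboId);
        gl.glFramebufferTexture2DEXT(GL.GL_FRAMEBUFFER_EXT,
                GL.GL_COLOR_ATTACHMENT0_EXT, GL.GL_TEXTURE_2D, texId, 0);
        gl.glFramebufferRenderbufferEXT(GL.GL_FRAMEBUFFER_EXT,
                GL.GL_DEPTH_ATTACHMENT_EXT, GL.GL_RENDERBUFFER_EXT, rbId);
        gl.glFramebufferRenderbufferEXT(GL.GL_FRAMEBUFFER_EXT,
                GL.GL_STENCIL_ATTACHMENT_EXT, GL.GL_RENDERBUFFER_EXT, rbId);

        int status = gl.glCheckFramebufferStatusEXT(GL.GL_FRAMEBUFFER_EXT);
        gl.glBindFramebufferEXT(GL.GL_FRAMEBUFFER_EXT, 0);
        return status; // GL_FRAMEBUFFER_COMPLETE_EXT everywhere but the 7600 GT
    }
}
```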

Edit: Since I'm getting GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT rather than GL_FRAMEBUFFER_UNSUPPORTED_EXT, I suspect I'm doing something wrong on my end rather than hitting an unsupported format combination.

Edit2: I thought it might be that the texture dimensions were not powers of two, but I tried power-of-two sizes and still get the INCOMPLETE status, and nothing shows up.

Edit3: I was looking around and saw this; it talks about a bug in nvidia's driver for RTT. I thought that could be the case, but the other nvidia cards work with the same driver, so I'm not sure. I then looked at this, and as far as I can tell everything is attachment-complete; otherwise it wouldn't work on the other cards. I'm thinking it might be the mipmap issue, since I don't use mipmaps, or the GL_LINEAR issue, because I am using that. Does anyone know what an "integral texture" is? I'm not sure whether I am using them or not.
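In case it matters, here's how the filters are set on the attachment texture. Since I don't generate mipmaps, the min filter can't be one of the mipmap modes: GL's default is GL_NEAREST_MIPMAP_LINEAR, which leaves a texture without mipmap levels incomplete, and some drivers reject incomplete textures as FBO attachments. A small sketch; the wrapper class and method name are just for illustration:

```java
import javax.media.opengl.GL;

class FilterCheck {
    // Force non-mipmap filtering: with the default min filter
    // (GL_NEAREST_MIPMAP_LINEAR), a texture with no mipmap levels is
    // incomplete, which can surface as an incomplete FBO attachment.
    static void useLinearNoMipmaps(GL gl, int texId) {
        gl.glBindTexture(GL.GL_TEXTURE_2D, texId);
        gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
        gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
    }
}
```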

Thanks for any info,

Max

[Edited by - madmax46 on July 7, 2010 2:24:35 PM]

Well, got it working with some outside help (thanks Pat!). Basically the card doesn't support render to texture with GL_LUMINANCE, so I modified my code to try initializing with GL_LUMINANCE first and, if I get the error status, dispose and reinitialize with GL_RGB8. The only other change is creating my byte array three times as large, and everything else works the same!
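Roughly what the fallback logic looks like. A sketch only: createFbo/disposeFbo are hypothetical stand-ins for the real setup/teardown code (createFbo is assumed to return the glCheckFramebufferStatusEXT result, like the setup sketch in the first post):

```java
import javax.media.opengl.GL;

abstract class RenderTargetAllocator {
    // Hypothetical stand-ins for the real FBO setup/teardown code.
    abstract int createFbo(GL gl, int internalFormat, int width, int height);
    abstract void disposeFbo(GL gl);

    // Try GL_LUMINANCE first; on an incomplete FBO (e.g. the 7600 GT),
    // dispose everything and rebuild with GL_RGB8.
    byte[] allocate(GL gl, int width, int height) {
        int format = GL.GL_LUMINANCE;
        if (createFbo(gl, format, width, height) != GL.GL_FRAMEBUFFER_COMPLETE_EXT) {
            disposeFbo(gl);
            format = GL.GL_RGB8;
            createFbo(gl, format, width, height);
        }
        // GL_RGB8 needs 3 bytes per pixel where GL_LUMINANCE needed 1.
        int bytesPerPixel = (format == GL.GL_RGB8) ? 3 : 1;
        return new byte[width * height * bytesPerPixel];
    }
}
```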
