tmason

OpenGL Alternative to glFramebufferTexture for OpenGL version 3.1


Recommended Posts

Hello,

I am trying to create an OpenGL 3.1 application for the Oculus Rift, and one thing it seems to require is a framebuffer texture; however, the function glFramebufferTexture is only available in versions 3.2 and higher.

Is there a function that mimics this via a hack or something for OpenGL 3.1?

Thank you for your time.


Why the constraint? Any GL 3.1 hardware can run GL 3.2; it's just a driver issue...

 

Indeed. From Wikipedia:

OpenGL 3.3

Release Date: March 11, 2010

OpenGL 3.3 was released alongside version 4.0. It was designed to target hardware capable of supporting Direct3D 10.

 

You should probably be able to target OpenGL 4.x by now; it's already four years old. Personally, though, I just run with 3.3 because I've never had anyone tell me "it didn't start," unless we're talking about a school-provided laptop.

EDIT: Scratch that, the school-provided laptop my brother had did support 3.x; it just didn't support more than 256 layers for GL_TEXTURE_2D_ARRAY. No problem at all.

 

What exactly is the computer/GPU you have? Did you update to the latest drivers?

Edited by Kaptein


Why the constraint? Any GL 3.1 hardware can run GL 3.2; it's just a driver issue...

Or even OpenGL 3.3, which is a lot more "sane" in several respects (mostly the shading language) compared to 3.1.


why not use glFramebufferTexture1D/2D/3D instead?

 

Not sure if the Oculus SDK supports it; I will try and see what happens.


Why the constraint? Any GL 3.1 hardware can run GL 3.2; it's just a driver issue...

 

Laptops and basic desktops :)

 

Many of the ones out there that are 3-4+ years old don't have 3.2-3.3 support.

 

And the app I am developing is set to target a decent range of systems that are 3-5 years old (6 years old, maximum)...


 

Why the constraint? Any GL 3.1 hardware can run GL 3.2; it's just a driver issue...

 

Indeed. From Wikipedia:

OpenGL 3.3

Release Date: March 11, 2010

OpenGL 3.3 was released alongside version 4.0. It was designed to target hardware capable of supporting Direct3D 10.

 

You should probably be able to target OpenGL 4.x by now; it's already four years old. Personally, though, I just run with 3.3 because I've never had anyone tell me "it didn't start," unless we're talking about a school-provided laptop.

EDIT: Scratch that, the school-provided laptop my brother had did support 3.x; it just didn't support more than 256 layers for GL_TEXTURE_2D_ARRAY. No problem at all.

 

What exactly is the computer/GPU you have? Did you update to the latest drivers?

 

 

The laptop I have now, a Dell Latitude XT3, only supports OpenGL 3.1.

 

I need a setup that can give people a wide range of support. OpenGL 3.3+ is still too new (only 3-5 years old) in the general market.

Edited by tmason


why not use glFramebufferTexture1D/2D/3D instead?

 

I'd upvote you more if I could; this works!


 

The laptop I have now, a Dell Latitude XT3, only supports OpenGL 3.1.

 

I need a setup that can give people a wide range of support. OpenGL 3.3+ is still too new (only 3-5 years old) in the general market.

 

 

It seems you are right.

You can still get all these things through extensions. Unfortunately for you that means glFramebuffer*EXT, which is like drinking poison. It will work though.

Not to mention, your drivers and your HD 3000 do support DX 10.1, so you should be able to use any 3.x feature. I guess they just never added them.

I heard the new Intel drivers are supposed to have 4.x support.
