mrr

OpenGL: New and old OpenGL, bad combos?


Hi

OK, this has probably come up before, but I can't find any good threads.

I want to use the new features of OpenGL 3 to 4.1, though many things are deprecated in the specifications.
I know NVIDIA says all the old functionality remains and that you should use compatibility mode.

OK, so I've decided to do just that. :)

But questions remain:

What does ATI say?
Are there combinations of old OpenGL and new that should not be mixed?
What is a good way of writing new OpenGL, and what is bad?

Compatibility does not mean mixing new and old; it means one or the other. You might be able to get some older functions working on a newer context, but I don't know in which cases that will be.


[quote]
Compatibility does not mean mixing new and old; it means one or the other. You might be able to get some older functions working on a newer context, but I don't know in which cases that will be.
[/quote]


I'm not sure what that is supposed to mean, but when you set the compatibility bit, you can use all the deprecated functions along with every feature of GL 3.0 or whatever version you are using.
The reason the compatibility bit was added was that certain companies were complaining, mostly CAD companies. At least that's what the ARB declared.
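
As a rough sketch of what "setting the compatibility bit" looks like in practice (my illustration, not from the poster; it assumes Windows, the WGL_ARB_create_context_profile extension, and a dummy context already current so wglGetProcAddress can resolve the entry point; the function name and GL version are arbitrary):

[code]
// Sketch only: request an OpenGL 3.3 compatibility-profile context.
// Error handling is mostly omitted.
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

HGLRC createCompatibilityContext(HDC hdc)
{
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return NULL; // extension not supported

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        // The "compatibility bit": all deprecated functionality stays
        // available alongside the new features.
        WGL_CONTEXT_PROFILE_MASK_ARB,
        WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
        0 // attribute list terminator
    };
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}
[/code]

Note that if WGL_CONTEXT_PROFILE_MASK_ARB is left out when requesting a 3.2+ context, the extension spec defaults to the core profile, so the compatibility bit has to be requested explicitly.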

[quote]
I know NVIDIA says all the old functionality remains and that you should use compatibility mode.
[/quote]

nVidia didn't say you should use compatibility mode.
They said they will offer it with ALL future GL versions, because that's what certain companies want.

ATI/AMD, along with other members, disagreed with nVidia; they wanted to enforce deprecation.


[quote name='dpadam450']
Compatibility does not mean mixing new and old; it means one or the other. You might be able to get some older functions working on a newer context, but I don't know in which cases that will be.
[/quote]

[quote]
I'm not sure what that is supposed to mean, but when you set the compatibility bit, you can use all the deprecated functions along with every feature of GL 3.0 or whatever version you are using.
The reason the compatibility bit was added was that certain companies were complaining, mostly CAD companies. At least that's what the ARB declared.
[/quote]

So old and new can be mixed without problems?


Yes, you can use deprecated stuff. There are exceptions, and they are explained in the OpenGL specification. For example, using GL_TEXTURE_2D_ARRAY requires that you use shaders, because it wouldn't make any sense to use one from the fixed-function pipeline.
There are texture formats that make more sense when used with a shader: floating-point formats like GL_RGB32F and GL_R32F, signed normalized formats like GL_RGBA16_SNORM, and integer formats like GL_RGB8I, GL_RGB32UI, GL_RG8I, and GL_RG8UI.
Likewise, there are vertex attribute formats, such as integer formats, that make more sense to use with a shader.
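
As a rough illustration of those exceptions (my sketch, not the poster's; it assumes a 3.x context with a loader like GLEW already initialized, and the texture sizes and attribute index are arbitrary):

[code]
#include <GL/glew.h>
#include <stddef.h>

void shaderOnlyFeatures(void)
{
    // A 2D array texture (64x64, 8 layers). Fixed function cannot
    // sample this; the shader needs a sampler2DArray.
    GLuint arrayTex;
    glGenTextures(1, &arrayTex);
    glBindTexture(GL_TEXTURE_2D_ARRAY, arrayTex);
    glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8, 64, 64, 8, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // An unnormalized unsigned-integer texture. Note the GL_RGB_INTEGER
    // external format; in GLSL it is sampled through a usampler2D.
    GLuint intTex;
    glGenTextures(1, &intTex);
    glBindTexture(GL_TEXTURE_2D, intTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, 64, 64, 0,
                 GL_RGB_INTEGER, GL_UNSIGNED_INT, NULL);

    // An integer vertex attribute. glVertexAttribIPointer (note the I)
    // passes values through as integers instead of converting to float,
    // so the vertex shader declares the attribute as ivec4.
    glVertexAttribIPointer(3, 4, GL_INT, 0, (const void *)0);
}
[/code]

On the GLSL side these correspond to a sampler2DArray, a usampler2D, and an ivec4 attribute, respectively.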


[quote]
Yes, you can use deprecated stuff. There are exceptions, and they are explained in the OpenGL specification. For example, using GL_TEXTURE_2D_ARRAY requires that you use shaders, because it wouldn't make any sense to use one from the fixed-function pipeline.
There are texture formats that make more sense when used with a shader: floating-point formats like GL_RGB32F and GL_R32F, signed normalized formats like GL_RGBA16_SNORM, and integer formats like GL_RGB8I, GL_RGB32UI, GL_RG8I, and GL_RG8UI.
Likewise, there are vertex attribute formats, such as integer formats, that make more sense to use with a shader.
[/quote]


OK, I think I need to start reading a bit.
Thanks!
