japro

OpenGL OpenGL 4.3 - compute shaders and much more


OpenGL compute shaders... or 'oops, we got it wrong and MS got it right... quick, back-track!'

I skimmed a few other things; basically it brings the features up to D3D11 level while continuing the OpenGL tradition of 'here are 101 ways to do things... good luck with that!'

In fact, could someone give me an update on the state of Direct State Access in OGL? 4.3 doesn't seem to have it as a core feature, and last I checked it covered some things but not all things...


For us lazy programmers, please give a short summary of the benefits and disadvantages as you see them!

I think people who are more involved with all this can give more competent insights. g-truc has a nice review: http://www.g-truc.net/post-0494.html#menu

I literally just saw this hours ago and still have to go through all of it. I am mostly excited about compute shaders. The other additions I looked into seemed to be some obvious fixes ("layout(location = ...)" for uniforms, imageSize, etc.) as well as improvements to memory-related aspects that play nicely with compute shaders (ARB_shader_storage_buffer_object).
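For illustration, here is a minimal sketch of what those additions look like together in GLSL 4.30. The explicit uniform location and the shader storage block are the new parts; the shader itself is a made-up example, not from the spec:

```glsl
#version 430
layout(local_size_x = 64) in;

// Explicit uniform location, new in 4.3; previously you had to
// query it at runtime with glGetUniformLocation.
layout(location = 0) uniform float scale;

// Shader storage buffer (ARB_shader_storage_buffer_object):
// writable from the shader and allows an unsized array,
// unlike uniform blocks.
layout(std430, binding = 0) buffer Data {
    float values[];
};

void main() {
    uint i = gl_GlobalInvocationID.x;
    values[i] *= scale;
}
```

Dispatch would then be a plain glDispatchCompute call on the application side, with no OpenCL context or interop layer involved.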

Not much to be excited about, it seems; I think most of it has been available as extensions for ages, as usual. The only things I find interesting are the texture parameter queries and shader storage buffers, although I guess it's just the old EXT_texture_buffer in a new package. ES3 compatibility might be nice if we get some ES3-capable devices to play with.

It would be nice if moderators didn't try to start an API flame war, though; I think we get enough of those.

Edit: To phantom, I never said you weren't correct. I've replied to you in a PM instead, to avoid derailing this thread. Edited by SimonForsman


It would be nice if moderators didn't try to start an API flame war, though; I think we get enough of those.


Really?

Tell me, what was wrong with my statements?

Compute Shaders - an admission that OpenCL/GL interop has failed to work.
Features - brings it up to the D3D11 standard but, with at least one extension, introduces yet another way to do things.
DSA - a genuine question about the state of it...

Note: nowhere did I say 'D3D11 is better!' - all I did was call them out on areas where they are still lacking, which is the API interface in general and MAYBE the state of DSA, which I asked about...

So, yeah, if not fawning over a new release of an incremental update to an outdated API is 'starting a flame war', then fine, I started a flame war...

phantom is actually correct, although the wording chosen can certainly come across as "let's start a flame war" - but look beyond that at what the update actually does have to offer.

All the same, I'm feeling moderately stoked about this update, despite a high percentage of it being "things that should have been done ages ago". There are some genuine API usability improvements in there, and the push for standardised, patent-free texture compression formats seems well-intentioned at least. Give us the ability to use FBOs with the default depth/stencil buffer in the next one (if that didn't sneak in somewhere in this one), some DSA, and some solid drivers, and things will be really looking good.

Overall though I suspect that ES3 is going to turn out to be the more significant recent release.

CL/GL interop didn't fail to work; it actually worked quite well. I have to admit it is way more complicated than DX11 compute shaders, BUT it DID work. In fact, I could port DX11 compute shaders to OpenCL and make them work together with OpenGL; see:
http://www.gamedev.n...via-opencl-r233
I'm looking forward to trying out OGL compute shaders though, as they seem more reasonable to use for processing textures / lighting.
The debugging feature is quite an improvement, as such functionality was missing. Edited by Yours3!f
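For reference, hooking up that debugging feature (KHR_debug, core in 4.3) looks roughly like this in C. This is a hedged sketch: it assumes a 4.3 context and a function loader are already set up, and the header name depends on your loader:

```c
#include <stdio.h>
#include <GL/glcorearb.h>  /* or whatever header your GL loader provides */

/* Called by the driver for every debug message once registered. */
static void APIENTRY on_gl_debug(GLenum source, GLenum type, GLuint id,
                                 GLenum severity, GLsizei length,
                                 const GLchar *message, const void *user)
{
    (void)source; (void)type; (void)id;
    (void)severity; (void)length; (void)user;
    fprintf(stderr, "GL debug: %s\n", message);
}

/* Call once after creating the context. */
void enable_gl_debug_output(void)
{
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); /* fire at the offending call */
    glDebugMessageCallback(on_gl_debug, NULL);
}
```

Compared to the old glGetError polling, the driver pushes human-readable messages to you, which is a big step up for day-to-day debugging.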

Compute Shaders - an admission that OpenCL/GL interop has failed to work.
Features - brings it up to the D3D11 standard but, with at least one extension, introduces yet another way to do things.
DSA - a genuine question about the state of it...
Note: nowhere did I say 'D3D11 is better!'


phantom is actually correct, although the wording chosen can certainly come across as "let's start a flame war" - but look beyond that at what the update actually does have to offer.
None of that is surprising, though. And indeed D3D11 is "better", except for the little detail that it's proprietary and Windows-only (which, as it happens, is the one important detail for me personally).

OpenGL is necessarily worse because it is designed by committee (ARB, Khronos, call it what you like). In addition to design-by-committee always being somewhat troublesome, this particular committee has contained, and still contains, members with strongly antipodal interests.

I won't claim that Microsoft has an interest in keeping OpenGL from becoming as good as or better than their own product, because Microsoft is no longer involved (...at least officially). However, Intel is a good example of an entity that still is officially involved.
Intel, which already struggles to support OpenGL 3.x on its Sandy/Ivy Bridge CPUs, has a strong motivation not to add too many features too quickly. Promoting CPUs with integrated graphics is much harder if people have the impression that they don't support most modern features. Thus, advertising OpenGL and pushing its development forward would lessen revenue.

Companies like AMD and nVidia, on the other hand, do have a (rather obvious) strong interest in pushing new features onto the market, because this allows them to sell new cards. But then again, supporting both D3D and OpenGL means twice as much driver development cost as actually necessary. If 90-95% of the software in their target market already uses D3D anyway, that's a bad deal. So again, even though there is some motivation, it is not necessarily overwhelming where OpenGL as such is concerned. If people buy the new nVidia 780 GTX Ultra because it supports D3D 12.1, which is needed to play Warcraft Ultimate, then that's just as good for them.
