OpenGL 4.3 - compute shaders and much more

OpenGL 4.3 is here:
http://www.khronos.o...or-enhancements

specs:
http://www.opengl.org/registry/

Beta drivers from Nvidia:
http://www.nvidia.co...driver-4.3.html

And I obviously had to quickly try out compute shaders:
https://github.com/p...pute_shader.cpp
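The core of it boils down to surprisingly little. Here's a rough sketch rather than the actual file (the 512x512 texture 'tex' and the 16x16 work group size are just placeholders, and all error checking is omitted):

// Compute shader source: writes a UV gradient into an image.
const char* cs_src =
    "#version 430\n"
    "layout(local_size_x = 16, local_size_y = 16) in;\n"
    "layout(rgba32f, binding = 0) uniform image2D dst;\n"
    "void main() {\n"
    "    ivec2 p = ivec2(gl_GlobalInvocationID.xy);\n"
    "    imageStore(dst, p, vec4(vec2(p) / vec2(imageSize(dst)), 0.0, 1.0));\n"
    "}\n";

// GL_COMPUTE_SHADER is the new shader stage in 4.3.
GLuint cs = glCreateShader(GL_COMPUTE_SHADER);
glShaderSource(cs, 1, &cs_src, NULL);
glCompileShader(cs);
GLuint prog = glCreateProgram();
glAttachShader(prog, cs);
glLinkProgram(prog);

// Bind a pre-created RGBA32F texture to image unit 0 and dispatch.
glUseProgram(prog);
glBindImageTexture(0, tex, 0, GL_FALSE, 0, GL_WRITE_ONLY, GL_RGBA32F);
glDispatchCompute(512 / 16, 512 / 16, 1);
glMemoryBarrier(GL_SHADER_IMAGE_ACCESS_BARRIER_BIT); // make writes visible before sampling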

Yay!
Interesting!

For us lazy programmers, please give a short summary of the benefits or disadvantages, as you see it!
Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/
OpenGL compute shaders... or 'oops, we got it wrong and MS got it right... quick backtrack!'

I skimmed a few other things; basically it brings the features up to D3D11 level and continues the OpenGL tradition of 'here are 101 ways to do things... good luck with that!'

In fact could someone give me an update on the state of Direct State Access in OGL? 4.3 doesn't seem to have it as a feature and last I checked it was a case of some things and not all things...

For us lazy programmers, please give a short summary of the benefits or disadvantages, as you see it!

I think people that are more involved with all this can give more competent insights. g-truc has a nice review: http://www.g-truc.net/post-0494.html#menu

I literally just saw this hours ago and still have to go through all of it. I am mostly excited for compute shaders. The other additions I looked into seemed to be some obvious fixes ("layout(location = ...)" for uniforms, imageSize, etc.) as well as improvements to memory-related aspects that play nicely with compute shaders (ARB_shader_storage_buffer_object).
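To make the uniform location fix concrete, here's a hypothetical before/after snippet of my own (not from the spec):

// GLSL side: the location is now chosen in the shader itself...
//
//   layout(location = 3) uniform float u_time;
//
// ...so on the C++ side the glGetUniformLocation round trip disappears:
glUseProgram(prog);        // 'prog' is a hypothetical linked program
glUniform1f(3, seconds);   // 3 matches the layout qualifier above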
Not much to be excited about, it seems; I think most of it has been available as extensions for ages, as usual. The only things I find interesting are the texture parameter queries and shader storage buffers, although I guess the latter is just the old ext_texture_buffer in a new package. ES3 compatibility might be nice if we get some ES3-capable devices to play with.
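For reference, the storage buffer setup looks roughly like this on the API side (an untested sketch; 'data', the binding point, and the GLSL block are all made up):

// Shader storage buffers (ARB_shader_storage_buffer_object, core in 4.3).
GLfloat data[4096] = {0};
GLuint ssbo;
glGenBuffers(1, &ssbo);
glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
glBufferData(GL_SHADER_STORAGE_BUFFER, sizeof(data), data, GL_DYNAMIC_COPY);
glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);  // binding point 0

// Matching GLSL declaration; note the open-ended array, which buffer
// textures never gave you:
//
//   layout(std430, binding = 0) buffer Particles { vec4 positions[]; };

// And one of the new texture parameter queries (ARB_internalformat_query2):
GLint supported = GL_NONE;
glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA32F,
                      GL_SHADER_IMAGE_LOAD, 1, &supported);
// supported == GL_FULL_SUPPORT means imageLoad on this format is fine.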

It would be nice if moderators didn't try to start an API flame war, though; I think we get enough of those.

Edit: To phantom: I never said you weren't correct. I've replied to you in a PM instead, to avoid derailing this thread.
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!

It would be nice if moderators didn't try to start an API flame war, though; I think we get enough of those.


Really?

Tell me what was wrong with my statements?

Compute shaders - an admission that OpenCL/GL interop has failed to work.
Features - brings it up to the D3D11 feature level but, with at least one extension, introduces yet another way to do things.
DSA - a genuine question about the state of it...

Note: nowhere did I say 'D3D11 is better!' - all I did was call them out on the areas where they are still lacking, which is the API interface in general and MAYBE the state of DSA, which I asked about...

So, yeah, if not fawning over a new release that's an incremental update to an outdated API is 'starting a flame war', then fine, I started a flame war...
phantom is actually correct, although the wording chosen can certainly come across as "let's start a flame war" - but look beyond that at what the update actually does have to offer.

All the same, I'm feeling moderately stoked about this update, despite there being a high percentage of "things that should have been done ages ago" in it. There are some genuine API usability improvements in there, and the push for standardised, patent-free texture compression formats seems well-intentioned at least. Give us the ability to use FBOs with the default depth/stencil buffer in the next one (if that didn't sneak into this one somewhere), some DSA, and some solid drivers, and things will be really looking good.

Overall though I suspect that ES3 is going to turn out to be the more significant recent release.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Still waiting for OpenGL 2 Lean & Mean, and Longs Peak...
-* So many things to do, so little time to spend. *-
CL/GL interop didn't fail to work; it worked quite well. I have to admit that it is way more complicated than DX11 compute shaders, BUT it DID work. In fact, I could port DX11 compute shaders to OpenCL and make them work together with OpenGL; see:
http://www.gamedev.n...via-opencl-r233
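The sharing itself is only a handful of calls; from memory it goes something like this (a sketch with error handling omitted; ctx, queue, and tex are assumed to exist already):

// Wrap an existing GL texture as a CL image (OpenCL 1.2 entry point;
// older drivers use clCreateFromGLTexture2D instead).
cl_int err = CL_SUCCESS;
cl_mem img = clCreateFromGLTexture(ctx, CL_MEM_WRITE_ONLY,
                                   GL_TEXTURE_2D, 0, tex, &err);

glFinish();                                      // GL must be done with it first
clEnqueueAcquireGLObjects(queue, 1, &img, 0, NULL, NULL);
// ... clEnqueueNDRangeKernel(...) writes into img here ...
clEnqueueReleaseGLObjects(queue, 1, &img, 0, NULL, NULL);
clFinish(queue);                                 // hand it back to GL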
I'm looking forward to trying out OGL compute shaders though, as it seems more reasonable to use it for processing textures / lighting.
The debugging feature is quite an improvement too, as such functionality was missing before.
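For anyone who hasn't looked at it yet, hooking up the new debug output is pleasantly small (a sketch, assuming a context created with the debug flag):

#include <cstdio>

// KHR_debug (core in 4.3): the driver calls you back with its own messages.
void APIENTRY on_gl_debug(GLenum source, GLenum type, GLuint id,
                          GLenum severity, GLsizei length,
                          const GLchar* message, const void* userParam)
{
    fprintf(stderr, "GL debug: %s\n", message);
}

// During init:
glEnable(GL_DEBUG_OUTPUT);
glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);   // report inside the offending call
glDebugMessageCallback(on_gl_debug, NULL);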
Compute shaders - an admission that OpenCL/GL interop has failed to work.
Features - brings it up to the D3D11 feature level but, with at least one extension, introduces yet another way to do things.
DSA - a genuine question about the state of it...
Note: nowhere did I say 'D3D11 is better!'


phantom is actually correct, although the wording chosen can certainly come across as "let's start a flame war" - but look beyond that at what the update actually does have to offer.
None of that is surprising, though. And indeed D3D11 is "better", except for the little detail that it's proprietary and Windows-only (which, as it happens, is the one important detail for me personally).

OpenGL is necessarily worse because it is designed by committee (ARB, Khronos, name it as you like). In addition to design by committee always being somewhat troublesome, this particular committee has contained, and still contains, members with strongly opposed interests.

I won't cite Microsoft, who certainly have no interest in OpenGL becoming as good as or better than their own product, because Microsoft is no longer involved (...at least officially). However, there's still Intel as a good example of an entity that is officially involved.
Intel, who already struggle to support OpenGL 3.x on their Sandy/Ivy Bridge CPUs, have a strong motivation not to add too many features too quickly. Promoting CPUs with integrated graphics is much harder if people have the impression that they don't support the most modern features. Thus, advertising OpenGL and pushing its development forward lessens revenue.

Companies like AMD and nVidia, on the other hand, have a (rather obvious) strong interest in pushing new features onto the market, because this allows them to sell new cards. But then again, supporting both D3D and OpenGL means twice the driver development cost actually necessary. If 90-95% of the software in their target market already uses D3D anyway, that's a bad deal. So again, even though there is some motivation, it is not necessarily overwhelming for OpenGL as such. If people buy the new nVidia 780 GTX Ultra because it supports D3D 12.1, which is needed to play Warcraft Ultimate, then that's just as good.
