Why I Hate Developing in OpenGL

Published May 09, 2007
I've decided that I'm pretty tired of repeating this all the time, so I may as well write it all down.

  1. The official spec doc is completely useless; you have to refer to the extension specs for any kind of useful information.
  2. The extensions thing is irritating, and you usually end up using several dozen by the time you're done doing anything vaguely serious.
  3. Everything's a freaking GLuint. I end up wrapping things up in classes for textures, VBs, IBs, etc. in order to get real type safety (see the first sketch after this list). By the time I'm done, things pretty much look like D3D.
  4. Can't store vertex declarations in a convenient object, like you can in D3D. I usually end up writing vertex declarations myself and building a function that calls the *Pointer functions appropriately (see the second sketch after this list).
  5. Every time you change VBO, your vertex setup is trashed and you have to call all the Pointer functions again.
  6. Because everything goes through binding, you can't even make simple assertions about the state of the pipeline between two draw calls. That is, when I'm getting ready to submit a batch, I cannot guarantee that the current VBO has not changed since the last draw call, because any MapBuffer, BufferData, etc. calls in the middle probably included a BindBuffer.
  7. No way to compile a GLSL shader into a quickly loadable binary form.
  8. No coherent SDK type docs. If you want to find out about something, you figure out what it was named when it was an extension, and look it up.
  9. Lack of a tool suite comparable to D3D (debug runtime, PIX). There are a few passable tools for free, and then there's gDEBugger, if you can afford to shell out for it. There's just so much more that's readily available in D3D.
  10. Lots and lots of unnecessary cruft in the API. To be fair, D3D 9 is guilty too. Both D3D 10 and OGL 3 seek to solve this problem; the difference is that one has materialized as a product.
  11. FBO and antialiasing don't mix. (Ok, so this is actually Ysaneya's complaint, but I'm willing to take his word on it.) There's a new extension for this, but only the GeForce 8 series exposes it.
  12. GLSL can't be used on hardware that predates shader 2.0. This is getting less important as time goes on, but it's an irritating limit in the meantime.
  13. Developing GLSL on an NVIDIA card is a pain, because they simply run it through a slightly modified Cg frontend. Long story short, a lot of illegal GLSL code will pass through on NVIDIA hardware, whereas ATI will flag it appropriately.
  14. There are some bizarre cases of strictness in GLSL. For example, there's no implicit casting of literal integers to floats/doubles. So a 1.0 will compile, but 1 will break.
  15. You can't really query if all GLSL functionality is available or not. The singular example is the noise() function. Very nearly nobody implements it, choosing instead to return a constant (usually black). You can't detect this failure, at all.
  16. Lack of a D3DX equivalent. Math, image loading, etc. Getting a simple OpenGL application working without completely reinventing the wheel requires tapping about a half dozen libraries.
  17. Related to the above, there's no D3DX Effect style functionality. If you've ever had the misfortune of working with CgFX, you know it's not really a great option.
  18. You can't change the multisample mode without destroying the entire render window outright.
  19. You can't create an FBO that is depth and stencil without another depth_stencil extension. That extension exists in NV and EXT forms, but no non-NVIDIA cards currently make it available.
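
To put some flesh on #3 and show where it ends up, here's a from-scratch sketch of the kind of wrapper I mean. This is illustration only, not my actual code, and it assumes GL 1.5-level entry points (or a loader like GLee) for the buffer functions.

#include <GLee.h>   // or any loader/header set that exposes GL 1.5 entry points

class Texture2D
{
public:
    Texture2D()  { glGenTextures(1, &m_id); }
    ~Texture2D() { glDeleteTextures(1, &m_id); }

    void Bind() const { glBindTexture(GL_TEXTURE_2D, m_id); }

private:
    Texture2D(const Texture2D&);            // non-copyable, 2007-era C++ style
    Texture2D& operator=(const Texture2D&);

    GLuint m_id;   // the raw handle never leaks out, so it can't be mixed up
};

class VertexBuffer
{
public:
    VertexBuffer()  { glGenBuffers(1, &m_id); }
    ~VertexBuffer() { glDeleteBuffers(1, &m_id); }

    void Bind() const { glBindBuffer(GL_ARRAY_BUFFER, m_id); }

private:
    VertexBuffer(const VertexBuffer&);
    VertexBuffer& operator=(const VertexBuffer&);

    GLuint m_id;
};

Once the raw GLuint is hidden, passing a VertexBuffer where a Texture2D is expected becomes a compile error, which is exactly the type safety the bare handles don't give you.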
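
And for #4/#5, the hand-rolled vertex declaration boils down to something like this. Again, a simplified sketch with made-up names, covering only the fixed-function *Pointer calls.

#include <GL/gl.h>   // the *Pointer calls used here are all core 1.1

struct VertexDecl
{
    GLsizei stride;
    GLint   positionSize;   GLsizei positionOffset;
    bool    hasNormal;      GLsizei normalOffset;
    GLint   texCoordSize;   GLsizei texCoordOffset;   // texCoordSize == 0 means none
};

void ApplyVertexDecl(const VertexDecl& d)
{
    // Offsets are byte offsets into whatever VBO is currently bound.
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(d.positionSize, GL_FLOAT, d.stride,
                    (const GLvoid*)d.positionOffset);

    if (d.hasNormal)
    {
        glEnableClientState(GL_NORMAL_ARRAY);
        glNormalPointer(GL_FLOAT, d.stride,
                        (const GLvoid*)d.normalOffset);
    }

    if (d.texCoordSize > 0)
    {
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glTexCoordPointer(d.texCoordSize, GL_FLOAT, d.stride,
                          (const GLvoid*)d.texCoordOffset);
    }
}

The annoying part is that this has to be re-run after every BindBuffer, because the offsets are interpreted against whatever VBO happens to be bound; D3D lets you build the declaration once and just set it.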

Comments

Ravuya
#2 I mostly get around it by using GLee; then I just have to check whether the extension is there, without having to screw around with loading it myself (rough example at the end of this comment).

#3 isn't a problem I've ever encountered, really, probably because my first instinct was to wrap it.

#4 agreed.
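
The check ends up being about one line; I'm going from memory on the exact flag name GLee generates, so treat this as approximate:

#include <GLee.h>

bool ExtensionAvailable_VBO()
{
    // GLee sets a boolean per extension once a context exists; the exact
    // flag name here is from memory, so double-check it against GLee.h.
    return GLEE_ARB_vertex_buffer_object != 0;
}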
May 09, 2007 04:31 PM
JTippetts
So, switch to D3D. Problem solved.
May 09, 2007 05:15 PM
Promit
Quote:Original post by JTippetts
So, switch to D3D. Problem solved.
Well, obviously. The point is to explain WHY I switched (which happened around Nov 05), and why I'm so heavily cynical when I talk about OpenGL.
May 09, 2007 05:32 PM
_the_phantom_
1. It's only useless if you are looking up extension information... no surprise there.
2. If you are referring to 'no beyond 1.1 functionality' then yes, that is bothersome (less so with GLee), but that's a problem which can be put at MS's door.
3. It's a C library.
4, 5, 6 - Fixed in Longs Peak; in fact I'd argue the VAO object system is better than the D3D equivalent.
7. Granted, though this will probably be fixed 'soon' with the merge with Khronos.
8. Not sure I grok this one; if I want to find out about a function I either (a) look it up in the red book or (b) bang the function name into google, first hit normally returns what I need
9. Yes, this does suck
10. June is all I have to say about Longs Peak; unlike the ill-fated 2.0 spec, this one has had a lot more community input, exposure, coherence and cooperation than was around when 3DLabs tried to push GL2.0.
11. Not really an OGL problem imo, more a case of vendors not bothering to implement it (the equivalent moan about D3D would be not properly enforcing VTF, so that ATI could produce an SM3.0-compliant card without that feature included as working)
12. Screw pre-2.0 hardware [razz]
13. Again, a vendor problem not an OGL one as such
14. iirc there are no implicit casts at all, makes life easier for the compiler afaik
15. For core-GLSL functionality, yes, this can be bothersome... however GLSL can also be extended (glDrawBuffers for MRT output, for example), which is flaggable
16. On the ARB's todo list; probably should have been introduced sooner (but then, most people will probably still roll their own due to 'not invented here' syndrome)
17. I think this is being rolled in with Collada or something like that... I know there are plans, again due to the Khronos merging thing
18. Crippled window managers; personally I can't see a good reason why Windows won't let you destroy a DC and create a new one with a new setup, but apparently it just won't *shrugs*
19. This one is wrong for definite [grin] ATI's Vista OpenGL driver, which is somewhat better than their XP driver in fact, has this extension available and it appears to work just fine (although I need to do more testing on it, I just lack the time until later this month)

I'd just like to point out that the idea wasn't to 'disprove' your problems, more to form a counterpoint to them [smile]
May 10, 2007 12:31 AM
Evil Steve
I've not really used OpenGL at all, but the main things that put me off were #16 (no D3DX) and the whole extensions mess - which to me seems so much more confusing than the D3D caps method.
May 10, 2007 03:54 AM
jollyjeffers
Quote:7. No way to compile a GLSL shader into a quickly loadable binary form.

12. GLSL can't be used on hardware that predates shader 2.0. This is getting less important as time goes on, but it's an irritating limit in the meantime.

13. Developing GLSL on an NVIDIA card is a pain, because they simply run it through a slightly modified Cg frontend. Long story short, a lot of illegal GLSL code will pass through on NVIDIA hardware, whereas ATI will flag it appropriately.

14. There are some bizarre cases of strictness in GLSL. For example, there's no implicit casting of literal integers to floats/doubles. So a 1.0 will compile, but 1 will break.

15. You can't really query if all GLSL functionality is available or not. The singular example is the noise() function. Very nearly nobody implements it, choosing instead to return a constant (usually black). You can't detect this failure, at all.
[wow] Serious on all these?!

I suppose #12 and #13 have similarities with D3D9, but whoa - I thought GLSL was an equal to, if not better than, HLSL! #15 alone would be enough to have me running for the hills, never to touch it again - how the hell did any sane ARB/Khronos/IHV screw that up!?

Jack
May 10, 2007 05:36 AM
Promit
Time for my counter-counter-point [grin]
Quote:Original post by phantom
3. Its a C library.
Not an excuse. So is Win32, but you can't use an HWND as a thread without an explicit cast. In fact, I'd advocate a model similar to Win32, where dummy struct pointers are used to fabricate type safety, without significantly altering the API setup.
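
To be concrete about what I mean (made-up names, obviously -- nothing like this exists in any GL header):

#include <GL/gl.h>

/* Entirely made-up illustration of the dummy-struct-pointer idea, in the
   spirit of Win32's DECLARE_HANDLE; none of these names are real GL. */
typedef struct GLtexture_T* GLtexture;
typedef struct GLbuffer_T*  GLbuffer;

/* Hypothetical entry points: handing a GLtexture to a buffer call is now a
   compile error, and the call style otherwise looks the same as today. */
void glBindTextureObject(GLenum target, GLtexture texture);
void glBindBufferObject(GLenum target, GLbuffer buffer);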
Quote:4, 5, 6 - Fixed in Longs Peak, infact I'd argue the VAO object system is better than the D3D equvliant
I bet it is. I have basically two different objections here. One is that, in light of the castration of OpenGL 2.0, I'll believe it when I see it. Two is that Longs Peak is god only knows how far out. D3D 10 includes a lot of rough limitations as far as hardware and OS, but at least it exists in a material form.
Quote:10. June is all I have to say about Longs Peak; Unlike the ill fated 2.0 spec this one has had alot more community input, exposure and more coherance and cooperation than was around when 3DLabs tried to push GL2.0.
That's good to hear. Maybe I can figure out more when I start at NVIDIA.
Quote:19. This one is wrong for definate [grin] ATI's Vista OpenGL Driver, which is somewhat better than there XP driver infact, has this extension available and it appears to work just fine (although I need to do more testing on it, I just lack the time until later this month)
I'll have to take your word for it. I cross-checked on Delphi3D's extension database and this was the report I got.

The rest of the problems can be basically summed up in two categories:

* It's the vendor's fault. Not a false point, but as a developer, I'm not particularly interested in who broke it. It's broken, and that's irritating. The three major companies involved (MS, NVIDIA, ATI) are all at fault in various places -- and I've been on the payroll of one and am about to join the payroll of another.

* It's gonna be fixed soon/eventually. Half of these, we don't even know when "eventually" is going to happen. Half are linked to OGL 3, and again, I'll believe it when I see it.

I don't know that any of these problems on their own are breaking points (although the lack of effect-style stuff is a real frustration). All together though, they make developing in OpenGL an unpleasant experience, for me. (And I modified the title of the post slightly to reflect this.)
May 10, 2007 12:42 PM
Drilian
Quote:Original post by phantom
14. iirc there are no implicit casts at all, makes life easier for the compiler afaik


I find this to be a rather bad argument. For a compiler, once it's written, nothing is "hard." Difficulty doesn't apply. Sure, it's a TAD harder to write something that interprets "10" as "10.0" or "10.0f", but come ON. It's ridiculous to have to add the decimal place JUST BECAUSE THE COMPILER WRITER DIDN'T FEEL LIKE SUPPORTING IT. That's crap.

I'm much more a fan of frontloading complexity for the developer in order to reduce complexity for the user (as long as there is no decrease in functionality).

Being able to type

float x=1;

instead of having to type
float x=1.0;

may seem like a tiny complaint, but most people are used to being able to do exactly that in many EXISTING programming languages.

Screw making it easier for the compiler. Make it easier for the developers.
May 10, 2007 02:02 PM