I've started studying high-dynamic-range rendering to implement it in my engine, and I've been fighting OpenGL and its extension mechanism for many hours.
So it's time for the OpenGL rant.
First, I consider myself an OpenGL expert. I develop vertex and fragment shaders daily at work, and I've been using a lot of advanced extensions for years, so I'm definitely not a newbie. OpenGL has been my API of choice for the last 5 years, but it's getting really annoying. The number of extensions keeps growing in order to maintain backwards compatibility, which means that today you've got up to a hundred extensions listed in the extension string, on both NVidia and ATI cards. Most of these extensions exist in two or three versions, some with subtle differences and restrictions.
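For the record, here's the mechanism I'm complaining about: every capability check hangs off one giant space-separated string. A minimal sketch (this is the naive substring test you'll find everywhere; a robust version would match whole tokens):

    #include <GL/gl.h>
    #include <cstring>

    // Scan the extension string for a given name.
    // Naive: a name that is a prefix of another extension matches both.
    bool hasExtension(const char* name)
    {
        const char* all =
            reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return all != NULL && std::strstr(all, name) != NULL;
    }

And then the renderer branches on hasExtension("GL_NV_float_buffer") versus hasExtension("GL_ATI_texture_float"), each with its own enums and restrictions. Which is exactly the problem.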
While implementing floating-point pbuffers today, I discovered that you can only create a floating-point texture on the GL_TEXTURE_RECTANGLE target, even if its dimensions are square. That's right: using GL_TEXTURE_2D will generate a nice GL error.
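Here's a minimal sketch of what that looks like on the NVIDIA path (assuming NV_float_buffer; GL_TEXTURE_RECTANGLE_NV and GL_FLOAT_RGBA16_NV are the enums from that extension):

    #include <GL/gl.h>
    #include <GL/glext.h>

    GLuint createFloatTexture(int width, int height)
    {
        GLuint tex;
        glGenTextures(1, &tex);

        // Only the rectangle target accepts the float internal format.
        glBindTexture(GL_TEXTURE_RECTANGLE_NV, tex);
        glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_FLOAT_RGBA16_NV,
                     width, height, 0, GL_RGBA, GL_FLOAT, NULL);

        // Rectangle textures have no mipmaps, so use a non-mipmapped filter.
        glTexParameteri(GL_TEXTURE_RECTANGLE_NV,
                        GL_TEXTURE_MIN_FILTER, GL_NEAREST);

        // The exact same glTexImage2D call on GL_TEXTURE_2D raises a
        // GL error instead, even when width == height.
        return tex;
    }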
So let's see how this affects my engine. I've got a nice, generic interface for creating a renderable buffer. One of the constructor arguments is the buffer's format: it can be a standard unsigned-byte RGBA buffer or a floating-point RGBA buffer. Depending on that format, the constructor internally generates either a 2D texture or a RECTANGLE texture.
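In code, the constructor now has to pick the target from the format. A hypothetical sketch (RenderableBuffer, BufferFormat and the member names are made up for illustration, not my actual engine code):

    #include <GL/gl.h>
    #include <GL/glext.h>

    enum BufferFormat { FORMAT_RGBA8, FORMAT_RGBA16F };  // hypothetical

    class RenderableBuffer
    {
    public:
        RenderableBuffer(int width, int height, BufferFormat format)
        {
            // The float format forces the rectangle target; the byte
            // format can stay a plain 2D texture.
            bool isFloat = (format == FORMAT_RGBA16F);
            target_ = isFloat ? GL_TEXTURE_RECTANGLE_NV : GL_TEXTURE_2D;

            glGenTextures(1, &texture_);
            glBindTexture(target_, texture_);
            glTexImage2D(target_, 0,
                         isFloat ? GL_FLOAT_RGBA16_NV : GL_RGBA8,
                         width, height, 0, GL_RGBA,
                         isFloat ? GL_FLOAT : GL_UNSIGNED_BYTE,
                         NULL);
        }

        GLenum target() const { return target_; }  // leaks the abstraction

    private:
        GLenum target_;
        GLuint texture_;
    };

Note the target() accessor: the whole point of the interface was that callers shouldn't care, and as we'll see, now they have to.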
But extensions are soooo wonderful in OpenGL that texture coordinates are not handled the same way for 2D and RECT textures: coordinates are normalized to [0, 1] for 2D textures, but expressed in unnormalized texels for rectangular ones.
As a result, the user has to test whether OpenGL gave him a 2D or a rectangular texture and adjust his coordinates accordingly (see the helper below), while in theory everything was in place for this to stay coherent and hidden from the user.
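So every user of the buffer ends up carrying a helper like this one (hypothetical, matching the sketch above):

    #include <GL/gl.h>
    #include <GL/glext.h>

    // Convert a normalized UV pair into whatever the given target
    // expects: unchanged for GL_TEXTURE_2D, scaled to texel units
    // for the rectangle target.
    void adjustTexCoord(GLenum target, int width, int height,
                        float u, float v, float* outU, float* outV)
    {
        if (target == GL_TEXTURE_RECTANGLE_NV) {
            *outU = u * width;   // e.g. u = 1.0 becomes 512.0 on a 512-wide buffer
            *outV = v * height;
        } else {
            *outU = u;
            *outV = v;
        }
    }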
PBuffers are close to a nightmare to use, and the context switches and texture copies they require kill performance for nothing. That's why I'd like to use framebuffer objects instead, but they are not supported on ATI cards yet. Argh.
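For comparison, here's roughly what the EXT_framebuffer_object path looks like once drivers support it: no second context, no copy, drawing lands directly in the texture. A sketch, assuming the EXT entry points are resolved:

    #include <GL/gl.h>
    #include <GL/glext.h>

    // Attach an existing texture as the color render target.
    GLuint attachAsRenderTarget(GLenum target, GLuint texture)
    {
        GLuint fbo;
        glGenFramebuffersEXT(1, &fbo);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,
                                  GL_COLOR_ATTACHMENT0_EXT,
                                  target, texture, 0);

        // Float formats are exactly where incompleteness errors tend
        // to show up, so always check.
        if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
                != GL_FRAMEBUFFER_COMPLETE_EXT) {
            // handle the error / fall back to pbuffers...
        }
        return fbo;
    }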
OpenGL has become a big mess, and I'm putting it mildly.
Sorry for the rant.