OpenGL: I thought OpenGL was widely supported...

Posted by Joshua Olson


From PyGame's front page:

"With many people having broken OpenGL setups, requiring OpenGL exclusively will cut into your user base significantly. Pygame uses either opengl, directx, windib, X11, linux frame buffer, and many other different backends... including an ASCII art backend! OpenGL is often broken on linux systems, and also on windows systems - which is why professional games use multiple backends."

PyGame is often said to be a good choice for Python developers who want to support older hardware, but should I, as a newbie C++ game developer, be concerned that OpenGL isn't as widespread as I thought before reading this?

Pretty much everything runs OpenGL. What exactly are you worried about? If you're aiming for a Steam game, everyone is going to have OpenGL in some version; you can verify that on Steam's hardware statistics page, and the common baseline will probably be GL 2.0. All the embedded/mobile stuff runs OpenGL ES 1.x to 2.0. That blurb suggests GL is broken on Linux (not sure why), but Direct3D, the only real alternative, doesn't even run on anything other than Windows.
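If you want to know what a given machine actually provides, the version string is queryable at runtime. A minimal sketch, assuming a GL context has already been created and made current (the helper name is mine):

#include <GL/gl.h>   // on Windows, include <windows.h> before this
#include <cstdio>

// Call after a context is current. glGetString has been core since
// OpenGL 1.0, so this is safe on any driver.
void PrintGLInfo()
{
    const char* version  = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    const char* vendor   = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));

    std::printf("OpenGL %s on %s (%s)\n",
                version  ? version  : "unknown",
                renderer ? renderer : "unknown",
                vendor   ? vendor   : "unknown");
}

The vendor and renderer strings are also handy for spotting the problem drivers discussed later in this thread.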

It depends on what your target hardware is. On the high end you have reasonably solid OpenGL support on all platforms; it's the low end (Intel in particular) that is problematic. If you stick to OpenGL 1.1, Intel works just fine as well (2.0-2.1 is usually "supported" too, but you have to be a bit more careful with what you do, since the drivers are often quite bad). In general, Intel works a lot better with D3D.
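On a 1.1 baseline, anything newer (VBOs, shaders, and so on) has to be detected at runtime before you rely on it. A minimal sketch of an extension check, assuming a current legacy GL context (the helper name and the example extension are mine):

#include <GL/gl.h>   // on Windows, include <windows.h> before this
#include <cstring>

// True if `name` appears in the space-separated extension string.
// A bare strstr() is not enough, because one extension name can be a
// prefix of another, so the bounding characters are checked too.
bool HasExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!all) return false;

    const std::size_t len = std::strlen(name);
    for (const char* p = std::strstr(all, name); p; p = std::strstr(p + 1, name))
    {
        const bool startOk = (p == all)     || (p[-1]  == ' ');
        const bool endOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startOk && endOk) return true;
    }
    return false;
}

// Usage: only take the VBO path if the driver really advertises it.
// if (HasExtension("GL_ARB_vertex_buffer_object")) { ... }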

On Linux there is another big problem, though: AMD drops Linux support for its GPUs insanely quickly, so once a card is a few years old, its users will very likely have to resort to the open-source driver (if they don't want to stick with an old OS), and most of the time that driver simply doesn't get things working properly fast enough, leaving a fairly large number of GPUs without a fully working driver. Given Linux's small market share, however, I wouldn't worry too much about that; Linux users who want to play games are likely to use an NVIDIA GPU anyway, or to upgrade their AMD GPU frequently enough to avoid problems.

Well, I'll be mostly developing on a one-year-old laptop with 6 GB of RAM and a dual-core i3, which supports OpenGL 2.1. Not sure if that's mid or low end. What I want to program is a simulation and editor, which will be extremely simple but should still have as high a limit on world size/complexity as possible. If OpenGL is really a limiting factor, it might be better for me to use PyGame than pure C/C++. How does Linux do 3D without OpenGL? I thought Linux was a great OS for coding that sort of thing?

I don't know about the Linux side of things, but so far as Windows is concerned it's broadly true.

OpenGL support on Windows falls into one of three categories:

Integrated Intel graphics, which have buggy drivers and poor performance. This constitutes the vast majority of Windows installations: business-class desktops and laptops, and low-end "multimedia" PCs of the high-street chain-store kind. These often ship with OEM or Microsoft drivers, which frequently means no OpenGL support at all unless the user updates their drivers from Intel (which requires a "have disk" install, and would be daunting, but not impossible, for the average non-technical user).

AMD/ATI. See the recent fiasco surrounding Rage for a pretty good example of the quality of their OpenGL support. In fairness, when they get it right they can be pretty good, but they seem to be in full-on reactive mode, waiting until the latest id Software title throws up a bunch of crash bugs before focusing on any kind of OpenGL driver QA.

NVIDIA. They're rock-solid but dangerous: NVIDIA drivers will often accept completely invalid OpenGL code (their shader compiler will even accept HLSL syntax), meaning that anything primarily developed and tested on NVIDIA is going to be suspect on other hardware.

Unfortunately it's been this way for a while. Fighting broken drivers under OpenGL is pretty normal.

Specific example.

Autodesk Inventor moved from OpenGL to Direct3D and a writeup explaining this move is available here: http://archicad-talk.graphisoft.com/files/autodesk_inventor_opengl_to_directx_evolution_788.pdf

It's worth reading the full thing, but here's a summary extract:
"With Direct3D, our QA team can focus on testing _our_ code and finding defects in _our_ graphics code, instead of having to spend all their time just verifying that the graphics HW vendors have done their job correctly to produce an OpenGL graphics driver that actually works."

Thanks for the specifics. Let me be more specific, then. I won't need shaders. I might not even need textures (although that'll be nice to have in the distant future). I just want to draw lines and a polygon, and rotate and translate the world. That's it. Those core things shouldn't be buggy, right?

Those things will work just fine.
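For scale, here's a minimal sketch of exactly that workload: a couple of lines and one polygon, with the world rotated and translated via the modelview matrix, using only OpenGL 1.1 fixed-function calls. Window and context creation via GLFW is my assumption; any context-creation library works:

#include <GLFW/glfw3.h>   // pulls in the GL headers as well

int main()
{
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(640, 480, "Lines and a polygon", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    float angle = 0.0f;
    while (!glfwWindowShouldClose(window))
    {
        glClear(GL_COLOR_BUFFER_BIT);

        // "Rotate and translate the world": the modelview transform.
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(0.0f, 0.0f, 0.0f);
        glRotatef(angle, 0.0f, 0.0f, 1.0f);
        angle += 0.5f;

        // A couple of lines...
        glBegin(GL_LINES);
            glVertex2f(-0.8f, 0.0f); glVertex2f(0.8f, 0.0f);
            glVertex2f(0.0f, -0.8f); glVertex2f(0.0f, 0.8f);
        glEnd();

        // ...and one convex polygon.
        glBegin(GL_POLYGON);
            glVertex2f(-0.3f, -0.3f);
            glVertex2f( 0.3f, -0.3f);
            glVertex2f( 0.0f,  0.4f);
        glEnd();

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}

Everything here is OpenGL 1.1, so even Microsoft's generic software fallback driver can run it.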

You can protect yourself by abstracting things. My library uses compile-time dependency injection to instantiate either a D3D9 or an OpenGL 3.3 "facade" template. The facade creates and returns whichever object type it was instantiated with. There are only a limited number of classes you need to code (both versions must have the same interface, of course). The facade itself looks something like this:

#include <memory>

template<class VertexBuffer,
         class IndexBuffer,
         class ArrayBuffer,
         class VertexShader,
         class PixelShader,
         class GeometryShader,
         class ShaderProgram>
class GraphicsFacade
{
public:
    // Expose the injected type so calling code can name it without
    // knowing which backend was chosen.
    typedef VertexBuffer VertexBufferType;

    // Create and return a new vertex buffer of whichever backend
    // this facade was instantiated with.
    std::shared_ptr<VertexBuffer> CreateVertexBuffer()
    {
        std::shared_ptr<VertexBuffer> buffer(new VertexBuffer);
        return buffer;
    }
};

Where the template parameters are either GL33 or D3D9 classes. Given a typedef naming the chosen instantiation (call it ActiveFacade), I can create and use a vertex buffer as above like this:

std::shared_ptr<ActiveFacade::VertexBufferType> MyVertexBuffer(graphicsFacade->CreateVertexBuffer());

and decide at compile time which version I want to use.
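A sketch of what that compile-time selection could look like; the GL33*/D3D9* class names and the USE_D3D9 switch are placeholders of mine, not from the post above:

// Each backend supplies the same set of class names with identical
// interfaces; a single typedef picks the whole family at build time.
#ifdef USE_D3D9
typedef GraphicsFacade<D3D9VertexBuffer, D3D9IndexBuffer, D3D9ArrayBuffer,
                       D3D9VertexShader, D3D9PixelShader, D3D9GeometryShader,
                       D3D9ShaderProgram> ActiveFacade;
#else
typedef GraphicsFacade<GL33VertexBuffer, GL33IndexBuffer, GL33ArrayBuffer,
                       GL33VertexShader, GL33PixelShader, GL33GeometryShader,
                       GL33ShaderProgram> ActiveFacade;
#endif

// The rest of the engine only ever refers to ActiveFacade, so no
// rendering code outside the backend classes mentions GL or D3D directly.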

It's a little more involved than that, but you get the idea. Also, as others have said, some drivers will happily accept invalid code. I hit this problem at work when my previously fine OpenGL app failed on an Intel netbook: it turned out I was doing something invalid inside a display list, and both NVIDIA and AMD cards were happy to run with it. The Intel driver, however, threw an exception! So yes, care must be taken with GL. The trick is to run it on as many different systems as possible.
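Aggressive error checking during development also catches a lot of that kind of invalid usage before a strict driver does it for you. A minimal sketch of such a helper (the names are mine, not from any particular library):

#include <GL/gl.h>   // on Windows, include <windows.h> before this
#include <cstdio>

// Drain the GL error queue and report where the check happened.
// glGetError can hold several queued errors, hence the loop.
inline void CheckGLErrors(const char* file, int line)
{
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::fprintf(stderr, "GL error 0x%04X at %s:%d\n", err, file, line);
}

#define GL_CHECK() CheckGLErrors(__FILE__, __LINE__)

// Usage: sprinkle GL_CHECK() after suspect calls during development,
// e.g. after glEndList() when debugging a display list.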
