What OpenGL Implementation Do Real Games Use?

Started by
13 comments, last by Brother Bob 11 years, 2 months ago

Since there are many official (and unofficial) desktop OpenGL implementations, which do actual 3D games use? GLUT? FreeGLUT? Mesa3D?

Which performs the best?

C dominates the world of linear procedural computing, which won't advance. The future lies in MASSIVE parallelism.


Neither GLUT nor FreeGLUT are OpenGL implementations. They are windowing layers for OpenGL. Mesa3D is a software implementation that provides limited possibilities for hardware acceleration.

If you're on Windows, there's pretty much only one sensible option; you use the default implementation that the operating system provides. The benefit with using that implementation is that each hardware manufacturer provides their own hardware driver for it. Just link the standard opengl32.lib that ships with your compiler to use it.

Mesa3D is the primary implementation of OpenGL on UNIX and supports several hardware graphics adapters (more than the default implementation on Windows, in fact).

But, yes it can also be compiled to be software only.
http://tinyurl.com/shewonyay - Thanks so much for those who voted on my GF's Competition Cosplay Entry for Cosplayzine. She won! I owe you all beers :)

Mutiny - Open-source C++ Unity re-implementation.
Defile of Eden 2 - FreeBSD and OpenBSD binaries of our latest game.

So is Mesa 3D the best choice? Also, Brother Bob mentioned opengl32.lib. What is the header file for it?

C dominates the world of linear procedural computing, which won't advance. The future lies in MASSIVE parallelism.

The standard <GL/gl.h>.

Note that opengl32.lib only supports up to OpenGL 1.1. If you want to use any newer functionality, you need to get it from the graphics card drivers. Your best option is to use a library like GLEW (http://glew.sourceforge.net/) to load that functionality for you.

Define what an “actual” 3D game is.
Low-level indie games could use anything. Many people use wrappers such as SDL or SFML, but mainly just to help them through the learning process or to get quick but not-so-serious results.

Medium-level indie games get closer to the direct API of choice, but it is not consistent enough to say what they “commonly” use, and at this point the target platform becomes much more of a decision-maker. Of course the mobile industry is booming so it is worth mentioning that for mobile platforms they will all be using raw OpenGL ES 2.0.
But at this level those who are developing for Windows start to lean more towards DirectX and to grow their own cross-platform engines (assuming you are not interested in those who use Unity 3D, Unreal Engine, etc., since you seem to want to get hands-on with your work).
At this level it is not always feasible, in terms of skill or finances, to make a DirectX port of an existing OpenGL engine, but even those who stick with OpenGL tend more and more towards using it raw (no wrappers).

At the AAA end of the scale things become more consistent but there is still no single answer.
By this point OpenGL is rarely used at all except for OpenGL ES 2.0 for mobiles. Consoles and hand-helds (such as Nintendo 3DS) often provide an OpenGL (or OpenGL ES 2.0) layer but developers avoid this for performance reasons—it is always faster to use the native API.
That carries over to PC, in which the native API is DirectX. As a result, most “actual” games (you didn’t define it so I can only assume what you meant) for the desktop market use DirectX when possible and OpenGL when no other options are available, and they strictly use raw OpenGL.
Generally the big game developers prefer to avoid OpenGL altogether if possible because it is like developing for Android—there are too many inconsistent implementations across vendors and the drivers are usually shoddy. What works on one machine is guaranteed not to work on some other machine out there.
Another reason is that, given the expectations on today’s graphics, they would require OpenGL 4.3, which Windows users only get by manually updating their graphics drivers if they have not already done so.

Valve is trying to put an end to this situation, and we may well start to see much better drivers (which means performance) and more consistent results in the future.


OpenGL is worth learning for 2 reasons:
  • There may be a surge in OpenGL games if Valve is successful in its Linux pursuit.
  • The mobile industry is booming and is a great place to start making your own indie games.
But if we assume that by “actual” you meant “AAA”, while there are always exceptions, the main answer is that they are using DirectX 11 first, then DirectX 9, then raw OpenGL if targeting Linux or Macintosh. Generally speaking.

And which implementation? I think you meant to ask which version. You don’t get to pick your implementation; that is up to the vendors to implement.
The version you want is up to you. Lower versions work across more machines, but your graphics will be pretty poor. If you want compute shaders you will need core version 4.3 or the GL_ARB_compute_shader extension. If you use extensions, prepare for headaches as you implement all the fall-backs for unsupported features. One more reason why the big guys stay away from OpenGL when possible.


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

As far as I know it is like this:

In the *nix world you get: Mesa (up to the 3.1 spec), the proprietary GPU drivers' implementations (up to 4.3 for NVIDIA, 4.2 for AMD) and the open-source drivers' implementations (I have no idea which version; enough to play Quake 3-based games).

In Windows you get: Microsoft's implementation (the 1.1 spec) and the GPU driver's implementation (up to 4.3 for NVIDIA, 4.2 for AMD).

In OSX you get: Apple's implementation (up to 3.2 for everyone).

EDIT: Corrected Mesa's and Apple's spec implementations.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

My journals: dustArtemis ECS framework and Making a Terrain Generator

In OSX you get: Apple's implementation (up to 3.1 for everyone).

OS X 10.8 (Mountain Lion) supports OpenGL 3.2 core with some OpenGL 3.3 features as extensions.


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

Just to clarify for MrJoshL: we don't really "choose" which implementation to use (unless we want to force software rendering and explicitly do so).

If we want hardware acceleration (i.e. access to the GPU), we just load the OpenGL implementation that is installed on the system (for NVIDIA cards it's NVIDIA's, for ATI cards it's ATI's). Since they're implementations, they implement everything they're supposed to; otherwise we would get a crash or a "cannot load routine from library" error and exit.

In Windows, the OpenGL system is called the OpenGL ICD (Installable Client Driver). If the driver (ATI, NVIDIA, Intel, SiS, S3, PowerVR, 3dfx, etc.) doesn't provide an OpenGL implementation, the application is routed to a software implementation from Microsoft which is very outdated (it supports the 1.1 spec), so if you're using anything higher your application will simply fail to load (much as Direct3D games fail when DirectX isn't installed).

When the driver does provide an implementation, the ICD routes calls to the driver's DLL.

In Linux, something very similar happens. Most distros ship the Mesa implementation (which is usually very up to date), and if you install proprietary drivers, the installer replaces the distro's libraries and symbolic links so that the driver's OpenGL implementation is used instead of Mesa's.

Every now and then the installer (either the driver's or the distro's package manager) may botch the installation and mix Mesa libraries with the driver's, and X11 will crash when launching a GL application (been there... multiple times). The situation has improved a lot in the last couple of years, though.

On Mac I have no idea how it works, but AFAIK Apple controls the implementation being shipped.

You can of course ship your game with the Mesa DLLs (it's the only implementation I'm aware of that could be licensed for that) and always use Mesa's implementation, but almost nobody would want to do that.

GLUT and FreeGLUT are layers that simplify the creation of a GL context, and they may deal with all this trouble (i.e. not having the right ICD installed, not having the required GL version, loading extensions, etc.), because all of this is DLL and function loading that has nothing to do with rendering triangles to the screen.

Edit: We just want to load the installed implementation and start rendering with hardware acceleration.

