Did OpenGL miss a golden opportunity against DirectX?

Started by Mercenarey
108 comments, last by Washu
I am no expert on OpenGL, but I took a look at their site and saw that the newest GLSL spec document is dated 07-Sept-2006 (http://www.opengl.org/documentation/glsl/). That tells me that GLSL has not been updated to match HLSL 4, and that means OpenGL missed a golden opportunity to give game developers (and, at some point, gamers) an alternative to switching to Windows Vista to drive the newest graphics cards with DirectX 10.

Microsoft bound DirectX 10 to the new Vista platform, which has flopped, and by doing that they have cut off the newest version of DirectX from mainstream gamers. If OpenGL had seized this opportunity and provided functionality in GLSL to match HLSL 4, it could have made the alternative game platform so strong that developers and gamers wouldn't have been forced to change to Vista to take advantage of the newest graphics cards. A shame really. Maybe it could have taught MS to stop trying to force its audience onto its newest Windows platform simply by binding DirectX to it. I don't like being taken hostage by a company just because it wants to force me off a perfectly good OS so it can make more money.

Too bad OpenGL hasn't been stronger and more on the beat, so it could have provided a serious alternative :(

(Of course, I might be totally wrong, since I haven't investigated OpenGL beyond looking at the release date of the latest GLSL spec, which is almost two years old. If I'm wrong about GLSL, please enlighten me.)
You do know that OpenGL supports extensions? (http://developer.download.nvidia.com/opengl/specs/g80specs.pdf)
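To illustrate how an extension actually gets used, here is a minimal C sketch of the usual pattern on Windows, assuming a current GL context and the standard <GL/glext.h> header. Error handling is omitted, and the substring test is simplified (a robust check should tokenize the extension string):

#include <string.h>
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Entry point fetched from the driver at runtime. */
static PFNGLPROGRAMPARAMETERIEXTPROC pglProgramParameteriEXT = NULL;

int init_geometry_shader_ext(void)
{
    /* The driver advertises everything it supports in one big string. */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    if (ext == NULL || strstr(ext, "GL_EXT_geometry_shader4") == NULL)
        return 0; /* not supported by this card/driver combo */

    pglProgramParameteriEXT = (PFNGLPROGRAMPARAMETERIEXTPROC)
        wglGetProcAddress("glProgramParameteriEXT");

    return pglProgramParameteriEXT != NULL;
}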
This is a topic that has been beaten to death here on multiple occasions, and is guaranteed to end up as one step above a shouting match (no doubt accelerated by some of the inflammatory language used in the OP).
This is an NVidia-specific extension. That would bomb game development back to the days of supporting individual hardware pieces, before plug-and-play.

Of course, we are in a situation where there are only two big card companies, but who knows - that may change (Intel is in the process of making their graphics stronger, for instance). No matter how many there are, though, it is on principle the wrong direction to go to HAVE to support individual cards, IMO.
To expand on l0calh05t's comments -- OpenGL extensions are provided by the chipset/driver manufacturer as features are released in the hardware. The DirectX / OpenGL "compatible" stamp on the card simply means that a particular chipset/driver combo supports a minimum set of features.

In other words, there is nothing "special" about DirectX. It is the card itself that provides these features. So if the card provides a feature and DirectX exposes it, you can be all but certain that the OpenGL driver supplied by the manufacturer will expose it as well.

In fact, things work in the opposite direction from what you suggest. As new features are added, the OpenGL extensions are released immediately, with the card. DirectX, on the other hand, has to go through a version iteration before a feature is supported. Not that this really matters, though, since the hardware manufacturers generally release new hardware features at the same time as a new DirectX version; for example, decent shaders came with the 6800, which was released at roughly the same time as DirectX 9.0c. If 9.0c hadn't been released at that time, DirectX would automatically have lagged behind OpenGL, since OpenGL would have had the new shader model available by default (by way of extensions). Of course, OpenGL 2.0 came out around the same time too, to provide a standard for high-level shaders. It's really a useless argument.
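As a concrete illustration, you can ask the driver directly what a particular chipset/driver combo exposes. A minimal sketch, assuming a current GL 2.0 context (so glGetString does not return NULL) and <GL/glext.h> for the GL_SHADING_LANGUAGE_VERSION token:

#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Print what this card/driver combo exposes through OpenGL. */
void dump_gl_caps(void)
{
    printf("Vendor:     %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer:   %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL version: %s\n", (const char *)glGetString(GL_VERSION));
    printf("GLSL:       %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
    printf("Extensions: %s\n", (const char *)glGetString(GL_EXTENSIONS));
}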
Quote:Original post by Mercenarey
This is an NVidia-specific extension. That would bomb game development back to the days of supporting individual hardware pieces, before plug-and-play.

Of course, we are in a situation where there are only two big card companies, but who knows - that may change (Intel is in the process of making their graphics stronger, for instance). No matter how many there are, though, it is on principle the wrong direction to go to HAVE to support individual cards, IMO.


Truth be told, game developers use card-specific extensions as much as possible when there are performance benefits involved. It's called a "code path".

And if there is a hardware feature that one manufacturer provides, but the other does not, then really, how can you fault the one who provides it? That's a ridiculous argument considering that your original point was about progress.
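For what it's worth, a vendor code path is usually just a branch taken at startup. A minimal sketch, assuming a current GL context; the extension names below are real, but which ones a game keys on is up to the developer:

#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_GENERIC, PATH_NV, PATH_ATI } render_path;

/* Pick a code path based on what the driver advertises, falling
   back to a lowest-common-denominator path when in doubt. */
render_path choose_render_path(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    if (ext && strstr(ext, "GL_NV_vertex_program2"))
        return PATH_NV;   /* NVidia-specific fast path */
    if (ext && strstr(ext, "GL_ATI_fragment_shader"))
        return PATH_ATI;  /* ATI-specific fast path */
    return PATH_GENERIC;  /* portable ARB/EXT-only path */
}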
Quote:Original post by taby
Quote:Original post by Mercenarey
This is an NVidia-specific extension. That would bomb game development back to the days of supporting individual hardware pieces, before plug-and-play.

Of course, we are in a situation where there are only two big card companies, but who knows - that may change (Intel is in the process of making their graphics stronger, for instance). No matter how many there are, though, it is on principle the wrong direction to go to HAVE to support individual cards, IMO.


Truth be told, game developers use card-specific extensions as much as possible when there are performance benefits involved. It's called a "code path".

And if there is a hardware feature that one manufacturer provides, but the other does not, then really, how can you fault the one who provides it? That's a ridiculous argument considering that your original point was about progress.


But the problem is at another level than that: standardization. We have already been through the problems of no standardization. I still remember the 90's, when I had to pick my sound card driver from a list and then maybe set the IRQ and port on top of that (or something like that, it has been a few years, hehe).

This is the same kind of problem. Now you have to support individual cards through extensions. Who knows, in two years a new company comes on the market (or Intel grows strong on graphics), your old game can't support the new card, and now new players, or players that got new computers, can't play it until you come out with a patch (which may or may not happen; maybe you don't support the game anymore?).

There are so many problems with this approach, and we have already been through this evolution in computers once - I will be a sad panda if we are forced back in time on this front.

[Edited by - Mercenarey on July 15, 2008 10:29:22 AM]
Quote:Original post by Mercenarey
This is an NVidia-specific extension. That would bomb game development back to the days of supporting individual hardware pieces, before plug-and-play.


Actually, there are NV extensions in that document, but the most important ones aren't NV; they're EXT extensions, which are multi-vendor, not specific to NVidia.

If ATI has failed to provide a GL implementation that supports those EXT extensions, that doesn't make the extensions vendor-specific.

FTR, I'm happily using GL_EXT_gpu_shader4 in OpenGL to do bitwise operations for noise in a GLSL shader under Linux :)
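For the curious, here is roughly what that looks like; a minimal sketch only, with a classic integer-noise hash standing in for my actual shader. The #extension line is what unlocks the bitwise operators, which plain GLSL 1.20 reserves:

/* GLSL fragment shader fed to glShaderSource() as a C string.
   GL_EXT_gpu_shader4 enables the bitwise operators (^, <<, &),
   which base GLSL 1.20 reserves but does not implement. */
static const char *noise_frag_src =
    "#version 120\n"
    "#extension GL_EXT_gpu_shader4 : require\n"
    "uniform int seed;\n"
    "\n"
    "int hash(int x)\n"
    "{\n"
    "    x ^= x >> 13;  /* bitwise ops need the extension */\n"
    "    x = x * (x * x * 15731 + 789221) + 1376312589;\n"
    "    return x;\n"
    "}\n"
    "\n"
    "void main()\n"
    "{\n"
    "    int n = hash(int(gl_FragCoord.x) ^ (int(gl_FragCoord.y) << 8) ^ seed);\n"
    "    float v = float(n & 0x7fffffff) / 2147483647.0;\n"
    "    gl_FragColor = vec4(v, v, v, 1.0);\n"
    "}\n";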
Quote:Original post by HuntsMan
Quote:Original post by Mercenarey
This is an NVidia-specific extension. That would bomb game development back to the days of supporting individual hardware pieces, before plug-and-play.


Actually, there are NV extensions in that document, but the most important ones aren't NV; they're EXT extensions, which are multi-vendor, not specific to NVidia.

If ATI has failed to provide a GL implementation that supports those EXT extensions, that doesn't make the extensions vendor-specific.

FTR, I'm happily using GL_EXT_gpu_shader4 in OpenGL to do bitwise operations for noise in a GLSL shader under Linux :)


So who is in charge of these multi-vendor extensions? Is that an independent party?
If that is the case, then I would be pretty satisfied.
Quote:Original post by taby
Truth be told, game developers use card-specific extensions as much as possible when there are performance benefits involved. It's called a "code path".
Like hell they do. Most PC game developers are using DX now, and Carmack was the only guy who was ever really talking about vendor-specific code paths in the first place. I don't think ATI's R2VB support sees much use at all, and there's very, very little in the way of vendor-specific extensions these days, specifically because game developers want nothing to do with that BS.

