DirectX vs OpenGL

Started by
16 comments, last by Hodgman 6 years, 10 months ago

It is strictly insane to force OpenGL onto a console, because the console's hardware is made for a much more specific use than OpenGL caters to, and the vendor's SDK should know its hardware best.

Another point is that having a cross-platform graphics API is not actually as important as people think it is.

Graphics is after all only a small part of a game engine. A cross-platform graphics API won't get you sound, input, networking, memory management, file access, etc. Just using OpenGL won't magically make a game engine be cross-platform; you still have to deal with all of these other areas (and even something like SDL has platform-specific quirks once you go beyond trivial programs).

If you're targeting multiple platforms you're probably already using multiple graphics APIs anyway. If you're targeting PC, XBox and PS you're already using 3. Having to support multiple graphics APIs is a solved problem, so rather than ask "why wouldn't you use OpenGL?" you should be asking "why wouldn't you use the best API for each platform?" That's a much more interesting and useful question.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

As I searched, I found some interesting info on Wikipedia:

The PlayStation 4 features two graphics APIs, a low level API named GNM and a high level API named GNMX. Most people start with the GNMX API which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if users are used to platforms like Direct3D 11. The developers of The Crew put a lot of work into the move to the lower-level GNM, and in the process the tech team found out just how much work DirectX does in the background in terms of memory allocation and resource management.[8]

Another key area of the game is its programmable pixel shaders.[8] Sony's own PlayStation Shader Language (PSSL) was introduced on the PlayStation 4.[9] It has been suggested that the PlayStation Shader Language is very similar to the HLSL standard in DirectX 11, with just subtle differences that were eliminated for the most part through preprocessor macros.

Don't let GL's "portability" argument fool you.

I have to respectfully disagree here. Every device with a GPU supports OpenGL. DX is only supported on Windows and XBox. OpenGL wins portability by a country mile.

Most big games ship for XBox, Windows and Playstation - D3D works on two of those and GL works on just one :lol:
A FOSS developer might support Windows, MacOS and Linux, in which case GL is a must :wink:
Different developers will find GL mandatory or useless. You'll find a few blogs on the net that say that the Wii or the PS3, etc, support GL, or I've even seen one that said that the PS4 supports D3D... They don't. Consoles almost always have a custom API designed specifically for them.

The argument I was making, though, wasn't about OS/platform compatibility, but about hardware compatibility.
OpenGL is just a text document, not a library. AMD, NVidia, Intel, etc. all create libraries that (hopefully) behave according to that official text document. Any non-trivial game will discover that this isn't true, and will have to debug on all three desktop GPU vendors to ensure that it's actually portable. The situation is much worse for mobile developers, who usually have a library of dozens of devices plugged into a CI server in order to continually test their game on as much hardware as possible. Even if two different phones use the same GPU, they will often use different GL libraries if they come from different manufacturers :o Writing GL|ES code for a broad range of mobile devices is hell.
That's why you have to beware of GL's portability argument -- the spec is portable in theory, but in practice every implementation of the spec has a few unique behaviors. D3D wins hardware portability by a country mile, while GL / GL|ES / WebGL together have massive platform portability.
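In practice, engines that do ship portable GL code end up carrying per-vendor workaround tables keyed on the driver's identity. A minimal sketch of that pattern (the quirk flags here are hypothetical, invented for illustration; real engines keep tables like this for actual driver bugs they've hit):

```c
#include <string.h>

/* Hypothetical per-vendor workaround flags. */
typedef struct {
    int avoid_buffer_orphaning;   /* hypothetical quirk */
    int flush_after_fbo_change;   /* hypothetical quirk */
} DriverQuirks;

/* In a real app the string would come from glGetString(GL_VENDOR);
   it's a plain parameter here so the sketch stays self-contained. */
DriverQuirks quirks_for_vendor(const char *vendor)
{
    DriverQuirks q = {0, 0};
    if (strstr(vendor, "NVIDIA")) {
        /* no known quirks in this sketch */
    } else if (strstr(vendor, "ATI") || strstr(vendor, "AMD")) {
        q.flush_after_fbo_change = 1;   /* hypothetical */
    } else if (strstr(vendor, "Intel")) {
        q.avoid_buffer_orphaning = 1;   /* hypothetical */
    }
    return q;
}
```

The ugly part is that this table only grows: every new driver/device combination you test can add an entry, which is exactly the mobile CI-rack problem described above.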

D3D solves this by having a single implementation of the core that runs on all hardware, and then they actually test/validate/certify implementations of the HW specific parts. GL's central body doesn't do anything to keep their driver authors in line like this. If Khronos actually did the same thing as MS here, then GL could be king. Maybe with Vulkan they'll take a few steps forward this time...
So, while D3D only works on Windows, it works exactly the same on NV/AMD/Intel... Whereas in GL it's all too easy to accidentally make a game that only works on NVidia hardware... Personally for stability and ease of development, I would use Metal on Mac/iOS, D3D on Windows and console-APIs on consoles, which leaves GL for Linux, GL|ES for Android, and WebGL for web... and those are the three platforms that gamedevs hate to support :lol:

Another concern for console developers is that the Xbox and Playstation shading languages are very close to Windows HLSL (close enough that you can hide them with a few defines). We've managed to compile a massive shader codebase for Windows D3D9, Windows D3D11, Xbox360, XboxOne, PS3 and PS4... But compiling it for GL would've meant rewriting it, or using a HLSL->GLSL translator. The business people's projected sales for Windows were low enough for us to have to push really hard to get them to even release a Steam version, and Mac/Linux projections were like 1% of Windows sales so there's no way they were going to let us do a GL port :lol:
Vulkan is also a step forward here, as I'm going to be able to simply compile my HLSL code-base to SPIR-V.
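The "hide them with a few defines" trick above can be sketched like this. One shared snippet is written with HLSL-style intrinsic names, and a thin compat layer maps the other dialect's spellings onto them (or vice versa on the other platform). It's compiled here as plain C purely to show the mechanism; real shader codebases do the same thing inside the shader preprocessor, and the exact set of mappings needed is assumption-laden (`lerp`/`mix` is a real HLSL/GLSL naming difference, but the helper shown is only illustrative):

```c
/* HLSL-style spellings, implemented once. */
#define lerp(a, b, t) ((a) + ((b) - (a)) * (t))
#define saturate(x)   ((x) < 0.0f ? 0.0f : ((x) > 1.0f ? 1.0f : (x)))

/* GLSL spells linear interpolation "mix"; map it onto the HLSL name
   so the same snippet compiles under either dialect's spelling. */
#define mix(a, b, t)  lerp(a, b, t)

/* Shared "shader" snippet -- written once, compiles under both. */
float fade(float lo, float hi, float t)
{
    return lerp(lo, hi, saturate(t));
}
```

When the dialects differ only at this level, porting a large shader codebase is a header of defines rather than a rewrite, which is the point being made about HLSL vs the console shading languages.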

Part of the problem with OpenGL is that design by committee just doesn't work.

Let's say there's a new feature that is proposed for a new version of the spec. Vendors A and B would like to get it in, but vendor C's hardware doesn't support it.

Option 1 is that it goes in; vendor C are now in a position where they can't claim support for the new version but vendors A and B are happy.

Option 2 is that it doesn't go in; vendor C can now claim support for the new version but vendors A and B aren't happy.

Or we could do it the OpenGL way and massage the spec a little to allow vendor C to claim support for it but in a way that allows them to not actually support it, thereby keeping all of the vendors happy. But now of course the poor programmer has to be aware of edge cases such as this.

The end result is that in the absence of a central controlling body putting a stop to this kind of nonsense, vendor-specific behaviour bubbles up into the core specification (let's look for how many times the term "implementation dependent" occurs in the GL spec) and the programmers and end-users are the ones who take the hit.

And just in case you're interested - the example I gave wasn't a contrived example to make OpenGL look bad: it actually happened.

How to try and get this into the core? Seems too small to do as an optional subset. Called a straw poll contingent on someone coming up with a "caps-bit like" interface that would let Intel claim to support it. Rob then suggested that we could change the spec to allow supporting counters with zero bits and calling out in the spec that query functionality should not be used in this case.
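That "zero bits" escape hatch forces a defensive pattern onto every application. In real GL you'd fill `counter_bits` via `glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &counter_bits)`, and a conforming implementation may legally report 0, meaning the query interface "exists" but can never return a useful result. A minimal sketch of the resulting app-side code (the fallback logic is hypothetical, just to show the shape):

```c
/* A driver that took the escape hatch reports 0 counter bits:
   occlusion queries are "supported" on paper but useless in practice. */
int occlusion_queries_usable(int counter_bits)
{
    return counter_bits > 0;
}

/* Callers then need a fallback path for that hardware -- here a
   conservative guess stands in for whatever the engine would do. */
int visible_samples(int counter_bits, int query_result, int conservative_guess)
{
    return occlusion_queries_usable(counter_bits) ? query_result
                                                  : conservative_guess;
}
```

So the feature is "in core", yet the programmer still carries a caps-bit check plus a second code path, which is exactly the edge-case burden described above.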


Let me ask another question. There are some other available APIs, like Mantle and Vulkan. What about them? Are they replacements or additions? Is OpenGL being deprecated because of Vulkan? Is coding much the same or completely different? It's said Mantle has less overhead than both OpenGL and DirectX.

Can you give some explanation?

As I searched a bit: Mantle was implemented by AMD, but after that AMD helped Khronos make Vulkan, a low-overhead API with more control over the graphics card. It's said that APIs like GNM, GNMX and Vulkan have a coding style more like DX than OpenGL, and that Vulkan supports many platforms, like Windows, Linux and Android. For someone in the middle of learning graphics programming, it seems better to concentrate more on DirectX. Am I right?

Microsoft had said that D3D11 was "mature" and was hinting that it could be the final version of the API. Khronos were similarly hinting about OpenGL4... But gamedevs wanted a PC API similar to the console APIs (e.g. GCM, GNM), so AMD created Mantle as an example of what we could have, and then gave it to Khronos for free to force them to act (if you don't make a new API, we will replace you)... and they did, adopting Mantle and rebranding it as Vulkan faster than any of their previous API designs. Meanwhile, MS needed a console-style API for the XboxOne, so AMD worked closely with them in the design of Direct3D12, which is very similar to Vulkan/Mantle.

So now we have:
D3D9 / GL2 -- legacy crap.
D3D11-feature-level-10 / GL3 -- legacy.
D3D11 / GL4 -- current "high level" APIs.
D3D12 / Vulkan -- current "low level" APIs.

D3D11 / GL will still stick around because they're FAR easier to use than the new low-level APIs.
D3D12 / Vulkan move a lot of responsibilities out of the driver and into your game engine, so to use these APIs, you basically have to be able to write a GPU driver too :o

Khronos have hinted that they will develop a GL5 and a Vulkan2 in the future, as each is targeted at a different set of users.

This topic is closed to new replies.
