DirectX vs OpenGL?


There are no "open source extensions". OpenGL has absolutely nothing to do with open source.
[/quote]

I was thinking of SDL, GLFW and so on. Would "many open source libs" be more understandable?
D3D is not 'more difficult to understand' either... if anything OpenGL is the harder of the two, as there are no central docs, many, many outdated tutorials, and hitting 'the fast path' is pretty damned hard unless you know what you are doing.

D3D doesn't have a 'slow path' for things like submission of data, etc., so it's easier to produce things which perform well.

[quote name='shdpl' timestamp='1313878028' post='4851735']
[quote name='Gl_Terminator' timestamp='1313512841' post='4849912']
Use DirectX. I was an OpenGL fan, but I finally gave up; DirectX has a lot more support, and complex stuff is handled more easily.

Could you be more specific please?
[/quote]

Heheh, have you even tried to enable full-screen anti-aliasing with OpenGL, or tried to draw 2D content, or use VBOs, or, better, have you even tried to write your own GLSL shader? Dude, I am telling you, OpenGL in the end is more difficult than DX, and I found that out after making my own game in OpenGL and then porting it to DX.
[/quote]

glEnable(GL_MULTISAMPLE) to enable full screen anti-aliasing
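
One caveat worth adding: glEnable(GL_MULTISAMPLE) only takes effect if the framebuffer was created with multisample buffers in the first place, so you also have to request them at window/context creation. A minimal sketch using GLFW (already mentioned in this thread; the hint is GLFW_SAMPLES in GLFW 3, GLFW_FSAA_SAMPLES in the 2.x series):
[code]
#include <GLFW/glfw3.h>

int main(void)
{
    glfwInit();
    glfwWindowHint(GLFW_SAMPLES, 4); // request a 4x multisampled framebuffer
    GLFWwindow* win = glfwCreateWindow(800, 600, "MSAA", NULL, NULL);
    glfwMakeContextCurrent(win);

    glEnable(GL_MULTISAMPLE);        // then turn multisample rasterization on

    /* ... render loop ... */

    glfwTerminate();
    return 0;
}
[/code]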


in Direct3D you have to use vertex buffers too
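
And a VBO on the GL side is only a handful of calls, hardly harder than a D3D vertex buffer. A rough sketch (assumes a context is current; error checking omitted):
[code]
// Three 2D vertices for a triangle.
GLfloat verts[] = { -0.5f, -0.5f,   0.5f, -0.5f,   0.0f, 0.5f };

GLuint vbo;
glGenBuffers(1, &vbo);                      // create the buffer object
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

glEnableClientState(GL_VERTEX_ARRAY);       // GL 1.5-era pointer setup
glVertexPointer(2, GL_FLOAT, 0, (void*)0);  // data comes from the bound VBO
glDrawArrays(GL_TRIANGLES, 0, 3);
[/code]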

GLSL isn't any harder/easier than HLSL




OpenGL fanboy.
GLSL isn't any harder/easier than HLSL

The shading languages themselves are quite equivalent, but the infrastructure you need to build in your program to use them is somewhat more involved with OpenGL. D3D can be just one call to D3DXCreateEffectFromFile and you're ready to start drawing, compared to OpenGL's load (and you must write the loader yourself), compile, attach, link, validate routine. Some kind of saner shader-management library is sorely needed for OpenGL (and let's not make it GPL, as proprietary programs may want to use it too).
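
To make the comparison concrete, here is roughly what each side looks like. A sketch only: loadFile() is a hypothetical helper you'd have to write yourself, and the glGetShaderiv/glGetProgramiv error checks are elided:
[code]
// OpenGL: load, compile, attach, link, validate
const char* vsSrc = loadFile("shader.vert");   // hypothetical loader
const char* fsSrc = loadFile("shader.frag");

GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vsSrc, NULL);
glCompileShader(vs);                           // check GL_COMPILE_STATUS here

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fsSrc, NULL);
glCompileShader(fs);                           // ...and here

GLuint prog = glCreateProgram();
glAttachShader(prog, vs);
glAttachShader(prog, fs);
glLinkProgram(prog);                           // check GL_LINK_STATUS
glValidateProgram(prog);                       // check GL_VALIDATE_STATUS

// Direct3D 9: one call and you're ready to draw
ID3DXEffect* fx = NULL;
D3DXCreateEffectFromFile(device, "shader.fx", NULL, NULL, 0, NULL, &fx, NULL);
[/code]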

Another downside is that with OpenGL each driver writer must provide their own shader compiler, whereas with D3D there is a single shader compiler provided by Microsoft. That greatly enhances the consistency and robustness of compiled HLSL shaders. No driver vendor can screw up (or put in dubious optimizations), everyone's shaders get compiled the same way by the same compiler, and the world is a happier place.

It also helps that HLSL has been quite stable for longer periods of time, giving it a good chance for bugs to shake out through wide usage. SM3 HLSL is utterly rock-solid for example. GLSL by contrast has been a bit of a moving target recently, with many upgrades, incompatibilities between versions, and more exciting ways to shoot yourself in the foot. Not a problem if you're only running on your own hardware, you just code to what your hardware can and can't do. But as soon as you need to run on other people's hardware, and if those other people are in different countries so you can't get at their machines for a debugging session, you really do appreciate the value of stability and predictability.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

The shading languages themselves are quite equivalent, but the infrastructure you need to build in your program to use them is somewhat more involved with OpenGL. D3D can be just one call to D3DXCreateEffectFromFile and you're ready to start drawing, compared to OpenGL's load (and you must write the loader yourself), compile, attach, link, validate routine. Some kind of saner shader-management library is sorely needed for OpenGL (and let's not make it GPL, as proprietary programs may want to use it too).
As a pro/con argument, this just boils down to "D3D has the D3DX utility library, while GL has no official utility library". This has quite a big impact on usability on a smaller scale (e.g. when learning), but less of an impact as you scale up to well-manned projects. Don't get me wrong though, it's always nice to have utility libraries available!

The de-facto equivalent of D3DX-Effects for GL is CgFX. These two standards are close enough that you can write a single file that works under both systems (D3DX on Microsoft, and CgFX on non-Microsoft platforms).
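
For illustration, loading the same hypothetical scene.fx through both runtimes looks something like this (sketch only; error handling omitted):
[code]
// D3DX on Microsoft platforms:
ID3DXEffect* fx = NULL;
D3DXCreateEffectFromFile(device, "scene.fx", NULL, NULL, 0, NULL, &fx, NULL);

// CgFX everywhere else:
CGcontext ctx = cgCreateContext();
cgGLRegisterStates(ctx);                  // hook CgFX state assignments to GL
CGeffect effect = cgCreateEffectFromFile(ctx, "scene.fx", NULL);
[/code]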
Another downside is that with OpenGL each driver writer must provide their own shader compiler, whereas with D3D there is a single shader compiler provided by Microsoft. That greatly enhances the consistency and robustness of compiled HLSL shaders. No driver vendor can screw up (or put in dubious optimizations), everyone's shaders get compiled the same way by the same compiler, and the world is a happier place.[/quote]
Yeah, I've been burnt by this a lot (I'm looking at you, nVidia...) -- where an invalid shader compiles without error under one driver, but fails under a stricter driver, which just leads to works-on-my-machine syndrome...

Too much to quote it all.

I understand how it works. As I said, I will do more serious (real) benchmarking later, but the results will not show OpenGL as the winner.
I have models purchased from Turbo Squid for hundreds of dollars with several million triangles, and I have low-end machines which run them at only 20 FPS (GPU bound), etc.
I have done lightweight scenarios and heavyweight.

The thing about my benchmarks is that I wanted them to be using the same model viewed from the same position so that I could compare each change I made to the previous state of the engine.
I wasn't intending it to be a real benchmark; I was only trying to see which things improved performance, and to get a general idea of by how much. The numbers are meant to be compared to the previous numbers above them, not to each other.


The FPS I posted were high.
I posted those results from my home machine, which has two Radeon HD 5870's in CrossFire and a Core i7 975. Frankly, I have trouble getting anything to run below many thousands of frames per second. It may seem skewed, but that is why I also test on my low-end machines, which get around 20 FPS for the same objects. The results followed the same pattern, just with lower numbers.

My only point is that the presentation of my numbers was off; but in all the tests I have run, on all machines, there has been no case in which OpenGL was faster than Direct3D.



[color="#1C2837"]
Any amendments, completions or mistakes I made?[/quote]
[color="#1c2837"]Anything about one being easier to learn is heavily subjective. I would personally disagree heavily that OpenGL is easier to learn. It uses less-common terminology for things including pixels (which are “fragments” in OpenGL), is difficult to do “correctly” thanks to mostly outdated tutorials, etc. There are multiple ways to do everything (immediate mode, vertex buffers, VBO’s) so it is more difficult to learn “the right way”.
[color="#1c2837"]Immediate mode makes it easier to get a result on the screen, but if you are trying to use OpenGL properly, the level of difficulty is purely subjective.



[color="#1c2837"][color="#000000"]
Yeah I've been burnt by this a lot (I'm looking at you, nVidia...) -- where an invalid shader compiles without error under one driver, but fails under a stricter driver, which just leads to works on my machine syndrome...[color="#1c2837"][color="#000000"]


My low-end NVidia cards were unable to set boolean uniforms. Literally. I even have a test-case that does nothing but set a boolean which changes the color of a triangle if set. If the OpenGL implementation works correctly it will be red, otherwise blue. Shows red on high-end GeForces and all ATI cards, shows blue on GeForce 8400, GeForce 8600, and GeForce 8800.
Wow. Unable to set a bool.
So in my new engine I will not be using boolean uniforms unless I cannot avoid it.
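
For reference, the kind of test-case described is tiny. A sketch, assuming a linked program prog whose fragment shader declares uniform bool uFlag and outputs red when it is set, blue otherwise:
[code]
glUseProgram(prog);          // uniforms are set on the active program in GL 2.x
GLint loc = glGetUniformLocation(prog, "uFlag");
glUniform1i(loc, GL_TRUE);   // bool uniforms go through the integer setters

// Workaround when bool uniforms silently fail: declare the uniform as
// int (or float) in the shader and compare against zero instead.
[/code]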



L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid


Heheh, have you even tried to enable full-screen anti-aliasing with OpenGL, or tried to draw 2D content, or use VBOs, or, better, have you even tried to write your own GLSL shader? Dude, I am telling you, OpenGL in the end is more difficult than DX, and I found that out after making my own game in OpenGL and then porting it to DX.
[font="arial, verdana, tahoma, sans-serif"]
Don't take my question too personal mate because we're having a nice objective discussion here. If You're stating an opinion, please add few examples because i believe this post isn't about counting how many people vote for ogl or dx. Probably every 'vs' question brings other: 'in what conditions'.

As for your question: yes, I have, but I haven't even played with DirectX yet. And yes, I'm new to gamedev (although not to IT), and I'm very curious how it compares to Direct3D.


Another downside is that with OpenGL each driver writer must provide their own shader compiler, whereas with D3D there is a single shader compiler provided by Microsoft. That greatly enhances the consistency and robustness of compiled HLSL shaders. No driver vendor can screw up (or put in dubious optimizations), everyone's shaders get compiled the same way by the same compiler, and the world is a happier place.



I've read in a book that the consortium has provided a reference compiler front-end for GLSL*, so shaders could be validated against this reference implementation and then just run through the vendor-specific one. Is this true, and isn't that a sufficient step to mitigate this kind of problem?


Furthermore, I liked the idea of giving vendors more chances to optimize (for the platform the code actually runs on). Is this bad in practice, or do the problems come from weak competition and low interest in providing high-quality OpenGL implementations?


* EDIT
I've read in a book that the consortium has provided a reference compiler front-end for GLSL*, so shaders could be validated against this reference implementation and then just run through the vendor-specific one. Is this true, and isn't that a sufficient step to mitigate this kind of problem?

Furthermore, I liked the idea of giving vendors more chances to optimize (for the platform the code actually runs on). Is this bad in practice, or do the problems come from weak competition and low interest in providing high-quality OpenGL implementations?
Different drivers still accept wildly different forms of GLSL code.
For example, nVidia's drivers actually accept HLSL and Cg keywords, which don't exist at all in the GLSL spec! This is actually a great marketing tactic, because (invalid) shaders that do run on nVidia cards fail to run on ATI cards, which makes it look like ATI is buggy.
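
A contrived example of the kind of shader source that slips through (hypothetical, but representative of the Cg-isms nVidia's compiler tolerates):
[code]
// Not valid GLSL, yet a lax compiler accepts it:
const char* badFragmentShader =
    "void main() {\n"
    "    float4 c = float4(1.0, 0.5, 0.0, 1.0);\n" // 'float4' is Cg/HLSL; GLSL says 'vec4'
    "    gl_FragColor = saturate(c);\n"            // 'saturate' is Cg/HLSL; GLSL says clamp()
    "}\n";
[/code]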

But yes, if you validated your code using a neutral, known-compliant, front-end, you could likely avoid many of these problems.

Regarding vendor optimisation of shaders -- this happens with both GLSL and HLSL. With HLSL, D3D compiles your code down into an assembly language; however, this assembly language cannot run natively on GPUs. The GPU vendors then have to compile this assembly again into their real, native assembly languages.
The HLSL compiler does all of the heavyweight, general purpose optimisations on your code (such as removing dead code, simplifying expressions, etc), and then the drivers perform hardware-specific optimisations, like instruction re-scheduling.
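
A sketch of that two-stage path with D3D9-era names (src/srcLen hold the HLSL source; error handling elided):
[code]
// Stage 1: Microsoft's compiler turns HLSL source into D3D assembly/bytecode,
// doing the heavyweight, general-purpose optimisations along the way.
LPD3DXBUFFER code = NULL, errors = NULL;
D3DXCompileShader(src, srcLen, NULL, NULL, "main", "vs_3_0",
                  0, &code, &errors, NULL);

// Stage 2: the driver's back-end translates that bytecode into the GPU's
// real native instructions when the shader object is created.
IDirect3DVertexShader9* shader = NULL;
device->CreateVertexShader((const DWORD*)code->GetBufferPointer(), &shader);
[/code]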

With GLSL, the quality of the first type of optimisation (e.g. dead-code removal, etc.) varies a lot by platform. To get around this, the Unity team actually compile their GLSL code to assembly using a standard compiler, and then de-compile the assembly back into GLSL again!


If you're authoring a very popular game (one likely to be used for benchmarks), then you can even expect the folks at ATI/nVidia to secretly extract the shader code from your game, re-write it by hand to be more optimal, and then ship their shader replacements in their drivers! At runtime, they detect your game, detect your shader, and instead load their hand-tuned replacement shaders to get a better benchmark score from reviewers.


My low-end NVidia cards were unable to set boolean uniforms. Literally. I even have a test-case that does nothing but set a boolean which changes the color of a triangle if set. If the OpenGL implementation works correctly it will be red, otherwise blue. Shows red on high-end GeForces and all ATI cards, shows blue on GeForce 8400, GeForce 8600, and GeForce 8800.
Wow. Unable to set a bool.
So in my new engine I will not be using boolean uniforms unless I cannot avoid it.
This is my biggest gripe with GL -- too many potential silent failures.
For example, if I compile a shader that exceeds the GPU's instruction limit, it succeeds... but then runs on the CPU at 0.1 Hz!
If I compile a shader that uses an unsupported hardware feature (e.g. array indexing in the pixel unit), it succeeds... but then runs on the CPU at 0.1 Hz!
Then there's cases where you try to use a feature, and it still runs on the GPU, but just does nothing at all -- like your bool example.

The problem with these cases is there's no way to know whether the GPU is doing what you're asking of it, except by setting up a test-case, and reading back pixel colours... or setting up a test-case and checking the frame-time to guess whether it ran in hardware or software :/
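
For what it's worth, that readback test is only a few lines. A sketch of checking the centre pixel after drawing a test triangle (red = the feature worked, blue = silent failure, per the bool example above):
[code]
unsigned char px[4];
glReadPixels(width / 2, height / 2, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);

if (px[0] > 200 && px[2] < 50) {
    // red: the bool uniform took effect, keep this code path
} else {
    // blue: silent failure, fall back to a workaround
}
[/code]
But of course you shouldn't have to resort to this just to find out whether the driver did what you asked.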
I've got some questions:

Is it possible in DirectX 11 to set up a program without setting up the whole graphics pipeline (e.g. vertex, hull, domain, and pixel shaders)?
I didn't find a way to get a triangle on screen without it. (This might be an objective point in the "first contact is easier" discussion.)

Furthermore: many posts here say that GLSL has a lot of "potential silent failures". How do existing programs (professional OpenGL applications as well as OpenGL games) handle this? I have never had problems running such programs on my computer.
Or is this just because some drivers accept more than the standard allows, while others seem to have bugs when they stick to the standard? (Whatever the standard is...)

Thanks ;)

[quote name='Gl_Terminator' timestamp='1314024529' post='4852316']
[quote name='shdpl' timestamp='1313878028' post='4851735']
[quote name='Gl_Terminator' timestamp='1313512841' post='4849912']
Use DirectX. I was an OpenGL fan, but I finally gave up; DirectX has a lot more support, and complex stuff is handled more easily.

Could you be more specific please?
[/quote]

Heheh, have you even tried to enable full-screen anti-aliasing with OpenGL, or tried to draw 2D content, or use VBOs, or, better, have you even tried to write your own GLSL shader? Dude, I am telling you, OpenGL in the end is more difficult than DX, and I found that out after making my own game in OpenGL and then porting it to DX.
[/quote]

glEnable(GL_MULTISAMPLE) to enable full screen anti-aliasing


in Direct3D you have to use vertex buffers too

GLSL isn't any harder/easier than HLSL





[/quote]
OK, then tell me an equivalent to DirectDraw (2D) in OpenGL.
