Why DirectX? (Discussion)

56 comments, last by Facehat 15 years, 9 months ago
Quote:Original post by godmodder
But then again, the people who are professionals know (or at least are supposed to know) all the details about math and such, so they wouldn't have much trouble programming their own library.


Completely irrelevant. A good engineer doesn't waste his/her time implementing something that's already available to them, even if they know how to do it.

Quote:Original post by godmodder
I noticed a lot of people are afraid to use OpenGL because of the lack of a math library.


And why shouldn't they be? Professionals don't want to have their millions of development dollars pinned on libraries that aren't thoroughly tested. Hobbyists don't have the time to muck around with SSE and assembly. Beginners are just learning the high-level concepts of graphics; having to implement vector math and texture-loading functions gets in the way of that.
I wrote a list of reasons why developing with OpenGL is a miserable experience.

Additionally, OpenGL drivers have historically had problems on Windows -- until recently the ATI implementation was quite poor, and the Intel implementation has always been pretty bad. Consider that Blizzard games (WoW etc.) default to Direct3D on Windows, even though they have a perfectly good OpenGL renderer available.

In short, there are two basic reasons people choose D3D over OGL. First, its drivers on Windows are far better. Second, the API/library itself is just better designed, better written, and easier to work with. Oh, and let's not forget how badly OpenGL 3.0 was botched.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Interesting topic. (Please note that I know almost nothing about D3D and the D3D API... just in case I say something stupid.)

Quote:Original post by Daaark
I don't care about Mac and Linux. Doesn't bother me that my stuff doesn't run there. I don't own a mac and I never will, and I'll probably never install another Linux distro again, because it's useless to me as a desktop OS. So what is so bad about being locked to windows? Also, the flavor of DX that I have been using lately also works on the Zune and the 360.

Personally, I stopped using OpenGL because I got fed up with it. It falls short in many areas.

When I develop, I do care about the users of my work, which means caring about what they may be using (or prefer to use), not what I prefer to use. I'm a Linux user and, not surprisingly, I know many other Linux and Mac users. It always bothers me when people answer the question of why they didn't develop cross-platform with, "Because I use Windows." (Sometimes there are good or strong arguments for why they didn't, but that one certainly isn't.)

There are wrappers for OpenGL. I'm currently developing a small 2D engine that's module-based, and the default module I ship with it uses OpenGL. (If the engine gets far enough, I want to attempt a DirectX module for the Windows platform, actually!) For this, I've had to wrap some of the API, which I actually prefer because I can define my own interfaces. C APIs are very flexible; the sky's the limit in how you want to wrap them. I think coercing DX's already object-oriented API into a module is actually going to be more of a challenge. I'll also have to coerce (convert) my math types (vectors, matrices, etc.) into DX's. Yuck.
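
To give a rough idea of what I mean by defining my own interfaces, here's a bare-bones sketch (the IRenderer/GLRenderer names are just placeholders for illustration, not my engine's actual API):

    #include <GL/gl.h>

    // The rest of the engine only ever sees this interface.
    class IRenderer
    {
    public:
        virtual ~IRenderer() {}
        virtual void Clear(float r, float g, float b) = 0;
        virtual void DrawQuad(float x, float y, float w, float h) = 0;
    };

    // The default module: an OpenGL-backed implementation.
    class GLRenderer : public IRenderer
    {
    public:
        void Clear(float r, float g, float b)
        {
            glClearColor(r, g, b, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);
        }

        void DrawQuad(float x, float y, float w, float h)
        {
            // Immediate mode keeps the sketch short; a real module would batch.
            glBegin(GL_QUADS);
            glVertex2f(x, y);
            glVertex2f(x + w, y);
            glVertex2f(x + w, y + h);
            glVertex2f(x, y + h);
            glEnd();
        }
    };

A DirectX module would then implement the same IRenderer on top of IDirect3DDevice9, which is exactly where squeezing an already object-oriented API under my own interfaces gets more awkward.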

It would be great if D3D magically became a cross-platform standard someday. Too bad that will probably never happen.

I think an interesting project would be to wrap the entire OpenGL API in a very D3D-like object-oriented interface, without exposing a single raw OpenGL function as-is.
Thank you all for participating.

Quote:Original post by MJP
Quote:Original post by godmodder
But then again, the people who are professionals know (or at least are supposed to know) all the details about math and such, so they wouldn't have much trouble programming their own library.


Completely irrelevant. A good engineer doesn't waste his/her time implementing something that's already available to them, even if they know how to do it.
I agree; saving time and money like that can be a real advantage of Direct3D.

Quote:Original post by Evil Steve
Nitpick: It's Direct3D. DirectX does a lot more than just graphics, whereas OpenGL just handles graphics rendering.
You mean DirectInput and DirectSound?

Quote:
If you require the new extensions, you lose compatibility with older cards in OpenGL.
With Direct3D we know that D3D9 graphics cards support at least a specific set of features, so it's easy to use those features and to inform customers about the requirements.
But with OpenGL extensions that require a certain OpenGL version, what should we do?
I've never seen a game demand a specific OpenGL version in order to be played. Why is that?

Unfortunately my OpenGL knowledge is not vast; could anyone clarify this?
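
From what I've read, the runtime check looks roughly like this, though I may be off (a generic sketch using plain OpenGL calls, no extension loader assumed):

    #include <GL/gl.h>
    #include <cstdio>
    #include <cstring>

    // Requires a current OpenGL context. A real check should tokenize the
    // extension string rather than use strstr, to avoid prefix matches.
    bool HasExtension(const char* name)
    {
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts != 0 && std::strstr(exts, name) != 0;
    }

    void CheckRequirements()
    {
        const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
        std::printf("OpenGL version string: %s\n", version);

        if (!HasExtension("GL_ARB_vertex_buffer_object"))
            std::printf("VBOs not supported -- fall back or refuse to run.\n");
    }

Is that roughly what games do at startup, or is there a better way to state requirements up front the way a D3D version number does?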
Quote:Original post by Nima
Quote:Original post by Evil Steve
Nitpick: It's Direct3D. DirectX does a lot more than just graphics, whereas OpenGL just handles graphics rendering.
You mean DirectInput and DirectSound?
Yes. DirectShow also exists, although it's now part of the Platform SDK. DirectInput's usage is discouraged, and DirectPlay is deprecated (Winsock should be used instead). There's also DirectDraw, although that's no longer used either (unless you're doing overlay stuff).
Only thing I can say is: Create an abstract layer, and do both.
Quote:Original post by Daivuk
Only thing I can say is: Create an abstract layer, and do both.


Wow, now that's some seriously bad advice you're giving most people there, fella. Abstracting away the graphics API used to be the way to approach developing games in the '90s, when certain cards had horrible drivers for Direct3D or OpenGL. We're far from that now, and there's typically no benefit in letting the user choose an API for a game.

By creating an abstraction layer you're effectively doing the work twice without gaining anything, because Direct3D and OpenGL are so similar now, both feature- and performance-wise.

Or do you have any specific reasons to back up your suggestion?
An abstraction layer can provide cross-platform compatibility. Not only that, but there's no reason to implement that layer with more than one API until there's some demand or other reason to do it, so the cost isn't too great (creating the abstraction in the first place takes time and effort, but you don't need to implement it twice... at least, not unnecessarily).

For example, create a game, implement the graphics layer with Direct3D, ship it, and when enough of the typical "We Want a Linux/Mac Client" threads appear, implement the layer with OpenGL.
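
A rough sketch of the shape I have in mind (all names here are made up for illustration):

    #include <windows.h>
    #include <d3d9.h>

    struct Color { float r, g, b, a; };

    // The game only ever talks to this interface.
    class IGraphicsDevice
    {
    public:
        virtual ~IGraphicsDevice() {}
        virtual void BeginFrame() = 0;
        virtual void Clear(const Color& c) = 0;
        virtual void EndFrame() = 0;
    };

    // The one backend that actually exists at ship time.
    class D3D9Device : public IGraphicsDevice
    {
    public:
        explicit D3D9Device(IDirect3DDevice9* dev) : m_dev(dev) {}

        void BeginFrame() { m_dev->BeginScene(); }

        void Clear(const Color& c)
        {
            m_dev->Clear(0, 0, D3DCLEAR_TARGET,
                         D3DCOLOR_COLORVALUE(c.r, c.g, c.b, c.a), 1.0f, 0);
        }

        void EndFrame()
        {
            m_dev->EndScene();
            m_dev->Present(0, 0, 0, 0);
        }

    private:
        IDirect3DDevice9* m_dev;
    };

    // class GLDevice : public IGraphicsDevice { ... };
    // Written only when the Linux/Mac demand actually shows up.

Until that OpenGL version exists, the abstraction costs you a virtual call and a bit of design discipline, not a second renderer.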
Cross-platform application development is more than providing a pluggable rendering backend. Supplying an OpenGL backend won't magically make your app work on Linux/Mac.
Of course not. In my example I only mentioned a graphics layer since we're talking about Direct3D and OpenGL. The entire game could sit atop a layer of abstraction (such as an abstracted game engine). This includes input, audio, etc.

This topic is closed to new replies.
