

API Wars horror - Will it matter?


29 replies to this topic

#21 mhagain   Crossbones+   -  Reputation: 9467


Posted 22 July 2013 - 12:21 PM

Also if a game company wants to use some obscure new DX function they can call up somebody at Microsoft and have a programmer leased to them until the issue is solved.

 

If the drivers from a major hardware company are having problems with a new DirectX API, you can be sure Microsoft will throw money at the problem to make it go away.

OpenGL doesn't have this vastly wealthy champion to ride in and fix problems in a timely fashion.

 

Valve would disagree with you:

 

 

We’ve been working with NVIDIA, AMD, and Intel to improve graphic driver performance on Linux. They have all been great to work with and have been very committed to having engineers on-site working with our engineers, carefully analyzing the data we see. We have had very rapid turnaround on any bugs we find and it has been invaluable to have people who understand the game, the renderer, the driver, and the hardware working alongside us when attacking these performance issues.

 

(source: http://blogs.valvesoftware.com/linux/faster-zombies/)

 

Seriously, there is no big MS anti-OpenGL thing going on. According to Alex St John (who should know, and who has been out of MS long enough to be able to say what he wants), the sole reason for D3D was not shenanigans but internal conflict: the Windows 95 team wanted OpenGL, but the Windows NT team held the license. Microsoft wanted Windows NT to take on competitors in the workstation market, and NT needed OpenGL in order to do that; Windows 95 having OpenGL would undermine NT's position, so the NT team didn't let the 95 guys have the license. Without the license, the only solution was a new API. (Note that Windows 95 didn't properly get OpenGL until OSR2.)

 

Believe that or not, as you wish; personally it makes more sense to me than MS inventing a new (and almost universally reviled) API just for the hell of it, especially considering that they really needed OpenGL for that workstation market. (If you've ever dealt with MS you'll know that they're fragmented enough to add to the plausibility of this story; the fact that they're implementing WebGL in IE11 (admittedly layered on D3D11) is just further evidence.)

 

Regarding the state of OpenGL, reading issue #9 from the new GL_ARB_buffer_storage extension should give you enough of a feel for exactly what's wrong:

 

 

9) What is the meaning of CLIENT_STORAGE_BIT? Is it one of those silly hint things?

DISCUSSION: Unfortunately, yes, it is. For some platforms, such as UMA systems, it's irrelevant and all memory is both server and client accessible. The issue is, that on some platforms and for certain combinations of flags, there may be multiple regions of memory that can satisfy the request (visible to both server and client and coherent to both, for example), but may have substantially different performance characteristics for access from either. This bit essentially serves as a hint to say that an application will access the store more frequently from the client than from the server. In practice, applications will still get it wrong (like setting it all the time or never setting it at all, for example), implementations will still have to second guess applications and end up full of heuristics to figure out where to put data and gobs of code to move things around based on what applications do, and eventually it'll make no difference whether applications set it or not. But hey, we tried.

 

This is the kind of thing that makes people trying to seriously use the API wince.  D3D has no problems specifying explicit behaviour, and the hardware vendors have to live with it; the end result is that you as a developer get none of that nonsense; when you ask D3D to "give me a dynamic buffer, make it write-only, make it fast" you get a dynamic buffer that's write-only and fast.
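To make the contrast concrete, here is a hedged sketch of the two approaches side by side. This is a non-runnable fragment, not a complete program: it assumes a live GL 4.4 context, a valid `ID3D11Device *device`, and a `size` variable, all of which are my additions for illustration.

```cpp
// GL_ARB_buffer_storage: CLIENT_STORAGE_BIT is merely a hint; the driver
// is free to ignore it and place the storage wherever its heuristics like.
GLuint buf;
glGenBuffers(1, &buf);
glBindBuffer(GL_ARRAY_BUFFER, buf);
glBufferStorage(GL_ARRAY_BUFFER, size, nullptr,
                GL_MAP_WRITE_BIT | GL_CLIENT_STORAGE_BIT);

// D3D11: usage and CPU access are an explicit contract, not hints.
D3D11_BUFFER_DESC desc = {};
desc.ByteWidth      = size;
desc.Usage          = D3D11_USAGE_DYNAMIC;     // "make it fast"
desc.BindFlags      = D3D11_BIND_VERTEX_BUFFER;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;  // "make it write-only"
ID3D11Buffer *d3dBuf = nullptr;
device->CreateBuffer(&desc, nullptr, &d3dBuf);
```

The D3D flags above are specified behaviour the vendor must honour; the GL bit is, as the extension itself admits, something implementations will second-guess.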

 

You want another example?  Did you know that the reason why glGetQueryivARB (GL_QUERY_COUNTER_BITS_ARB) was allowed to return 0 was so that Intel could claim GL1.5 support in hardware that didn't support occlusion queries?  This kind of stuff is just nuts.
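A sketch of the defensive check that loophole forces on every portable renderer (assumes a current GL context; `glGetQueryiv` is the core-profile equivalent of the ARB entry point, and `useOcclusionQueries` is a hypothetical engine flag):

```cpp
// Spec-legal escape hatch: a driver may advertise GL 1.5 yet report
// zero counter bits, meaning occlusion queries can never return a
// useful sample count. Applications must probe and fall back.
GLint counterBits = 0;
glGetQueryiv(GL_SAMPLES_PASSED, GL_QUERY_COUNTER_BITS, &counterBits);
if (counterBits == 0) {
    useOcclusionQueries = false;  // take another culling path
}
```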

 

Yes, you're right about MS's recent model of tying a D3D version to an OS version, which is one thing that could make OpenGL appear a viable option again, provided that the ARB don't screw things up.  Again.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.



#22 BornToCode   Members   -  Reputation: 1013


Posted 22 July 2013 - 04:06 PM

Most of the APIs are the same, but they do have some differences; for example, there is no such thing as Pixel Buffer Objects in DirectX. Just wanted to throw my two cents in there. But at the end of the day, if you want to write a GL API that looks very similar to DirectX 11, it is not impossible, as it's something I am currently doing right now.


Edited by BornToCode, 22 July 2013 - 04:07 PM.


#23 mhagain   Crossbones+   -  Reputation: 9467


Posted 22 July 2013 - 06:20 PM

Most of the APIs are the same, but they do have some differences; for example, there is no such thing as Pixel Buffer Objects in DirectX.

 

Because it doesn't need them.  They solve a problem in GL that D3D doesn't have (particularly D3D10+ where CopySubresourceRegion and UpdateSubresource are explicitly specified as asynchronous).  Likewise D3D has problems that GL doesn't have, but that's all just API fluff.
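For readers unfamiliar with the GL side, a hedged sketch of the PBO path being discussed. This fragment assumes a current GL context, a bound `GL_TEXTURE_2D`, and illustrative variables (`pixels`, `imgSize`, `w`, `h`, `rowPitch`) that are not from the original posts:

```cpp
// Streaming a texture update through a Pixel Buffer Object.
GLuint pbo;
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
glBufferData(GL_PIXEL_UNPACK_BUFFER, imgSize, nullptr, GL_STREAM_DRAW);

void *ptr = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
memcpy(ptr, pixels, imgSize);            // CPU fills the staging store
glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);

// With a PBO bound, the data argument is an offset into the buffer,
// so the driver can schedule the actual transfer asynchronously.
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                GL_RGBA, GL_UNSIGNED_BYTE, (const void *)0);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);

// D3D11 gets the same effect in one explicitly-asynchronous call:
// context->UpdateSubresource(tex, 0, nullptr, pixels, rowPitch, 0);
```

In other words, the PBO is a workaround for synchronous upload semantics that D3D10+ simply never had.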




#24 MJP   Moderators   -  Reputation: 13611


Posted 22 July 2013 - 10:22 PM

 

My bad. That's what I remembered reading in the past, but like I said I never programmed for the PS3.

 

At the start of the PS3's lifetime, the fact that an OGL|ES implementation existed was jumped on by the 'opengl everywhere!' gang, and it has since been reported as fact that the PS3 uses OpenGL... alas, to this day the misinformation exists and thus this common mistake crops up.

 

Yeah, I still see that all of the time (especially on general gaming forums) and it drives me nuts. Hopefully the same thing doesn't happen to the PS4.



#25 TheChubu   Crossbones+   -  Reputation: 6423


Posted 22 July 2013 - 11:07 PM

I'm already seeing it "PS4 uses DirectX!" :P


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator


#26 Buster2000   Members   -  Reputation: 2344


Posted 23 July 2013 - 06:07 AM

I would say there is no longer any kind of API war. There was around 10 years ago, when people used to post stuff like "OMG John Carmack uses GL so it must be better".

Nowadays, though, you need to be able to support everything. If you are an indie programmer and you want to get your game into an indie bundle, then you need to support Linux, Mac and Windows. If you are a AAA studio, you often need to support multiple consoles, mobile devices and desktops.

 

This means that if you are writing your own engine (as in the OP's position), you need to abstract your engine and implement features so that they will work in both OpenGL and DirectX.



#27 Katie   Members   -  Reputation: 1434


Posted 23 July 2013 - 06:47 AM

"AMD, Intel, Microsoft, and Nvidia, all have their own Field Application Engineers that are available to help optimise CPU/GPU codepaths for your product."

 

Actually ARM does this for their mobile GPUs as well -- only they're not FAEs, they're DevRel.



#28 Katie   Members   -  Reputation: 1434


Posted 23 July 2013 - 06:53 AM

"This kind of stuff is just nuts"

 

This is what happens when you have a bunch of people whose hardware, which has taken years to design, is about to go into production, and who need a spec that describes it, rather than being in the position to define the behaviour up front and say "hit that spec or hit the road".

 

For every story about how OpenGL made a compromise about that sort of thing, there's a story about a bunch of people who've spent two years trying to implement some feature on less silicon, only to have the next D3D spec invalidate all that work by flat-out re-specifying the behaviour...



#29 LorenzoGatti   Crossbones+   -  Reputation: 3081


Posted 24 July 2013 - 02:20 AM



Promit, on 22 Jul 2013 - 05:24 AM, said:

"Also if a game company wants to use some obscure new DX function they can call up somebody at Microsoft and have a programmer leased to them until the issue is solved."

I'm sorry, but this world exists only in your mind. A lot of us are annoyed because DX support from Microsoft is the worst it's been in literally decades.

Really. This only exists in my mind.... If UbiSoft offers to pay Microsoft for one of their DirectX consulting programmers, you don't think that this will happen?

 

Here in the real world, when a user tells a large vendor that they want a horribly bad feature (e.g. an ad-hoc public API addition) implemented in the large vendor's strategic software platform, the large vendor simply tells the user to fuck off, so they don't even ask.

 

Paying the large vendor's consultants only provides knowledge transfer from the large vendor to the user, in various forms: top-tier experts, more competent than the user can expect to have in-house; access to insider information (e.g. Microsoft consultants calling the developers of the relevant product in Redmond for support, or Oracle consultants finding precedents in the restricted-access bug database); sometimes semi-secret software and documentation.

 

The other effect of paying consultants is getting attention and possibly making the large vendor more sensitive to legitimate requests, particularly bug reports (e.g. a game studio discovers that function X doesn't work in the very unusual situation Y).


Omae Wa Mou Shindeiru


#30 TheChubu   Crossbones+   -  Reputation: 6423


Posted 24 July 2013 - 09:35 AM

On somewhat related news OpenGL 4.4 has been released, made a thread with links about it.


Edited by TheChubu, 24 July 2013 - 09:45 AM.





