arnsa

Game development on: Linux or Windows

Recommended Posts

kop0113    2453

Actually Linux distributions are binary compatible with each other and the LSB mandates that they support the old ABIs (which…

Not many Linux distributions actually adhere to the Linux Standard Base. I know there is a package for it in Fedora, but I wouldn't rely on it being there on every distro. It seems hit and miss, but I guess that is one of the charms of using open-source software. :)

If you want to run modern software on Linux, do not use RHEL; it is ancient before it even gets released.

And yet it is still newer than Windows XP (which still runs the Unity output binaries, and the IDE). I am not one to bash Linux (I am a *NIX user after all), but I feel that something is lacking somewhere in Linux's backwards-compatibility story.
Often Windows users get newer versions of Firefox, GIMP and Libre/OpenOffice before individual (often newer) distributions do.

So I guess all I am saying is that it is very hard for something like Unity (large proprietary software, always catering to the typical user) to support something as dynamic and fast-changing as Linux. Though I guess that is why I never recommend Unity, lol.

cr88192    1570



Direct3D didn’t work out for me so I hate it.

Fixed.

Basically what I gather from your post is:
#1: You are a fanboy of OpenGL. Anyone who talks down on OpenGL, realistically or not, is a fanboy of Direct3D.
#2: You are an amazing master who knows “proper” and “modern” C++.
#3: You have a chip or 2 on your shoulder regarding Win32 as well.

We try to put our biases aside here.
Please do the same.


L. Spiro


 
 
Although to be honest, Win32 is pretty awful if you compare it to, for example, .NET or Qt. (It's better than X11 though, but that isn't really much of an achievement.)

The whole COM thing takes some time to get used to, and it does raise the barrier of entry slightly; I fully understand why MichaBen hates it. The open alternatives (CORBA etc.) aren't much better though, and COM is one of the better ways to share compiled classes between languages and compilers. It would have been worse if Microsoft had gone with CORBA instead of COM, and having a procedural API that requires a wrapper library for each language that uses it really isn't much better.

Using Direct3D from Python vs. using OpenGL from Python (without a third-party wrapper) is a good eye-opener for how useful COM is.
To use Direct3D from Python, all you need is the PythonCOM package, which you can use to take advantage of any COM-based library, such as those included in Microsoft Office (creating a Word document from Python is just a few lines of code, for example, all thanks to COM). To use OpenGL you need to create a binary (.dll/.so) module built specifically for Python and OpenGL. (Since OpenGL and Python are both fairly popular there are already several such modules to choose from, but the same isn't really true for other languages and libraries.)


IMHO, it could be slightly nicer to wrap the COM APIs somehow, rather than use them as the direct API for each language.

so, COM is one of those "useful, but not always exactly pretty" things.

a hypothetical example here would be if something like OpenGL were made usable via COM, with the headers presenting its usual C API via a pile of macros or similar (*1). even as such, COM "could" be a generally nicer alternative to the whole {wgl/glX/egl}GetProcAddress thing of fetching function pointers one at a time. for the most part, it would not look that much different.

*1: macros work, but otherwise there are no "particularly good" ways to do this (other options include static-inline functions and stub-functions, both with their own tradeoffs).
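
for reference, the one-at-a-time loading being referred to looks roughly like this on Windows (a sketch, assuming the Khronos <GL/glext.h> header is on the include path and a GL context is already current; the COM-style interface at the end is entirely made up for illustration):

```cpp
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>

// each core-profile/extension entry point is fetched by name, one at a time
PFNGLGENBUFFERSPROC pglGenBuffers = NULL;
PFNGLBINDBUFFERPROC pglBindBuffer = NULL;
PFNGLBUFFERDATAPROC pglBufferData = NULL;

void LoadBufferEntryPoints(void)
{
    pglGenBuffers = (PFNGLGENBUFFERSPROC)wglGetProcAddress("glGenBuffers");
    pglBindBuffer = (PFNGLBINDBUFFERPROC)wglGetProcAddress("glBindBuffer");
    pglBufferData = (PFNGLBUFFERDATAPROC)wglGetProcAddress("glBufferData");
    // ...and so on for hundreds of entry points (or let a loader library do it)
}

// a COM flavor of the same thing would hand back a whole table of related
// functions in one call, something like (hypothetical interface, for flavor):
//     IGLBufferFuncs *bufs = NULL;
//     glDevice->QueryInterface(IID_IGLBufferFuncs, (void **)&bufs);
//     bufs->GenBuffers(1, &vbo);
```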

but, it is all tradeoffs (like, whether or not to define the API in terms of language-specific wrappers (*2), or use a more portable language-neutral definition at a possible aesthetic cost in various languages, ...).

*2: C API spec, C++ API spec, Java API spec, C# API spec, ... with the possible risk of "mismatches" between the language-specific API definitions (where something looks or works notably different in one language than another), meaning the level of "naturalization" could itself be a tradeoff.

but, all this is more a matter for API designers, where for the most part users either just have to live with it, or wrap over it.

cr88192    1570


Actually Linux distributions are binary compatible with each other and the LSB mandates that they support the old ABIs (which…

Not many Linux distributions actually adhere to the Linux Standard Base. I know there is a package for it in Fedora, but I wouldn't rely on it being there on every distro. It seems hit and miss, but I guess that is one of the charms of using open-source software. :)

If you want to run modern software on Linux, do not use RHEL; it is ancient before it even gets released.

And yet it is still newer than Windows XP (which still runs the Unity output binaries, and the IDE). I am not one to bash Linux (I am a *NIX user after all), but I feel that something is lacking somewhere in Linux's backwards-compatibility story.
Often Windows users get newer versions of Firefox, GIMP and Libre/OpenOffice before individual (often newer) distributions do.

So I guess all I am saying is that it is very hard for something like Unity (large proprietary software, always catering to the typical user) to support something as dynamic and fast-changing as Linux. Though I guess that is why I never recommend Unity, lol.


IMO, the lack of backwards compatibility (or cross-distro compatibility) is one of the bigger problem areas in Linux.

having things like the core ABI and OS APIs nailed down, and having people avoid making non-backwards-compatible changes, would help here, but it is hard to get a large crowd of programmers to all agree to this (where, for many of them, "backwards compatibility" means "several months ago" rather than "several years ago" or "a decade ago", and trying to keep old builds of apps working may seem like an "unnecessary burden"). (and, by Linux standards, an installation of a 2011-era distro is "ancient", and newer apps refuse to build or work because it has too-old versions of various libraries, ...).

it doesn't help matters much that many Linux apps tend to have a large number of dependencies, each with its own independent version issues, making binary compatibility (and sometimes source-code compatibility) a problematic goal.


it is like asking people not to change the argument lists of API functions, or not to endlessly redefine their magic values and flags.
for the sake of compatibility, things like this generally have to be "set in stone", among other things (like, for example, the physical layout of any publicly-visible structures or classes, ...).
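
a tiny made-up example of the "set in stone" point (hypothetical library; the names are invented purely for illustration):

```cpp
/* libfoo v1.0 public header; applications were compiled against this layout: */
typedef struct {
    int width;    /* offset 0 */
    int height;   /* offset 4 */
} FooImage;

/* libfoo v1.1 "improves" the struct by inserting a field at the front: */
typedef struct {
    int format;   /* new field, now at offset 0 */
    int width;    /* offset 4 */
    int height;   /* offset 8 */
} FooImage_v11;

/* an already-compiled app still reads 'width' at offset 0 and 'height' at
 * offset 4, so against the new library it silently picks up 'format' and
 * 'width' instead. the same logic applies to reordered flags, changed
 * argument lists, and so on; recompiling "fixes" it, which is exactly the
 * problem for binary-only releases. */
```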

but, it doesn't help matters much when one can find minor differences in the ABI as implemented between one version of GCC and another (like version-specific differences in the name mangling, or in how particular structure types get passed/returned, ...).
a person might find a situation (*cough* x86-64 *cough*) where the ABI spec is overcomplicated (particularly regarding passing/returning structures), GCC doesn't always follow it precisely, and the ABI-as-implemented-by-GCC tends to vary slightly, leading to me just saying "to hell with it, I am going to use a simplified subset for my stuff and call it good enough". IOW: a modified ABI subset where structs are either passed in a single register (such as R9 or XMM1) or passed by reference (generally ignoring the whole "decompose structs into collections of registers" thing).

this is apart from more major jumps (like, a few years back, the jump from one C++ name-mangling scheme to another), or just strange edge cases (such as GCC and MSVC interpreting the same ABI differently on Windows, like GCC assuming a 16-byte-aligned stack for 32-bit cdecl where MSVC only guarantees 4-byte alignment, or the treatment of "long double", ...).

so, in all, binary compatibility is an awkward issue sometimes, as generally it assumes that almost all of this stuff is "set in stone".

mhagain    13430

Basically what I gather from your post is:
#1: You are a fanboy of OpenGL. Anyone who talks down on OpenGL, realistically or not, is a fanboy of Direct3D.
#2: You are an amazing master who knows “proper” and “modern” C++.
#3: You have a chip or 2 on your shoulder regarding Win32 as well.

 

Oh please, grow up kid. I have been using both Direct3D and OpenGL for ages. Maybe if you would actually *read* my post, you would see that I didn't advice the usage of OpenGL at all, only state the truth about Direct3D and try to safe people from learning C++ the wrong way by jumping into it to soon. But of course, I could have guessed some pathetic troll would come in...

 

I think anyone who's been using the forum for any amount of time knows that L Spiro is anything but a troll.

 

From reading your post, it seems obvious that you likely haven't done much with either API; yes, you may have been using them for a long time, but have you actually used much OpenGL beyond, say, 1.4?  And have you gone looking for an equivalent of OpenGL's immediate mode in D3D?

 

The reason why I ask these things is because if you had done any kind of extensive work with vertex buffers, shaders and render-to-texture, you'd know straight-up that D3D is the infinitely cleaner API, whereas the OpenGL APIs for these are - and I'll be as kind as possible here - holy-jumping-Jesus-freaking-batsh-t-insane.

 

I also find it ironic that two of the criticisms you make against D3D - pointers everywhere and not using modern C++ - apply equally to OpenGL.

 

Finally, and just to be ultra-clear about this, I don't have any horse in this race.  I just want a good API that behaves reasonably consistently and predictably across a wide range of hardware, and that doesn't involve reading signs in chicken intestines for the Right Way To Do Stuff.  Right now that API is D3D, but if the ARB pull the finger out and fix the core broken/crufty/insane stuff, and if the hardware vendors follow with some good drivers, it could be OpenGL again.

L. Spiro    25622

But of course, I could have guessed some pathetic troll would come in...
I wwebsite as on the internet

Perhaps some perspective would help to add some…
…perspective.

The fanboys here will probably never admit it, but D3D is a horrible API to work with for beginner programmers.

You immediately take an authoritative position and assert yourself as absolutely positively being right about the quality of Direct3D’s API, furthered by classifying anyone who says it is a good API, which includes multiple knowledgeable people who have posted in this very thread, as being fanboys of it.
See how you just waltzed in and stepped on some toes?

I would strongly recommend against jumping into Direct3D, as it will teach you lots of bad practices in modern C++.

The problem is that there are about a million other things as well that will teach bad object-oriented design. Singling out Direct3D doesn’t make sense, speaking from personal experience.
I, as a normal person, as a child, was not so strong in programming when I took a peek at DirectX 6. Did I learn bad habits from DirectX? No. I never viewed it as an example of object-oriented design. It was a way to get things done and nothing more.
If you are so afraid of people picking up bad programming habits, you should remind them also not to look at almost any online tutorial and to select only a few books out there for reading material. In fact, better to just play it safe and tell them to quit programming.

For the same reason you should avoid using the WIN32 API like the plague when you are new to programming, and once you have mastered C++ you probably still want to avoid it, but then at least you can tell it's bad, rather than get confused about what is proper modern C++ and what is pure evilness.

Once again you assert yourself as the authority on good and bad, this time also telling the “masters” what to avoid. As if you yourself are more than a master. Do you see how this comes off as nothing short of conceit?

I learned the Win32 API at a young age. Yes it was a pain at times. And ultimately a learning experience.
An event in my youth caused me to shut out the pain and focus on the happy. My cousin imparted the words of wisdom, “You can’t truly appreciate happiness unless you experience the sadness,” to me.
It goes for programming too. When I was faced with design decisions later, I recalled some of the headaches I had with the Win32 API and decided not to go that route.

So stop sitting on your throne and telling people what to avoid (exception: Three20 for iOS). It’s all part of the growing process.

 

…only state the truth about Direct3D and try to safe[sic] people from learning C++ the wrong way by jumping into it to soon.

We appreciate objectivity on these forums. And being mindful of what others have already posted rather than just trying to trump them all and step on their toes.


Ultimately I think you are just reading too much into it. I’ve never thought of a graphics API as serving any other purpose than to get graphics on the screen. Never as a model for good programming practices or good object-oriented design. Since it never even occurred to me to do so, I never imagined it would occur to anyone else. I don’t think it does.
If people are prone to pick up bad habits, they will do so anyway, more often from online tutorials than anything else. If people have good enough sense, they will separate academic object-oriented programming from APIs (not just Win32, DirectX, OpenGL, Glide, OpenAL, but any), understanding that the API was designed that way for a reason other than academic prowess.


L. Spiro

SimonForsman    7642


Actually Linux distributions are binary compatible with each other and the LSB mandates that they support the old ABIs (which…

Not many Linux distributions actually adhere to the Linux Standard Base. I know there is a package for it in Fedora, but I wouldn't rely on it being there on every distro. It seems hit and miss, but I guess that is one of the charms of using open-source software. :)

If you want to run modern software on Linux, do not use RHEL; it is ancient before it even gets released.

And yet it is still newer than Windows XP (which still runs the Unity output binaries, and the IDE). I am not one to bash Linux (I am a *NIX user after all), but I feel that something is lacking somewhere in Linux's backwards-compatibility story.
Often Windows users get newer versions of Firefox, GIMP and Libre/OpenOffice before individual (often newer) distributions do.

So I guess all I am saying is that it is very hard for something like Unity (large proprietary software, always catering to the typical user) to support something as dynamic and fast-changing as Linux. Though I guess that is why I never recommend Unity, lol.


Off the top of my head, the only big-name Linux-based x86/x64 OS that doesn't support the LSB is Android (let me know if you can find any other). Unity games fail to run because they require a very new glibc version (and possibly new versions of other system libraries); it is not a backwards-compatibility issue, it's a developer issue (either they don't know how to build binary-only applications for Linux, or they just don't care about supporting anything other than Ubuntu).

You seem generally confused as to what backwards compatibility means. Backwards compatibility is the ability of a new OS to run old applications, and that is a non-issue and hasn't been an issue for several years. (Since LSB 3.0, ABI symbols will not be changed or removed without the major version of the library changing, which means applications relying on the old symbols will keep using the old version even on a new OS; new symbols can be added, but that will never affect existing software.)

Backwards compatibility is guaranteed for the system ABIs (there is of course no guarantee, at least not from the Linux Foundation, that AMD or NVIDIA won't drop OpenGL 3.0 support from their drivers tomorrow and break things for people, though it's highly unlikely). You are hung up on Unity3D not working on RHEL 6 and other "old" distros; blaming Linux for that is about as intelligent as blaming Microsoft for Windows 7 not being able to run Windows 8 "Metro" apps. The Unity developers have specifically, and for no good reason, chosen to set the minimum requirement at Ubuntu 11.04. The "correct" glibc version to depend on for a proprietary, modern (not targeting old distros) binary application would be 2.4; the easiest way is to install the LSB SDK and just build against the LSB 4.0 target (any big-name distro from 2009 or later, which includes RHEL 6, will run those applications with no issues), or LSB 3.0 if support for older distros (back to 2005) is desired.

It is not rocket science. Building cross-distro, future-proof Linux binaries is reasonably trivial today, and unless you need to use extremely new OS features (Unity3D shouldn't need that) it is also trivial to support distros as old as 2005. (For pre-2005 distros there is no guarantee that the ABIs will continue to be supported in the future, so they should either be ignored completely or get separate binaries. If this were 2004 and not 2013 I would even agree with you; it's not 2004 anymore, though.)
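
(As an aside, apart from the LSB SDK route described above, another widely used trick for keeping a binary's glibc requirement low is to pin individual symbols to an older version node at build time. A rough sketch only; GLIBC_2.2.5 is the x86-64 baseline node and an illustrative choice, not something the LSB mandates:)

```cpp
/* force references to memcpy to bind against the old version node, so a
 * binary built on a new distro still loads on older glibc (x86-64, gcc/g++) */
#include <string.h>

__asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

void copy_block(void *dst, const void *src, size_t n)
{
    memcpy(dst, src, n);   /* resolves against memcpy@GLIBC_2.2.5 */
}

/* you can check what a finished binary actually demands with:
 *     objdump -T ./mygame | grep GLIBC_                          */
```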

Bregma    9202

The Unity developers have specifically, and for no good reason, chosen to set the minimum requirement at Ubuntu 11.04. The "correct" glibc version to depend on for a proprietary, modern (not targeting old distros) binary application would be 2.4; the easiest way is to install the LSB SDK and just build against the LSB 4.0 target (any big-name distro from 2009 or later, which includes RHEL 6, will run those applications with no issues), or LSB 3.0 if support for older distros (back to 2005) is desired.

Well, the "no good reason" for targeting consumer-oriented Ubuntu is because they get commercial support from Canonical.  Red Hat doesn't even have a desktop GNU/Linux distribution, let alone offer commercial support for one, so it's hardly a surprise that a game development kit doesn't target RHEL.

 

Why would anyone consider writing graphics-oriented games for a server system?  Do dev-ops have that much time on their hands?

kop0113    2453

Well, the "no good reason" for targeting consumer-oriented Ubuntu is because they get commercial support from Canonical.  Red Hat doesn't even have a desktop GNU/Linux distribution, let alone offer commercial support for one, so it's hardly a surprise that a game development kit doesn't target RHEL.
 
Why would anyone consider writing graphics-oriented games for a server system?  Do dev-ops have that much time on their hands?

Other than the server versions, Red Hat also offers Enterprise Workstation and Desktop editions.
Scientific Linux (a clone of RHEL by Fermilab and CERN) is also used a lot for 3D visualisation of "things". Unity would be daft not to make the small tweaks needed to target these platforms. I might mention it on the beta mailing lists, because perhaps no one else has actually tried.

It sounds like a few people still need a little bit more education and to do some more research before Linux ever has a chance of being popular with the masses. It will come one day though... ;)

What we need to do is get away from the belief that Ubuntu is the only Linux distro. As SimonForsman correctly said, it isn't too hard to write code that works on all existing Linux distros... developers just need to learn how to write slightly more portable code (like they have had to do for the various versions of Windows over the years).

BrokenKingpin    236

OpenGL is a perfectly good way to go, and it is cross-platform. A lot of game engines use OpenGL, so there is no technical reason they could not release for Linux/Unix systems; they just don't spend the time to support the platform.

 

You could also use an existing cross-platform game engine. To be honest, it is more about learning good game design practices than a specific API. You can always learn a new API and apply those same principles you learned.

MichaBen    481

Perhaps some perspective would help to add some…
…perspective.

If you love perspective so much, you should read further than this post, for example the extremely biased posts by phantom, which my post was mainly made in response to. Calling things 'pants-on-head retarded' is typical behaviour of trolling fanboys. Maybe I should have been clearer about who I was responding to, but it's odd that trolling fanboys are allowed to post garbage, yet it's not allowed to respond to it in the same manner.

 

About the biased claim being posted here that OpenGL is bad: DirectX has many of the same (or similar) problems that OpenGL gets blamed for here. Sure, there are problems with vendor implementations of OpenGL. But the exact same thing is true for DirectX. Only a few weeks ago we had problems with completely random crashes on AMD cards from an HLSL shader that worked fine on NVIDIA cards, and this is not the first time I have seen differences in HLSL shaders between NVIDIA and AMD drivers. This is just something you have to deal with, no matter which API you are using.

And while I admit that the bind-to-edit model can be a pain at times, and I would gladly accept a different solution, it's hardly fair to badmouth the whole API for that. First of all, in a well-designed engine you will hardly ever suffer problems from this, and secondly, with DirectX it's also a piece of cake to mess something up because you call functions in the wrong order. Or what about the DirectX functions that return you a pointer with an increased reference count that you have to remember to release? How many times have you seen that going wrong with inexperienced DX programmers, and even with experienced ones? Or the D3DXVECTOR3 structure that is completely messed up with an evil implicit conversion operator to float pointer (and yes, I have actually seen that cause bugs in an engine, where the implicit conversion to float * was used by the compiler when resolving a function overload).
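
(To make the D3DXVECTOR3 point concrete, a self-contained sketch: the struct below is a stand-in that only mimics D3DXVECTOR3's implicit operator FLOAT*, and SetColour is an invented function; the exact bug described above isn't specified, this just shows the mechanism.)

```cpp
#include <cstdio>

// stand-in mimicking D3DXVECTOR3's implicit conversion to a float pointer
struct Vec3Like {
    float x, y, z;
    operator float*()             { return &x; }
    operator const float*() const { return &x; }
};

// expects a pointer to FOUR floats (e.g. an RGBA colour)
void SetColour(const float* rgba)
{
    // a real implementation reading rgba[3] would run off the end of a
    // Vec3Like passed in by accident; here we only touch the first value
    std::printf("SetColour, first component = %f\n", rgba[0]);
}

int main()
{
    Vec3Like v = { 0.2f, 0.4f, 0.6f };
    SetColour(v);   // compiles without a murmur, via operator float*
    return 0;
}
```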

 

In the end, I don't think there is a 'better' in terms of easier, or less likely to shoot yourself in the foot with. Personally I did find OpenGL easier, and of course the cross-platform support is a pro; not just Windows-Mac-Linux, but OpenGL also supports more Windows versions than Direct3D does. Of course, if your goal is to make a game, then you have a third choice, which is to use neither and find an engine, which is even easier.

_the_phantom_    11250

If you love perspective so much, you should read further than this post, for example the extremely biased posts by phantom, which my post was mainly made in response to. Calling things 'pants-on-head retarded' is typical behaviour of trolling fanboys. Maybe I should have been clearer about who I was responding to, but it's odd that trolling fanboys are allowed to post garbage, yet it's not allowed to respond to it in the same manner.

The difference is that my post is commenting on a fundamental flaw in the OpenGL programming model, whereas yours touches upon COM (a minor point of D3D11 coding, mostly related to startup) and C++ form, which OpenGL doesn't promote in a sensible way either.

If you can convince me that having to bind a resource to the pipeline in order to edit it is a good idea, then I'll be impressed.
Not to mention that binding anything makes it editable, which can lead to fun such as: when you bind a VAO you can, via a bug, accidentally change its contents just because it is bound.

Pants. On. Head.

The D3D programming model, on the other hand, has immutable objects AND doesn't require you to have a resource bound in order to edit it. The rest of the API on top of that is just gravy.
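
(For anyone following along, a rough sketch of the contrast being described, assuming a loader such as GLEW on the GL side and the Windows SDK's d3d11.h on the D3D side; illustrative code, not taken from either API's documentation:)

```cpp
#include <windows.h>
#include <GL/glew.h>
#include <d3d11.h>

// OpenGL, classic bind-to-modify: editing a buffer means disturbing whatever
// is currently bound to GL_ARRAY_BUFFER, so unrelated code can stomp on state
void UpdateVertexDataGL(GLuint vbo, const void* data, GLsizeiptr size)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);               // must bind first...
    glBufferSubData(GL_ARRAY_BUFFER, 0, size, data);  // ...just to edit it
    // whatever was bound before this call is now gone unless you saved it
}

// Direct3D 11: the resource is named directly; nothing has to be "current"
void UpdateVertexDataD3D11(ID3D11DeviceContext* ctx, ID3D11Buffer* vbo,
                           const void* data)
{
    ctx->UpdateSubresource(vbo, 0, nullptr, data, 0, 0);
}

// with GL_EXT_direct_state_access the GL side gets the same property back:
//     glNamedBufferSubDataEXT(vbo, 0, size, data);
```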

Oh, and for the record, I spent about 8 years working exclusively with OpenGL, including writing a chapter on GLSL for a book back in 2005, until the ARB finally screwed up one time too many (see GL3.0/Longs Peak) and I took a look at the D3D world where, since D3D10, things are saner to work with.

ApochPiQ    23005
Let's keep it civil in here, folks.

I'm tempted to lock this just because D3D/OGL debates are inevitably a waste of time, but I'll let it live for the moment. If, however, we can't have a discussion without making a lot of attacks and accusations, I'll be happy to revisit that decision.

SimonForsman    7642


The Unity developers have specifically, and for no good reason, chosen to set the minimum requirement at Ubuntu 11.04. The "correct" glibc version to depend on for a proprietary, modern (not targeting old distros) binary application would be 2.4; the easiest way is to install the LSB SDK and just build against the LSB 4.0 target (any big-name distro from 2009 or later, which includes RHEL 6, will run those applications with no issues), or LSB 3.0 if support for older distros (back to 2005) is desired.

Well, the "no good reason" for targeting consumer-oriented Ubuntu is because they get commercial support from Canonical.  Red Hat doesn't even have a desktop GNU/Linux distribution, let alone offer commercial support for one, so it's hardly a surprise that a game development kit doesn't target RHEL.
 
Why would anyone consider writing graphics-oriented games for a server system?  Do dev-ops have that much time on their hands?


No one should write anything for RHEL or Ubuntu or <insert name of random distro here>; that's the whole point. If you're not going to release the code, target the damn LSB instead of individual distros. There is an SDK for it (finally) that makes it piss easy, and there simply is no good reason not to do it. By targeting the LSB you get a guaranteed stable ABI, and your application will not break in future compliant distributions; if you target Ubuntu X.Y you are dealing with a non-standard ABI that no distribution, not even the one that used it in the first place, is obligated to support in the future. It is insane to target a specific distribution with a binary release today, since doing so saves no time or effort and could come back to bite you in the ass in the future (resulting in a broken product or more work).

cr88192    1570


If you love perspective so much, you should read further than this post, for example the extremely biased posts by phantom, which my post was mainly made in response to. Calling things 'pants-on-head retarded' is typical behaviour of trolling fanboys. Maybe I should have been clearer about who I was responding to, but it's odd that trolling fanboys are allowed to post garbage, yet it's not allowed to respond to it in the same manner.

The difference is that my post is commenting on a fundamental flaw in the OpenGL programming model, whereas yours touches upon COM (a minor point of D3D11 coding, mostly related to startup) and C++ form, which OpenGL doesn't promote in a sensible way either.

If you can convince me that having to bind a resource to the pipeline in order to edit it is a good idea, then I'll be impressed.
Not to mention that binding anything makes it editable, which can lead to fun such as: when you bind a VAO you can, via a bug, accidentally change its contents just because it is bound.

Pants. On. Head.


it isn't perfect, but it works.

typically more annoying is trying to figure out when/where a given state change has occurred: say, in one condition stuff looks fine, in another condition everything goes weird only to return to normal again a moment later, and then it turns out a piece of code somewhere in the rendering pass bound a shader or set the blend mode or similar without setting it back.
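
one common way to keep that kind of leak contained (a generic sketch, assuming a loader like GLEW; not code from any particular engine) is a small RAII guard that snapshots the bits of state you care about and puts them back when the scope ends:

```cpp
#include <GL/glew.h>

// snapshots the current program and blend enable, restores them on scope exit
class ScopedGLState {
public:
    ScopedGLState()
    {
        glGetIntegerv(GL_CURRENT_PROGRAM, &prevProgram_);
        blendWasEnabled_ = glIsEnabled(GL_BLEND);
    }
    ~ScopedGLState()
    {
        glUseProgram((GLuint)prevProgram_);
        if (blendWasEnabled_) glEnable(GL_BLEND); else glDisable(GL_BLEND);
    }
private:
    GLint     prevProgram_     = 0;
    GLboolean blendWasEnabled_ = GL_FALSE;
};

// usage: shader/blend changes inside the braces can't leak into the frame
// {
//     ScopedGLState guard;
//     DrawFunkyOverlayEffect();   // hypothetical draw call
// }
```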

well, and it is a little annoying/limiting to have to do pretty much everything rendering related from a single thread, ...

but, in all, it is more all about trade-offs.

nothing is perfect, FWIW...

for what things it is not within ones' power to change, they can take it as it is, and make "locally optimal" tradeoffs within the set of available options.

it is sort of like the choice of programming language, where although a language has drawbacks, there would be worse drawbacks involved with using some of the other languages, ...

The D3D programming model, on the other hand, has immutable objects AND doesn't require you to have a resource bound in order to edit it. The rest of the API on top of that is just gravy.

Oh, and for the record, I spent about 8 years working exclusively with OpenGL, including writing a chapter on GLSL for a book back in 2005, until the ARB finally screwed up one time too many (see GL3.0/Longs Peak) and I took a look at the D3D world where, since D3D10, things are saner to work with.

yes, but the issue is that being either tied to Windows or having to write/maintain 2 versions of a renderer isn't ideal either.

granted, a person can probably gloss over a lot of this (minimizing dependencies on the specific rendering backend, mostly to try to limit how much work is needed to move to a new backend, much like trying to gloss over depending on a specific set of OS APIs or a specific CPU architecture).

say, the high-level deals more with materials and geometry (like mesh-objects containing triangle-arrays or similar), and tries to largely gloss over how it is rendered (and probably with multiple versions of the various shader programs).
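
roughly the kind of split being described, as a bare-bones sketch (the interface and type names are invented for illustration):

```cpp
#include <cstddef>
#include <cstdint>

struct Vertex   { float pos[3]; float uv[2]; };
struct Mesh     { const Vertex* verts; std::size_t vertCount;
                  const std::uint32_t* indices; std::size_t indexCount; };
struct Material { const char* shaderName; std::uint32_t texture; };

// the engine-facing code only ever talks to this; the GL, GL ES and D3D
// backends each implement it with their own state handling behind the scenes
class IRenderBackend {
public:
    virtual ~IRenderBackend() {}
    virtual void BeginFrame() = 0;
    virtual void DrawMesh(const Mesh& mesh, const Material& mat) = 0;
    virtual void EndFrame() = 0;
};
```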


but, in some ways, portability doesn't come free.

_the_phantom_    11250

it isn't perfect, but it works.

Well, "works" in that you have to use it as you've no other choice than to suck the pain and live with it.

That's the thing about OpenGL; it's the best choice if you have no choice.

(OpenGL is also the ONLY API to enforce this model; PS2, PS3, X360, Wii/WiiU, D3D9/10/11 - none of these use this model.)

yes, but the issue is that being either tied to Windows or having to write/maintain 2 versions of a renderer isn't ideal either.

The point is that better programming models exist; pointing out minor, unrelated flaws with D3D's API (COM & 'not modern C++') does not detract from OpenGL's model being fundamentally broken, which was the original point that apparently triggered the rage.

For the record; I'd personally do two API level interfaces. If you are targeting D3D11 and OpenGL then the workload isn't going to be that high and the rest of your renderer is likely to swamp it out code wise.

(I'm currently involved in a complete ground-up rebuild of our renderer at work and we've taken the path of doing a renderer backend per platform, which means we need to support at least 4 code paths, but it does mean we can do API- and platform-specific optimisations and paths while getting the best from each API. Unfortunately at some point this will probably mean I'll have to touch OpenGL|ES... *sigh*)

mhagain    13430

but, in all, it is more all about trade-offs.


nothing is perfect, FWIW...

for what things it is not within ones' power to change, they can take it as it is, and make "locally optimal" tradeoffs within the set of available options.

 

Thing is though, there is absolutely no tradeoff with the bind-to-modify model.  The model itself gives you absolutely nothing in return, and introduces a whole heap of needless and painful careful state tracking.  Complex and fiddly code paths that have unwanted consequences have a price beyond the time taken to write them; you've also got to maintain the mess going forward, and if it's bad enough it can act to prevent you from introducing cool new features that would otherwise be utter simplicity.

 

This was obviously broken when GL_ARB_multitexture was introduced, was even more problematic when GL_ARB_vertex_buffer_object was introduced, and the ARB themselves are showing little inclination to resolve it.  Thank heavens for GL_EXT_direct_state_access, which even id Software are using in much of their more recent code (see https://github.com/id-Software/DOOM-3-BFG/blob/master/neo/renderer/Image_load.cpp#L453 for example).
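
(To make the difference concrete, a small sketch assuming a loader such as GLEW and a driver exposing GL_EXT_direct_state_access; this is not code from the linked Doom 3 BFG source:)

```cpp
#include <GL/glew.h>

// classic GL: you must disturb the texture binding just to upload some pixels
void UploadBindToModify(GLuint tex, int w, int h, const void* pixels)
{
    GLint prev = 0;
    glGetIntegerv(GL_TEXTURE_BINDING_2D, &prev);   // save what was bound
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glBindTexture(GL_TEXTURE_2D, (GLuint)prev);    // and put it back
}

// GL_EXT_direct_state_access: the texture is named directly, nothing rebound
void UploadDSA(GLuint tex, int w, int h, const void* pixels)
{
    glTextureSubImage2DEXT(tex, GL_TEXTURE_2D, 0, 0, 0, w, h,
                           GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}
```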

 

What's particularly annoying is that bind-to-modify was recognised as a potential problem as far back as the original GL_EXT_texture_object!  See http://www.opengl.org/registry/specs/EXT/texture_object.txt.  Even more annoying is that while some good new functionality has come in with a civilized DSA API - sampler objects, the promotion of the glProgramUniform calls to core - a whole heap has also come in without one - vertex attrib binding, texture storage, etc.

 

Shrugging it off with "ah just accept it, sure that's the price of portability" is not good enough; OpenGL used to be a great API and should be one again, and people acting as though this is not a problem (or even worse - acting as though it's somehow a good thing) are not helping that one little bit.


cr88192    1570


but, in all, it is more all about trade-offs.

nothing is perfect, FWIW...

for what things it is not within ones' power to change, they can take it as it is, and make "locally optimal" tradeoffs within the set of available options.

 
Thing is though, there is absolutely no tradeoff with the bind-to-modify model.  The model itself gives you absolutely nothing in return, and introduces a whole heap of needless and painful careful state tracking.  Complex and fiddly code paths that have unwanted consequences have a price beyond the time taken to write them; you've also got to maintain the mess going forward, and if it's bad enough it can act to prevent you from introducing cool new features that would otherwise be utter simplicity.
 
This was obviously broken when GL_ARB_multitexture was introduced, was even more problematic when GL_ARB_vertex_buffer_object was introduced, and the ARB themselves are showing little inclination to resolve it.  Thank heavens for GL_EXT_direct_state_access, which even id Software are using in much of their more recent code (see https://github.com/id-Software/DOOM-3-BFG/blob/master/neo/renderer/Image_load.cpp#L453 for example).
 
What's particularly annoying is that bind-to-modify was recognised as a potential problem as far back as the original GL_EXT_texture_object!  See http://www.opengl.org/registry/specs/EXT/texture_object.txt.  Even more annoying is that while some good new functionality has come in with a civilized DSA API - sampler objects, the promotion of the glProgramUniform calls to core - a whole heap has also come in without one - vertex attrib binding, texture storage, etc.
 
Shrugging it off with "ah just accept it, sure that's the price of portability" is not good enough; OpenGL used to be a great API and should be one again, and people acting as though this is not a problem (or even worse - acting as though it's somehow a good thing) are not helping that one little bit.


but, normal programmers don't exactly have a whole lot of say in all this (this being more a vendor and ARB issue), so whether or not it is *good*, is a secondary issue, to whether or not programmers have much choice in the matter.

people have say in things they can influence or control, otherwise, they will just have to live with whatever they are given.


it is along similar lines to complaining about weaknesses in the x86 or ARM ISAs, or the SysV/AMD64 ABI, or some design issues in the core of C and C++, ... these things have problems, but for most end-developers, there isn't a whole lot of choice.

most people will just live with it, as a part of the natural cost of doing business...

the more significant question, then, is what sorts of platforms they want their code to run on, and this is where the tradeoffs come in.

mhagain    13430

 

but, in all, it is more all about trade-offs.

nothing is perfect, FWIW...

for what things it is not within ones' power to change, they can take it as it is, and make "locally optimal" tradeoffs within the set of available options.

 
Thing is though, there is absolutely no tradeoff with the bind-to-modify model.  The model itself gives you absolutely nothing in return, and introduces a whole heap of needless and painful careful state tracking.  Complex and fiddly code paths that have unwanted consequences have a price beyond the time taken to write them; you've also got to maintain the mess going forward, and if it's bad enough it can act to prevent you from introducing cool new features that would otherwise be utter simplicity.
 
This was obviously broken when GL_ARB_multitexture was introduced, was even more problematic when GL_ARB_vertex_buffer_object was introduced, and the ARB themselves are showing little inclination to resolve it.  Thank heavens for GL_EXT_direct_state_access, which even id Software are using in much of their more recent code (see https://github.com/id-Software/DOOM-3-BFG/blob/master/neo/renderer/Image_load.cpp#L453 for example).
 
What's particularly annoying is that bind-to-modify was recognised as a potential problem as far back as the original GL_EXT_texture_object!  See http://www.opengl.org/registry/specs/EXT/texture_object.txt.  Even more annoying is that while some good new functionality has come in with a civilized DSA API - sampler objects, the promotion of the glProgramUniform calls to core - a whole heap has also come in without one - vertex attrib binding, texture storage, etc.
 
Shrugging it off with "ah just accept it, sure that's the price of portability" is not good enough; OpenGL used to be a great API and should be one again, and people acting as though this is not a problem (or even worse - acting as though it's somehow a good thing) are not helping that one little bit.

 

but, normal programmers don't exactly have a whole lot of say in all this (this being more a vendor and ARB issue), so whether or not it is *good*, is a secondary issue, to whether or not programmers have much choice in the matter.

people have say in things they can influence or control, otherwise, they will just have to live with whatever they are given.


it is along similar lines to complaining about weaknesses in the x86 or ARM ISAs, or the SysV/AMD64 ABI, or some design issues in the core of C and C++, ... these things have problems, but for most end-developers, there isn't a whole lot of choice.

most people will just live with it, as a part of the natural cost of doing business...

the more significant question, then, is what sorts of platforms they want their code to run on, and this is where the tradeoffs come in.

 

Normal programmers do have a say.  They can vote with their feet and walk away, which is exactly what has happened.  That's a power that shouldn't be underestimated - it's the same power that forced Microsoft to re-examine what they did with Vista, for example.

 

Regarding platforms, this is a very muddied issue.

 

First of all, we can discount mobile platforms.  They don't use OpenGL - they use GL ES, so unless you restrict yourself to a common subset of both, you're not going to hit those (even if you do, you'll get more pain and suffering from trying to get the performance up and from networking/sound/input/windowing system APIs than from developing for 2 graphics APIs anyway).

 

We can discount consoles.  Even those which have GL available, it's GL ES too, and anyway the preferred approach is to use the console's native API instead.

 

That leaves 3 - Windows, Mac and Linux.  Now things get even muddier.

 

The thing is, there are actually two types of "platform" at work here - software platforms (already mentioned) and hardware platforms (NV, AMD and Intel).  Plus they don't have equal market shares.  So it's actually incredibly misleading to talk in any way about number of platforms; instead you need to talk about the percentage of your potential target market that you're going to hit.

 

Here's the bit where things turn upside-down.

 

In the general case for gaming PCs we're talking something like 95% Windows, 4% Mac and 1% Linux.  So even if you restrict yourself to something that's Windows-only, you're still potentially going to hit 95% of your target market.

 

Now let's go cross-platform on the software side, and look at those hardware platforms I mentioned.  By being cross-platform in software you're hitting 100% of your target market, but - and it's a big but - OpenGL only runs reliably on one hardware platform on Windows, and that's NV.  Best case (according to the latest Steam hardware survey) is that's 52%.

 

So, by being Windows-only you hit 95% of your target market but it performs reliably on 100% of those machines.

By being cross-platform you hit 100% of your target market but it performs reliably on only 52% of those machines.

 

That sucks, doesn't it?

 

Now, maybe you're developing for a very specialized community where the figures are skewed.  If so then you know your audience and you go for it.  Even gaming PCs (which I based my figures on) could be considered a specialized audience, but in the completely general case the figures look even worse.  There's an awful lot of "home entertainment", "multimedia" or business-class PCs out there with Intel graphics, there's an awful lot of laptops, there's an awful lot of switchable-graphics monstrosities, there's an awful lot of users with OEM drivers, there's an awful lot of users who never upgrade their drivers.

 

And that's the final reality - it's not number of platforms that matters; that doesn't matter at all.  It's percentage of target markets, and outside of specialized communities being cross-platform will get you a significantly lower percentage than being Windows-only.


cr88192    1570

Normal programmers do have a say.  They can vote with their feet and walk away, which is exactly what has happened.  That's a power that shouldn't be underestimated - it's the same power that forced Microsoft to re-examine what they did with Vista, for example.
 
Regarding platforms, this is a very muddied issue.
 
First of all, we can discount mobile platforms.  They don't use OpenGL - they use GL ES, so unless you restrict yourself to a common subset of both, you're not going to hit those (even if you do, you'll get more pain and suffering from trying to get the performance up and from networking/sound/input/windowing system APIs than from developing for 2 graphics APIs anyway).
 
We can discount consoles.  Even those which have GL available, it's GL ES too, and anyway the preferred approach is to use the console's native API instead.

I am including OpenGL ES; while not strictly traditional OpenGL, it is close enough that a renderer can target both (granted, with some glossing, wrapping, and ifdefs in a few areas).
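
the crudest form of the glossing meant here (illustrative only; USE_GLES2 is a made-up build flag, and a real engine hides rather more than just the headers):

```cpp
#ifdef USE_GLES2
  #include <GLES2/gl2.h>
  // ES2 path: no fixed-function matrix stack or immediate mode, so everything
  // goes through shaders and vertex buffers
#else
  #include <GL/glew.h>
  // desktop path: same shader/VBO code, plus whatever extensions are present
#endif

// shared rendering code below sticks to the common subset (glBindBuffer,
// glDrawElements, glUseProgram, ...), with the few divergent spots wrapped
// behind small helpers or further ifdefs
```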

That leaves 3 - Windows, Mac and Linux.  Now things get even muddier.
 
The thing is, there are actually two types of "platform" at work here - software platforms (already mentioned) and hardware platforms (NV, AMD and Intel).  Plus they don't have equal market shares.  So it's actually incredibly misleading to talk in any way about number of platforms; instead you need to talk about the percentage of your potential target market that you're going to hit.
 
Here's the bit where things turn upside-down.
 
In the general case for gaming PCs we're talking something like 95% Windows, 4% Mac and 1% Linux.  So even if you restrict yourself to something that's Windows-only, you're still potentially going to hit 95% of your target market.
 
Now let's go cross-platform on the software side, and look at those hardware platforms I mentioned.  By being cross-platform in software you're hitting 100% of your target market, but - and it's a big but - OpenGL only runs reliably on one hardware platform on Windows, and that's NV.  Best case (according to the latest Steam hardware survey) is that's 52%.
 
So, by being Windows-only you hit 95% of your target market but it performs reliably on 100% of those machines.
By being cross-platform you hit 100% of your target market but it performs reliably on only 52% of those machines.
 
That sucks, doesn't it?

I have generally had good enough success with OpenGL on ATI cards, so no huge issue here (previously, I was doing a lot of development using an ATI card, but currently I am using an NV card).

the big suck usually comes up with Intel chipsets, but they don't really work well in general IME.

Now, maybe you're developing for a very specialized community where the figures are skewed.  If so then you know your audience and you go for it.  Even gaming PCs (which I based my figures on) could be considered a specialized audience, but in the completely general case the figures look even worse.  There's an awful lot of "home entertainment", "multimedia" or business-class PCs out there with Intel graphics, there's an awful lot of laptops, there's an awful lot of switchable-graphics monstrosities, there's an awful lot of users with OEM drivers, there's an awful lot of users who never upgrade their drivers.
 
And that's the final reality - it's not number of platforms that matters; that doesn't matter at all.  It's percentage of target markets, and outside of specialized communities being cross-platform will get you a significantly lower percentage than being Windows-only.

doesn't do much for those of us who *do* use Linux sometimes though.

it also ignores, however, the possibility that the Windows branch of a 3D engine can also use D3D, if needed, while keeping an OpenGL backend around for portability.

but, OpenGL makes a good baseline, if a person doesn't want to be locked to a single target.

or, they can use OpenGL-ES, if targeting the cell-phone or browser-games market.


it is like, most of the world still uses 32-bit x86 for apps, but writing code that only works on 32-bit x86 still isn't a good idea.
"what if we need to build for 64-bits? or on a target running ARM?", "who ever heard of such a thing!".

doesn't mean a person can't have target-specific code (such as for performance or similar), but it shouldn't really be mandatory for basic operation either.

wintertime    4108

I knew at the start this would turn into an OpenGL vs D3D debate. And when I see them I ultimately think how pointless they are, because what's actually running all those fancy graphics is the hardware from NV or ATI and the drivers made by NV and ATI; those APIs are just a thin shell around this (if they weren't thin they would be inefficient), whose sole purpose is to unify access to these different drivers and hardware. And if one of those two APIs gives access to more types of systems, it seems to me that it fulfills that purpose of unifying better.

Now sure, you can target a large percentage of all machines with that other API and it looks shiny at the moment, but that's like when people thought "oh, those Enron shares fare so well" and put all their money into them; at that moment it may have looked good, but if you make yourself dependent on a single company you don't control, it could all be over in a day, a month or a year, and even though it's very unlikely this company goes under, no one wants to bet everything on that. That applies not only to this; I feel it's the same with many other choices, like when people program only for iOS and then possibly realize their app won't get accepted for whatever obscure reason.

 

 

Now, for the thread starter this is all irrelevant. He is still young and just learning programming, so he could just pick whatever he feels comfortable with; if he doesn't feel comfortable with Windows, he could lose interest in programming by pressuring himself into using it, even if he will need to know about it later.

mhagain    13430

I knew at the start this would turn into an OpenGL vs D3D debate. And when I see them I ultimately think how pointless they are, because what's actually running all those fancy graphics is the hardware from NV or ATI and the drivers made by NV and ATI; those APIs are just a thin shell around this (if they weren't thin they would be inefficient), whose sole purpose is to unify access to these different drivers and hardware. And if one of those two APIs gives access to more types of systems, it seems to me that it fulfills that purpose of unifying better.

 

That's certainly true, and it's obvious when you see certain proponents of either API making claims like "this game has better image quality with OpenGL" or "that game sucks because it doesn't use D3D11" that these people really don't know what they're talking about.

 

the big suck usually comes up with Intel chipsets, but they don't really work well in general IME

 

It's dangerous to underestimate Intel.  They've been quietly getting better over the past few years, and a HD3000 or 4000 is actually quite a competent and capable chip.  Even going back as far as something like the 915, they had a SM2 part that was perfectly good enough for lighter-weight rendering work.  A bit of a blip with their first hardware T&L parts, but those days are over.

 

They're not currently competitive with the big guns, of course, but if the trend continues (and all indications are that Intel are serious about and committed to this) then within a coupla hardware generations they're going to have something that just may upset the status quo a little.

cr88192    1570


I knew at the start this would turn into an OpenGL vs D3D debate. And when I see them I ultimately think how pointless they are, because what's actually running all those fancy graphics is the hardware from NV or ATI and the drivers made by NV and ATI; those APIs are just a thin shell around this (if they weren't thin they would be inefficient), whose sole purpose is to unify access to these different drivers and hardware. And if one of those two APIs gives access to more types of systems, it seems to me that it fulfills that purpose of unifying better.

 
That's certainly true, and it's obvious when you see certain proponents of either API making claims like "this game has better image quality with OpenGL" or "that game sucks because it doesn't use D3D11" that these people really don't know what they're talking about.


as I understand it, the main thing people do with either API at present is mostly using it to draw lots of big triangle arrays, and deal with textures and shaders (and occasionally render-to-texture and other things).

this stuff should then mostly boil down to the hardware.

there is a lot of the legacy functionality in OpenGL, but much of it is now either deprecated, or absent in GL-ES2, and in my engine most of this has since been moved over to wrappers anyways, and much of the lower-level state is managed by a "shader"/"material" system, ...


the big suck usually comes up with Intel chipsets, but they don't really work well in general IME

 
It's dangerous to underestimate Intel.  They've been quietly getting better over the past few years, and a HD3000 or 4000 is actually quite a competent and capable chip.  Even going back as far as something like the 915, they had a SM2 part that was perfectly good enough for lighter-weight rendering work.  A bit of a blip with their first hardware T&L parts, but those days are over.
 
They're not currently competitive with the big guns, of course, but if the trend continues (and all indications are that Intel are serious about and committed to this) then within a coupla hardware generations they're going to have something that just may upset the status quo a little.


I have a 2009 laptop, and it has an Intel chipset ("Intel GMA" / "Intel Mobile Graphics").
its graphical performance is... not exactly good... along the lines that the newest games it plays well are Quake 2/3 and Half-Life.
Half-Life 2 performance was pretty bad, Portal doesn't work, Doom3 is pretty dismal as well, ...
Minecraft is "barely usable", ...

maybe "Intel HD" is better, dunno, don't have a newer laptop...

