Game development on: Linux or Windows

Started by
45 comments, last by cr88192 11 years, 2 months ago

Use Unity 3D; you can build for PC, Mac, and Linux just by changing an option :)


If you're going to use something like C#, you should look at MonoGame. You would be able to develop your game on Linux and very easily port it over to Windows (in most circumstances I think MonoGame has over 95% code compatibility).

Linux is still great to fiddle with, for multiple reasons. It is the most widely used operating system in the high-performance computing world, and it is the most important system to know if you administer HPC clusters, data centers, web hosting services, university IT, hospital IT, or work at companies with a Unix history... you name it. Linux is handy.

Next thing: it is great to learn as a student, because later on Linux takes a lot of time, has few commercial uses, and even at home it gets tiring. Get a wife and children and Linux is almost forgotten the first time you find yourself opening a .conf file.

BUT it feels so good to have another engine running the whole damn thing under the hood. It feels neat. Windows carries very old hysterical raisins (historical reasons), and some parts are just bloated and slow, whereas Linux has been refactored a hundred times to get where it is now. But let's not slide down into troll-land.

What Spiro said could not be more true, except that Android has its SDK available on Windows, Linux, and OS X. That seems pretty logical: Android is a Linux distribution, Google has Linux expertise, and they are a web company...

A last word about university: if you go to a respectable curriculum with a bit of history, and not some new-age private school that will hand you 100 certifications from Sun, Oracle, SAS, Microsoft and other ridiculous papers that are pure management bullshit, you'll learn the Unix way, because computer science originates from there, and Windows has just been re-inventing the wheel in a square shape.

What I mean by that is that by knowing how Linux works (use Debian for that, not Ubuntu, because with Ubuntu you'll see nothing but cute GUIs), you can take a step back and get a more canonical view of computer science, which I believe is great to have when you come back to the Windows world.

Not only that, but for university work you'll have to use ssh to log on to the university servers, and the proper way to do that is from Linux (ssh -X, zsh, csh...).

However, for pure graphics, Linux is a PITA if you don't have THE graphics card for which you COULD have "nice" drivers, provided your distribution lets you install them. (Hint: get an NVIDIA card and don't be afraid of having to rebuild your kernel if you are on Debian, or just use Ubuntu; it's easier, at least at first.)

There are difficulties in getting correct acceleration, and the multiplicity of systems in place in the community doesn't help (Gallium, DRM, DRI, Mesa, Xorg, Compiz, and the driver hell: nouveau, renouveau, fglrx, nv, nvidia... and I'm not even talking about the issues with dual screens; that makes me cry).

But it's a lot of fun :)

Also, the compilation process is just so much simpler than on Windows.

"apt-get install build-essentials" is the only thing you need before you can code.

I don't know how many URLs you have to browse before you can download a compiler on Windows, or how difficult it will be to set up all the libraries you need to link correctly with your project.

On Linux it's often all prepared: you have autotools and CMake, everything is tightly organized in the distribution, and libraries all install in the same place, so CMake package finders never get lost, whereas on Windows...

For example, take Boost, the most useful set of C++ libraries ever: you guessed it, a single apt-get install and you can use it in your code. On Windows you are usually HOURS away: you need to build it, configure the projects... aaaargh, the pain, I can still feel it. Fortunately there is a guy who provides binary packages for Windows, but they only match your compiler if you're lucky.
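To give an idea of how little setup that means (a minimal sketch, assuming the Boost headers were installed through the package manager, e.g. something like "apt-get install libboost-dev"; the file name and build line are only illustrative), a header-only part of Boost needs nothing beyond the default include path:

// boost_join.cpp -- header-only Boost example, no extra link step needed.
// Build (assuming g++ and system-wide Boost headers):
//   g++ -std=c++11 boost_join.cpp -o boost_join
#include <boost/algorithm/string/join.hpp>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::vector<std::string> parts = {"linux", "windows", "mac"};
    // join() lives entirely in headers, so the system include path suffices.
    std::cout << boost::algorithm::join(parts, std::string(", ")) << std::endl;
    return 0;
}

Linked parts of Boost (for example Boost.Filesystem) would additionally need an -lboost_... flag, but on a distribution those libraries sit in the standard library path as well.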

Another thing: Emacs. Of course there are Vi people who will want to argue otherwise, but really knowing Emacs (or, say, Vi) will give you an edge in text-editing power over people who only know an IDE like Visual Studio (poor guys, they don't even know what they're missing). Downside: learning Emacs is looooong, and difficult, and almost impossible alone. It also requires some knowledge of Lisp to edit the unavoidable .emacs config file.

But you probably already know that.

Just for a closing word: many companies run servers, and those servers are great running Linux for the comfort of remote administration. So knowing Linux is, again, a plus. In mid-sized companies without dedicated admins, where anybody may do a bit of admin work from time to time, if you're the only one who knows how to configure iptables and do a little ./configure, make, make install, you'll have a serious edge in the eyes of management, particularly when you install an Apache server running some Django magic with a buildbot alongside a gitosis service... or whatever other stuff companies need (MediaWiki, mail servers, NFS, backups...).

If you go to work at some of the famous big companies later in your life:

- Google, already mentioned; they notably contribute to WebKit.

- Intel, very active in Linux development; because the kernel is easily recompilable, they can test lots of their CPU features there (cf. the powertop utility, the Intel C compiler...).

- Every single researcher out there, whether working on forests and nature (my sister-in-law did her thesis on a forest-growth model, with a demonstration app written in C in a Unix environment), or computer scientists, biologists, doctors... (cf. visualization toolkits like VTK, from Kitware Inc.).

- NVIDIA; the Fermi strategy has led them to Linux for the aforementioned HPC reasons.

- IBM, Red Hat, and Novell, to name the most famous.

There is an excellent report that shows this:

http://go.linuxfoundation.org/who-writes-linux-2012

I think the real question here is how deep into code you want to get. If you want to get into the nitty-gritty of every aspect of your code, you are going to end up having to lock in to a specific platform (i.e. Windows, Linux, Mac, etc.). It is true that it is possible to maintain cross-compatibility, but the more complex your game becomes, the more difficult it is. Frankly, unless you are planning on using someone's platform-specific engine, or building your own from scratch, I would avoid low-level APIs entirely.

Let me throw out a few options to look into that are both very simple to code for and provide cross-compatibility: ShiVa 3D, Unity, Blender. For someone starting out I highly recommend Blender. It is fully integrated, meaning you don't have to use one utility to make objects, another to make textures, another to manage code, and then pull it all together somehow; it's all in one place. It uses Python as its game language. Lastly, it is completely open source and free to download. It isn't meant for crazy complicated games, but as a place to start in game dev it is the only one I can recommend to absolute beginners. They have a few tutorial resources on their site (blender.org), but remember Google and YouTube are your friends.

If you prefer to use Linux/UNIX to develop your game (e.g. because you use it as your day-to-day OS), then you might have some success with wine-g++ and DirectX.
You probably won't be able to use closed-source engines with this solution, though, because I doubt they would be able to link with GCC objects.

Engines I have used that work pretty well on both platforms include Ogre3D and Irrlicht.

I have not had a great experience with Unity's Linux support. It only really works on the very latest distributions; Red Hat Enterprise Linux 6 couldn't run it due to incompatible glibc versions. This isn't really Unity's fault but is a symptom of using a closed-source engine. Linux doesn't really maintain backwards binary compatibility in the way Windows does. (Which is why NVIDIA and AMD's drivers tend to be problematic).

OpenGL works on every platform I have ever used, so I always strongly recommend it, even if some people tell you that it isn't quite as "good" as DirectX. As an indie developer, it probably won't even make a difference to you. Personally, I find it much easier to get started with.
Yes, personally I found OpenGL to be a little more accessible than D3D, and the portability is a bit of a plus point (partly because I develop for both Windows and Linux, and was also considering possibilities like Native Client and Android as well).

Not to say that everything about it is necessarily good, but it works.

I had generally been using full-featured OpenGL, including a fair amount of "legacy" functionality, but trying to migrate things to work with OpenGL ES is a bit of a challenge, mostly because a lot of functionality that was taken for granted no longer exists (not all of it for entirely clear reasons). As a result, a fair chunk of the renderer was recently migrated to wrappers: for the most part, the "fixed-function pipeline" is now largely faked via wrappers. Partly this was because wrappers were the path of least effort; it was a little easier to move the bits of code still using glBegin/glEnd over to wrappers than to decide whether to prune them or rework them to use vertex arrays or VBOs, and likewise I went with faking the transformation matrices, and so on.
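As a rough illustration of that wrapper approach (a minimal sketch, not the actual renderer code from this post; the ImmediateBatch name is made up, and it assumes a desktop GL or GLES 1.x context where client-side vertex arrays are available; on GLES 2.0 the same idea would feed glVertexAttribPointer and a shader instead):

// Faking glBegin/glEnd on top of client-side vertex arrays.
// Assumes a GL context is already current; on Windows, include <windows.h>
// before <GL/gl.h>.
#include <GL/gl.h>
#include <vector>

struct ImmediateBatch {                  // hypothetical wrapper type
    GLenum mode = GL_TRIANGLES;
    std::vector<GLfloat> xyz;            // accumulated vertex positions

    void begin(GLenum m) { mode = m; xyz.clear(); }

    void vertex3f(GLfloat x, GLfloat y, GLfloat z) {
        xyz.push_back(x); xyz.push_back(y); xyz.push_back(z);
    }

    void end() {                         // flush the batch via vertex arrays
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, xyz.data());
        glDrawArrays(mode, 0, (GLsizei)(xyz.size() / 3));
        glDisableClientState(GL_VERTEX_ARRAY);
    }
};

// Old call sites change only their spelling:
//   ImmediateBatch im;
//   im.begin(GL_TRIANGLES);
//   im.vertex3f(0,0,0); im.vertex3f(1,0,0); im.vertex3f(0,1,0);
//   im.end();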

But, in general, it isn't all that bad.


My exposure to D3D has generally been a bunch of awkwardness involving DX-version-specific COM objects, a PITA getting things linked correctly (since the Windows SDK and the DirectX SDK are separate, and it essentially amounts to hard-coding the DX SDK install path into the build files), and a fair bit of general inconvenience doing pretty much anything; with the code being largely platform-specific anyway, it doesn't seem like a good tradeoff.
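For a sense of what that COM-flavored setup looks like (a minimal D3D11 sketch, written here only for contrast and not taken from this post; it assumes the D3D headers and d3d11.lib are reachable through whatever SDK search paths have been configured, which is exactly the linking hassle being described):

// Typical D3D11 device/swap-chain creation; every object comes back as a
// version-specific COM interface pointer through an out-parameter.
#include <windows.h>
#include <d3d11.h>
// Link with: d3d11.lib

bool CreateDeviceAndSwapChain(HWND hwnd, IDXGISwapChain** swap,
                              ID3D11Device** dev, ID3D11DeviceContext** ctx)
{
    DXGI_SWAP_CHAIN_DESC sd = {};
    sd.BufferCount       = 1;
    sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;   // size taken from hwnd
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow      = hwnd;
    sd.SampleDesc.Count  = 1;
    sd.Windowed          = TRUE;

    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        NULL, 0, D3D11_SDK_VERSION,
        &sd, swap, dev, NULL, ctx);

    return SUCCEEDED(hr);    // the COM objects must later be Release()'d
}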

Most of the functionality I have needed can be found either in OpenGL or in the Win32 API (most of which is wrapped over anyway, via OS-specific code), making these generally a lot more convenient.

Advanced rendering features and high-level API design issues aren't really such a big deal in this case.


EDIT: did find this:
http://en.wikipedia.org/wiki/Comparison_of_OpenGL_and_Direct3D

Karsten_, on 09 Feb 2013 - 19:13, said:
Linux doesn't really maintain backwards binary compatibility in the way Windows does. (Which is why NVIDIA and AMD's drivers tend to be problematic).

Actually, Linux distributions are binary compatible with each other, and the LSB mandates that they support the old ABIs (which means they are always backwards compatible). Drivers are a different matter, since the kernel interfaces change frequently, but that goes for any OS: Microsoft has changed its kernel interface with almost every kernel release it has made, and it breaks driver compatibility almost every time.

If you want to use proprietary drivers on Linux and avoid problems, the only thing you have to do is use the kernel that ships with the OS (and use an OS that doesn't push out new kernel versions as part of its normal update routine, or at least one that installs new versions of any proprietary driver you're using at the same time).

The fact that Unity3D doesn't work well with RHEL 6 has nothing to do with backwards compatibility; glibc is backwards compatible these days (it wasn't back in the 90s, but this isn't the 90s), but old versions of glibc do not magically support software that requires newer glibc versions. (Just like you can't run a game that requires D3D11 on Windows XP.)

If you want to run modern software on Linux, do not use RHEL; it is ancient before it even gets released (RHEL 7 should be able to run games made with the current version of Unity3D). It's great if you need stability, but if Microsoft did as Red Hat does, the latest Windows release would be Windows 2000 SP13, and the only feature updates we'd get would be for things like Hyper-V or MSSQL.

Personally, I wouldn't use Unity3D to target Linux today, though. If I sell a Linux game I have to support it, and offering an unsupported Linux client to those who buy the game for Windows or Mac is pretty pointless. Seeing how many problems Unity3D has had getting Android support working reasonably well, I'd prefer to wait until others have gone ahead and exposed most of the pitfalls; once that is sorted I might consider supporting Ubuntu and possibly Mint.
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!

cr88192, on 12 Feb 2013 - 06:47, said:
PITA getting things linked correctly (since the Windows SDK and DirectX SDK are separate, and it essentially amounts to hard-coding the DX SDK install path into the build-files)

Huh?
Adding linker/header paths to the IDE search directories is fairly standard practice and if you have to hard-code anything you’re doing it wrong.
If you are not using IDEs and are using makefiles directly, firstly, that's just pain you are bringing onto yourself and you have no one else to blame. Secondly, you can still use environment variables ($(DXSDK_DIR) would be a good one!) to avoid hard-coding paths.

If the fact that the Windows SDK and the DirectX SDK are separate (as they very-well should be) caused you even the slightest inconvenience, I think you need to gain a bit more experience in general programming, because linking to libraries is a fact of life in the world of programming. The concept of “search paths” exists whether you are using an IDE or raw makefiles, and every programmer should know about this at an early age.

Speaking for myself, my first word as a child was “Mamma”.
My second was “Chocolate cake”.
My third was “Search paths”.


L. Spiro


If the fact that the Windows SDK and the DirectX SDK are separate (as they very-well should be)

Although they aren't separate any more; June 2010 was the last DX SDK update for DX11.
With Windows 8, DX/D3D is now part of the platform SDK and will be updated (or not) as that is updated :)


L. Spiro said:
Adding linker/header paths to the IDE search directories is fairly standard practice and if you have to hard-code anything you're doing it wrong. [...] You can still use environment variables ($(DXSDK_DIR) would be a good one!) to avoid hard-coding paths.


Many people still build from the command line using GNU Make, FWIW...
This has the advantage that many core files of the project (much of the Makefile tree) can be shared between OSes
(versus, say, a Visual Studio project, which is only really useful to Visual Studio...).


But, yes, although a person can get it linked, it is less convenient given that it isn't kept along with all the other OS libraries.
Like, in the top-level Makefile, a person may need something like:
export DXSDK="C:\\Program Files..."

as well as passing this back to CL as part of the CFLAGS and similar.
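For instance, such a Makefile might continue roughly like this (purely a hypothetical sketch continuing the fragment above; the flag spellings assume cl.exe/link.exe, and the library names are only examples):

CFLAGS  += /I"$(DXSDK)/Include"
LDFLAGS += /LIBPATH:"$(DXSDK)/Lib/x86" d3d11.lib dxgi.lib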

Not that it can't be done, but a person can just as easily not do so, and instead just use the core libraries (those provided by the Windows SDK for Windows builds, which happen to include OpenGL and the Win32 API, but not typically DirectX).

But the bigger question is: how worthwhile is it to have a big dependency for something that is largely Windows-specific anyway (and can't really be used to any non-trivial degree without putting portability at risk)?...

This topic is closed to new replies.
