Archived

This topic is now archived and is closed to further replies.

Distributing for several OS


Recommended Posts

I'd like to know more about how distributing open source projects for different operating systems works. Many projects offer project files for several Windows IDEs. Are there purists developing on Windows machines who expect something else? I'd also like to know how it works for Linux/Unix and Mac OS. What do people developing on those platforms expect from a decent project? Custom makefiles? Project files for IDEs? What else is there? I've looked at a lot of projects but can't figure out what certain things do. Any pointers/opinions/explanations would be much appreciated.

---
If nothing works the way it should ... maybe you need a break!?
Get cracking and double-check the switch statements!!
Tolop|Andyart.de

I'm not quite sure what you're asking. Most projects I've seen have either a VC++ project file, a GCC makefile (or similar), or nothing at all.

Generally, the project should be organised in such a way that there aren't many steps a user has to go through to compile your program.

Most projects are organised with the C++ files in a 'src' folder and the headers in an 'include' folder. Installation/compilation is mostly a matter of adding the include folder to the include path and compiling the .cpp files.

Platform-dependent things are often separated out with #defines and compiler directives to ensure the program will compile correctly in that environment. For example, VC++ and many Windows-based compilers will #define _WIN32.

A simple:
#ifdef _WIN32
#include <windows.h>
#endif

will ensure the Windows headers are included. It's a very good idea to completely partition off any platform-dependent code into separate files/directories. So imagine you were writing a timer: Windows has GetTickCount(), which isn't available on Linux. You'd then create a CTimer_Win32 / CTimer_Linux class (based on a CTimer interface) that is totally dependent on that OS. Entry points work the same way: Windows can use WinMain(), while DOS/Linux use main(). By carefully crafting the defines, you can ensure that the correct files are included and the correct implementations are created.


#ifdef _WIN32
#include "Timer_Win32.h"
#else
#include "Timer_Linux.h"
#endif

// ... then in your code:

CTimer *timer;   // pointer to the base class

#ifdef _WIN32
timer = new CTimer_Win32();
#else
timer = new CTimer_Linux();
#endif

// get time, which will call the relevant code...

timer->getTime();



This is a very trivial example, so don't shoot me down. It just helped explain what I meant, I think.

What I meant is ... it would be possible to share only headers and source files,
then explain how people can start a new project in an IDE, add the files to it, and set the project options.
Alternatively, the source can come with project files for common IDEs,
which is more convenient for users of the library or program.

I'd like to hear from other developers what a good project should offer,
especially from people who don't use compilers like Dev-C++ and VC++.


Most projects I've seen offer source packages for Linux and compiled binaries for Windows. This works because most Windows systems are fairly homogeneous. There are many different flavours of Linux, though, and not every system comes standard with the required libraries, so you can't assume dynamically linked libs are there when they might not be.

For Linux/Unix systems, source distributions usually come as .tar.gz "tarballs" (or sometimes .zip archives). They typically include a configure script, which generates a makefile. Ideally, a Linux user should just be able to unpack the archive and type

./configure
make
make install

and that oughta do it. If you want to distribute software in a *nix environment, you should really learn how to set those up. Anything that doesn't come with a configure/make script doesn't get my time.

For Windows, I usually find projects come with VC or Dev-C++ project files. They should just unzip and compile as well.

Thanks. Things like that are what I need to know.
If I set up a project with KDevelop (not sure if I'll ever succeed) ...
isn't a configure script created automatically?
Would that only work for people with the same setup I have?

I've never used KDevelop and I'm no GNU Make expert, so I'm not sure what it has to offer. I think what most people do (quite frankly) is rip a configure script from some other project, change a few things, and use that. Sort of like we all do with our own code. Very rarely is it written from scratch.

What ./configure tries to do is find out whether you meet all the requirements to build the project. It checks that you have the right compiler, the right libs, files in the right place, permissions, all that. If everything checks out, it automatically creates a makefile for you. Then you run make, which is (more or less) guaranteed to work, since the makefile was generated by configure rather than written by hand.

If configure fails, it will (hopefully) tell you why, like: "Oops! You don't have SDL installed. Go install it and try again."

Of course, on Linux you could (and should) also bundle your compiled project into an .rpm (Red Hat) or .deb (Debian) package for easy pre-compiled installation, in addition to the source package. Just don't ask me how :-)
