Linux Frustration



I downloaded Ubuntu Linux a few days ago, and before that I'd been using SuSE 9.1; in both cases I couldn't get the 'make' command to work. I've researched how to install software on Linux, but it seems to involve an elaborate voodoo ritual of ./configure, make, and other commands, all while checking poorly written documentation for some hint of how to make it all work. In the course of installing the Anjuta IDE I've had to download an obscene number of packages from the gnu.org site (autoconf, automake, glib, libtool...), none of which I can build, and when I just try to use gcc from the command line on a "hello world" program I get a couple of hundred errors.

Questions:
1) Is there some package I have to install, or magic words I have to type, just to get 'make' to work? And if so, why (...) wasn't it installed by default?
2) Is there some design issue that precludes Linux programs from installing themselves? Aside from .rpm files I can't find any examples of this, and I've had mixed results with Ubuntu's repository feature.

(sudo) apt-get install build-essential

Debian (and therefore Ubuntu) doesn't ship with the GNU compiler toolchain because most people don't need it every day. If you want to use any libraries, you'll want to apt-get install the corresponding -dev package before you attempt to build your program, so that the headers end up in the right locations.

EDIT: You'll find that a lot of software on the APT system is simple enough to install with a plain "apt-get install program-name" (if you're not root and don't have admin privileges, use "sudo apt-get install program-name", which will ask for your password). However, unlike Windows, Linux does not intrinsically have a software-install feature; Linux distributions get around this by rolling their own (APT, RPM, or simply tar.gz files) or adopting one that's already in place (as Ubuntu did).

But, Ubuntu also ships with a tool called Synaptic to make installing packages through the GUI easier.

Think of the APT system as Linux's InstallShield.

The ./configure bit means you're running a configuration script. The configuration script sets up the application's makefile for your specific system. The make part will just build the first target in the makefile.

When you do things with make, you're compiling the software from source for your machine. You can just get precompiled binaries installed more easily the way ciroknight says.
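A tiny sketch of that "first target" behavior (the Makefile here is invented purely for illustration):

```shell
# Recipe lines in a Makefile must begin with a TAB, hence the printf '\t'.
printf 'first:\n\t@echo building the first target\nsecond:\n\t@echo built only by make second\n' > /tmp/Makefile.demo

# With no target named on the command line, make builds the first
# target it finds in the file ("first"), not the whole file.
make -f /tmp/Makefile.demo     # prints: building the first target
```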

1) In most distributions "make" comes pre-installed. In some it doesn't, usually when the distribution is trying to abstract everything away from the user so it can be "easier" and more newbie-friendly. The problem is: Linux is a complete geek mess (this phrase may cause some flaming, but I am a Linux fan and this is the truth). While some distributions will make everything abstract, clean, and simple, if you want to step outside of that (and Linux, no matter what, makes you want to), you have to reconfigure everything. Linux is the best operating system in terms of software availability, user support, and code base (IMO). However, it is also the most incompatible OS I have ever seen. Linux will be a pain; write that down. If you don't do things the Linux way, you won't find an easy way out.

Linux is an OS where you have full control. You have kernel configurations, desktop environments, and applications to choose from. A distribution that abstracts away the purpose and soul of Linux means you have to stick to what the distribution (and its community) offers. Most of them offer precompiled libraries and packages (.rpm, .deb, the Portage tree, etc.), clean and simple. If you want to go beyond what the distribution offers, you can. That's what Linux is all about: alternative choices (it's what people love about Linux, and what people hate about it). The problem is that most distributions abstract things so much you don't have a clue which "choices" were made for you.

The solution is always the same: the Internet. Linux has a gigantic community. You WILL find help; all you have to do is search. Distributions mostly offer a starting place (such as precompiled binaries), so check your distribution's forums for package management and package installation (this is how each distribution abstracts its software).

The most common form of installation is the traditional "./configure && make && make install", which basically means: configure the scripts so the program can compile on my system (and report whether that is even possible; sometimes libraries are missing), compile it, then copy it into my system. It could be plain and simple, but it depends on whether your system was configured to support it. (That is why I chose Gentoo: I can compile and monitor everything without going through all the Linux From Scratch trouble (read: repetitive typing).)
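To make the three-step dance concrete, here is a throwaway mock project. The stub configure script and all paths are invented; a real configure does far more (probing for compilers, headers, and libraries) before it writes the Makefile:

```shell
# Hypothetical miniature project to illustrate configure && make && make install.
mkdir -p /tmp/demo-src && cd /tmp/demo-src

# Stand-in "configure": real ones probe the system, ours just writes
# a trivial Makefile with an "all" and an "install" target.
cat > configure <<'EOF'
#!/bin/sh
printf 'PREFIX ?= /usr/local\nall:\n\techo built > hello\ninstall: all\n\tmkdir -p $(PREFIX)/bin\n\tcp hello $(PREFIX)/bin/hello\n' > Makefile
echo "configure: Makefile written"
EOF
chmod +x configure

./configure                              # step 1: generate the Makefile
make                                     # step 2: build the first target
make install PREFIX=/tmp/demo-prefix     # step 3: copy into place (no root needed with a user-writable prefix)
```

The real-world equivalent of that last line is `./configure --prefix=$HOME/.local`, which is how you install from source without root.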

2) As I said above, check out Gentoo. You only install what you want (which means you have to install a lot to get it working the way you want). Read the Linux From Scratch book; you'll learn a lot. But if all you want is a stable system without the geekiness that makes Linux fun, you just have to find the right distribution and install the packages you want :)

Bottom line: Linux is an acquired taste. You have to run into a lot of errors and bugs to learn what is happening and figure out how to manage the OS and the computer. Remember: the Internet is your friend (as is linuxquestions.org). Hope this helps,

JVFF

PS: After I finished typing I realized this has a lot of personal opinions and rants. That means I may have gotten some things wrong, and it may disagree with other perspectives, so feel free to criticise. I need to learn more :)

Quote:
Original post by Horatius83
If the make command is required to install programs I'd say it's pretty essential, thanks for the help, I got g++ working at least.


Make isn't essential to installing most programs, and thus neither are the supporting libraries and programs around make, such as GCC and G++.

Also, you're talking about installing a huge set of software to support the tiny fraction of software that ships uncompiled, at least with the two distros you mentioned. Gentoo, as the other poster mentioned, is a self-compiled operating system, and it does install GCC (in fact, it compiles GCC with a binary version it ships) and make.

But really, all you have to do on Debian is apt-get install build-essential and everything should be in place. If you head over to the Ubuntu forums, they can tell you exactly what's included in the package; I can't off the top of my head, even though that's how I typically set up my own machines if and when I have to (which I haven't done in over a year now).

When I used Slackware (a few years ago) I remember installing packages the old way:

$ tar -xvzf package-name-x.x.x.tar.gz
$ cd package-name-x.x.x
$ ./configure [options go here]
$ make

# did it all go well? (0 means success)
$ echo $?
0

# install system-wide (usually needs root)...
$ make install
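The `echo $?` step above reads the exit status of the last command; the && operator builds that check in, which is why the sequence is often written on one line as ./configure && make && make install. A minimal illustration:

```shell
# $? holds the exit status of the last command; && and || branch on it.
true  && echo "previous command succeeded"   # && runs only on status 0
false || echo "previous command failed"      # || runs only on failure
```

So "make && make install" never installs a build that failed, with no manual status check needed.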

Oh boy, did I get into version conflicts after a few days :D
Thank god there are installation managers like apt-get (Debian, Ubuntu) and emerge (Gentoo) available to check dependencies and install/update/remove all the extra packages needed to satisfy them.

PS. If you want just the make utility on Debian/Ubuntu, the package is simply called make (build-essential pulls it in along with gcc and friends); automake is a different tool, used by developers to generate Makefiles.
$ apt-get install make

[Edited by - pulpfist on January 26, 2006 12:35:52 AM]

Quote:
Original post by ciroknight
Also, you're talking about installing a huge set of software to support the tiny fraction of software that ships uncompiled


Tiny fraction? I find that the *vast* majority of software I install on Linux (I've tried various flavors of Red Hat, SuSE, and Ubuntu) must be compiled from source. As far as I can tell, Linux is pretty much binary-incompatible even with itself, by design, and the people who control the base system like it that way, because it forces people to release source despite the immense pain that causes for non-technical users.

I give Linux an honest try every few years. I install some likely-looking distro and start trying to get actual work done. It's always the same: the install goes pretty well for the basics, and then I spend three days per non-trivial app getting software installed because of the dependency explosion and mutually incompatible libraries, window managers, etc. Every time, I'm told that the latest cool package manager solves all the problems, but it never does.

I've been using KDE on Fedora Core 4 without any major headaches for a month or so now. Mind you, I still have my Windows XP laptop next to me. I've also been through a few Linux installs in the past, so I tend to wander down fewer blind alleys.

The comment about "by-design binary-incompatible even with itself..." smacks of conspiracy. It's nothing like that. Linux software is made by whoever wants to make it. That means there's a huge range of installation choices for the developer, and any given developer might not even care whether you ever get their software working on your machine. If you don't like that, stick to software that you find easy to set up. Here you use forums and trial and error in lieu of the marketing messages from commercial software.

I try to take the easy options when I can these days. If a tool works, I don't worry much about whether it works _precisely_ the way I'd have built it. I just use it and focus on the job I'm trying to use that tool for.

Quote:
Original post by Anon Mike
Quote:
Original post by ciroknight
Also, you're talking about installing a huge set of software to support the tiny fraction of software that ships uncompiled


Tiny fraction? I find that the *vast* majority of software I install on Linux (I've tried various flavors of Red Hat, SuSE, and Ubuntu) must be compiled from source. As far as I can tell, Linux is pretty much binary-incompatible even with itself, by design, and the people who control the base system like it that way, because it forces people to release source despite the immense pain that causes for non-technical users.


Funny, I use Linux every day. I work as a developer for a major Linux distro. The only time I ever have to configure/make/make install is when I'm working on code I'm writing. And I use a lot of software.

I come from decades of Unix development where configure/make/make install and hand-editing config files was the norm (as it was in Windows up to Win95). It is no longer the case. Any worthy distro, and most of them are, are in fact easier to install, configure, and use than is Windows.

The only excuse people have these days for installing from code on Linux is you're trying to impress chicks. Trust me, it doesn't work. Try apt-get install personality instead.

$ apt-cache search personality
honeyd - Small daemon that creates virtual hosts simulating their services and behaviour
honeyd-common - Honeyd's honeypot documentation and scripts
libnet-ftpserver-perl - A secure, extensible and configurable Perl FTP server
linux32 - Wrapper to set the execution domain


Ok, that's just weird.

Quote:
Original post by Bregma
Funny, I use Linux every day. I work as a developer for a major Linux distro. The only time I ever have to configure/make/make install is when I'm working on code I'm writing. And I use a lot of software.

I come from decades of Unix development where configure/make/make install and hand-editing config files was the norm (as it was in Windows up to Win95). It is no longer the case. Any worthy distro, and most of them are, are in fact easier to install, configure, and use than is Windows.

The only excuse people have these days for installing from code on Linux is you're trying to impress chicks. Trust me, it doesn't work. Try apt-get install personality instead.


Which specific distros are you talking about? In XP if I want to install a program I just download the installer and double-click; if I could do that in Linux, that would be wonderful.

Quote:
Original post by Horatius83
Which specific distros are you talking about? In XP if I want to install a program I just download the installer and double-click; if I could do that in Linux, that would be wonderful.


With the Xandros distribution, for example, I can just download the package and double-click. I can also use the included package manager to grab the package from wherever and install it without having to find and download anything; on Windows, that only works for Microsoft's own packages. I couldn't, for example, grab and install the latest updated driver for my PC Card reader using Microsoft Update. I can with Xandros Networks.

On the other hand, I'm a command-line kinda guy (a picture is worth a thousand words. The command line is worth as many or as few words as I want). I usually install from the command line, simply using the package name. I am not familiar with anything in Windows that gives me that power and flexibility.

I believe similar facilities are available using Ubuntu, Novell/SuSE, Fedora, Mandriva, and others.

Quote:
Original post by Bregma
With the Xandros distribution, for example, I can just download the package and double-click. I can also use the included package manager to grab the package from wherever and install it without having to find and download anything; on Windows, that only works for Microsoft's own packages. I couldn't, for example, grab and install the latest updated driver for my PC Card reader using Microsoft Update. I can with Xandros Networks.

Well, I use Ubuntu, and every now and then I look for a specific application that's not so common and can't find it with apt. Or I find it, but it references a package that doesn't exist (this happens sometimes). Then I have to go wandering for it, only to find a Fedora rpm. The rpm depends on a bazillion other rpms, and so I keep hunting for a good while...

The system's good when it works, but it doesn't always. I prefer Windows' consistency: it's always a single method. That makes things much easier for developers and users alike.

If both methods were combined, it'd be a dream come true. On Windows, InstallShield is kind of trying this with their Updater, but it won't work the way they're currently heading, IMO. On Linux, I don't see binary compatibility and a unified package-distribution method happening either, unfortunately.


Quote:
On the other hand, I'm a command-line kinda guy (a picture is worth a thousand words. The command line is worth as many or as few words as I want). I usually install from the command line, simply using the package name. I am not familiar with anything in Windows that gives me that power and flexibility.

Microsoft Shell (MSH) for you. Scripting on steroids, with full .NET CLR access.

Have you tried to actually use MSH? I'm not impressed yet. It's a very steep learning curve (from where I sit at least), and it seems to me like they're reinventing a lot of functionality that's already been done.

On Windows, I like the XP and 2000 command line or WSH. CMD has been greatly enhanced compared to earlier versions; most of the new functionality imitates (or improves on) comparable features in bash and other Unix shells. WSH, on the other hand, lets you write JScript or VBScript to access a lot of Windows functionality and ActiveX/OLE/COM stuff.

MSH will replace it I guess, but I don't see it as a great tool yet.

Quote:
Original post by Metaphorically
On Windows, I like the XP and 2000 command line or WSH. CMD has been greatly enhanced compared to earlier versions; most of the new functionality imitates (or improves on) comparable features in bash and other Unix shells.

Well, improving on comparable things in other shells is a good thing, isn't it? I've always used cygwin's bash for my scripting needs - CMD never cut it.

Quote:
WSH, on the other hand, lets you write JScript or VBScript to access a lot of Windows functionality and ActiveX/OLE/COM stuff.

Unfortunately, I dislike jscript, vbscript and javascript, so I've never been a big fan (though I do use it when I have to).

Quote:
Original post by Horatius83
In XP if I want to install a program I just download the installer and double-click; if I could do that in Linux, that would be wonderful.


Why would you want to do that? One thing Linux newcomers need to get used to is centralized package management. But, in my opinion, this is much safer than downloading software from anywhere on the web and just installing it, not knowing what it actually does. In most cases, I agree, it's not a problem. But in some cases, it is (spyware, viruses, you name it).

Now, when you look for a specific package that can't be found in the distribution's package tree, you're very likely to find it somewhere else. For Ubuntu specifically, there are lots and lots of repositories with unofficial/non-free/unsupported software.

And if you still can't find the package pre-compiled, you can of course download the source and install it manually. And the fact that most software is only available for Windows is certainly not Linux's fault. A good example of a binary installation is Quake III, IMHO, so they do exist.

I certainly agree with you that one has to get used to this way of package management. But once I got the idea, I got completely hooked and find it much more consistent than software management in Windows. For four years now I've been using Linux (virtually) exclusively, and I don't regret it.

Cheers,
Drag0n

Quote:
Original post by Drag0n
Quote:
Original post by Horatius83
In XP if I want to install a program I just download the installer and double-click; if I could do that in Linux, that would be wonderful.


Why would you want to do that? One thing Linux newcomers need to get used to is centralized package management. But, in my opinion, this is much safer than downloading software from anywhere on the web and just installing it, not knowing what it actually does. In most cases, I agree, it's not a problem. But in some cases, it is (spyware, viruses, you name it).

What about distribution rights? If I make a closed-source application that I want to run on Linux, and want to distribute it myself, I have to set up a package repository for every major package management system (AFAIK; correct me if I'm wrong). I don't like this from both the user and developer perspectives:
- The user has to add the repository of each vendor they use software from. On Windows, I have software from a *lot* of different vendors, and it certainly wouldn't be fun to hunt for their repositories when all I need from them is a single program. This is actually more work than with the unified-installer model because here I had to:
1) Go to their website
2) Add their repository
3) Install the software from the package manager

With a unified installer I just do #1, download the installer and that's it.

- The developer has to support a variety of package management systems, on top of supporting the various Linux distributions. When you go to Opera Software to download their browser, you're given a choice among a large number of distributions so that you get the package suited to yours. That's inefficient, IMO. It's basically like supporting different OSes with different releases, even though they all use the same kernel. Why not unify things a little more?

Quote:
Now, when you look for a specific package that can't be found in the distribution's package tree, you're very likely to find it somewhere else. For Ubuntu specifically, there are lots and lots of repositories with unofficial/non-free/unsupported software.

The non-free/unsupported repositories are rarely up-to-date, in my experience. Which makes sense, because they're not managed by the original vendor. Which means I still have to bypass the package system and go to the vendor's website to get the latest.

Quote:
And if you still can't find the package pre-compiled, you can of course download the source and install it manually. And that most software is only available for Windows is certainly not Linux' fault.

I disagree with this. On Windows, it is much easier to develop and deploy an end-user application that magically works across almost all Windows versions. A lot goes into this, from the availability of powerful tools (VS, for example), to binary compatibility across Windows versions, to a unified deployment model.

Quote:
Original post by Muhammad Haggag
What about distribution rights? If I make a closed-source application that I want to run on Linux, and want to distribute it myself, I have to set up a package repository for every major package management system (AFAIK; correct me if I'm wrong).


No. Just provide a package on your site. Most package managers have a way to install standalone packages as well. A tar.gz, an rpm, and a deb package should give you plenty of coverage across the various Linux systems (about all distros minus Gentoo and derivatives -- and I think even Gentoo can handle rpms and tar.gz's).

If you don't care about the distribution rights, but just about the closedness of the source (e.g. a closed-source freeware program), you could still try to get your package added to the repositories of the major distributions (the non-free ones, for example).

I think that for developers, the difference between the Windows EXE/installer way and the Linux package way isn't that big (unless you also open the source, in which case you'll deal with upstream contributors). For end users, installing a program from a non-trusted third party is slightly more difficult than on Windows (a good thing, IMHO), while installing applications from a trusted, known source is much easier than on Windows.
