Linux Frustration

Started by Horatius83
17 comments, last by Sander 18 years, 2 months ago
I downloaded Ubuntu Linux a few days ago, and before that I had been using SuSE 9.1; in both cases I couldn't get the 'make' command to work. I've researched how to install software on Linux, but it seems to involve an elaborate voodoo ritual of ./configure, make, and other commands, all while checking poorly written documentation for some hint of how to make it all work. In the course of installing the Anjuta IDE I've had to install an obscene number of packages from the gnu.org site (autoconf, automake, glib, libtool...), none of which I can build, and when I just try to use gcc from the command line on a "hello world" program I get a couple of hundred errors.

Questions:
1) Is there some package I have to install or magic words I have to type just to get 'make' to work, and if so, why (...) wasn't it installed by default?
2) Is there some design issue that precludes Linux programs from installing themselves? Aside from .rpm files I can't find any examples of this, and I've had mixed results from Ubuntu's repository feature.
"Think you Disco Duck, think!" Professor Farnsworth
(sudo) apt-get install build-essential

Debian (and therefore Ubuntu) doesn't ship with the GNU compiler toolchain because most people don't need it every day. If you want to build against any libraries, you'll also want to apt-get install the corresponding library-dev package before you attempt to build your program, so that the headers end up in the right locations.

EDIT: You'll find a lot of software using the APT system is simple enough to install with a plain "apt-get install program-name" (if you're not root, use "sudo apt-get install program-name", which will ask for your password). However, unlike Windows, Linux does not intrinsically have a software install feature; Linux distributions get around this by rolling their own (APT, RPM, or simply tar.gz files) or by adopting one that's already in place (as Ubuntu did with APT).
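For example, a session on a fresh Ubuntu or Debian box might look roughly like this (hello.c and libsdl1.2-dev are only placeholders; substitute whatever you're actually building):

$ sudo apt-get update                   # refresh the package lists
$ sudo apt-get install build-essential  # pulls in gcc, g++, make and the libc headers
$ sudo apt-get install libsdl1.2-dev    # example: development headers for a library you build against
$ gcc hello.c -o hello                  # a "hello world" should now compile cleanly
$ ./hello

The -dev packages are the piece people usually miss: the runtime library may already be installed, but the headers needed to compile against it are not.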

Ubuntu also ships with a tool called Synaptic, which makes installing packages through a GUI easier.

Think of the APT system as Linux's InstallShield.
The ./configure bit means you're running a configuration script. The configuration script sets up the application's makefile for your specific system. The make part will just build the first target in the makefile.
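To make that concrete, here is a minimal, hand-written makefile sketch for the "hello world" case (not taken from any real project):

# Makefile (recipe lines below must start with a tab, not spaces)
CC = gcc
CFLAGS = -Wall

hello: hello.c
	$(CC) $(CFLAGS) -o hello hello.c

clean:
	rm -f hello

Running plain "make" builds hello because it is the first target in the file; "make clean" runs the clean target by name.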

When you do things with make, you're compiling the software from source for your machine. Alternatively, you can install precompiled binaries more easily, the way ciroknight says.

1) In most distributions "make" comes pre-installed. In some it doesn't, usually when the distribution is trying to abstract everything away from the user so it can be "easier" and more noob-friendly. The problem is that Linux is a complete geek mess (this phrase may cause some flaming, but I am a Linux fan and this is the truth). Some distributions will make everything abstract, clean and simple, but if you want to step outside of that (and Linux, no matter what, makes you want to do this), you have to reconfigure everything. Linux is the best operating system in terms of software availability, user support and code base (IMO). However, it is also the most incompatible OS I have ever seen. Linux will be a pain; get that written down. If you don't do it the Linux way, you won't find an easy way out.

Linux is an OS where you have full control. You have kernel configurations, desktop environments and applications to choose from. Having a distribution that abstracts away the purpose and soul of Linux means you have to stick to what the distribution (and its community) offers. Most of them offer precompiled libraries and packages (.rpm, .deb, the Portage tree, etc.), clean and simple. If you want to go somewhere beyond what that distribution offers, you can. That's what Linux is all about: alternative choices (it's what people love about Linux, and it's what people hate about Linux). The problem is that most distributions abstract things so much you don't have a clue which "choices" they made for you.

The solution is always the same: the internet. Linux has a gigantic community. You WILL find help; all you have to do is search. Distributions mostly offer a starting place (like precompiled binaries), so check your distribution's forums on package management and package installation (that is how the distribution abstracts its software).

The dominant form of installation is the traditional "./configure && make && make install", which basically means: configure the build scripts so the code can compile on my system (and report whether that is even possible; sometimes libraries are missing), compile it, and copy it into my system. It could be plain and simple, but it depends on whether your system is set up to support it. (That is why I chose Gentoo: I can compile and monitor everything without going through all the Linux From Scratch trouble, read: repetitive typing.)
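For instance, most autotools-generated configure scripts accept a standard --prefix option, which is handy if you don't have root or don't want a hand-built package mixed into the system directories (the ~/local path here is just an example):

$ ./configure --prefix=$HOME/local    # install under ~/local instead of /usr/local
$ make
$ make install                        # no root needed when the prefix is in your home directory
$ export PATH=$HOME/local/bin:$PATH   # so the shell can find the freshly installed binaries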

2) As I said above, check out Gentoo. You only install what you want (which also means you have to install a lot to get things working the way you want). Read the Linux From Scratch book; you do learn a lot. But if all you want is a stable system without the geekiness that makes Linux fun, you just have to find the right distribution and install the packages you want :)
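On Gentoo the package manager is Portage, driven by the emerge command, which resolves dependencies and then compiles each package from source (the vim package below is just an example):

# emerge --sync                  # update the local copy of the Portage tree
# emerge --ask app-editors/vim   # show what would be installed, then build it after confirmation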

Bottom line: Linux is an acquired taste. You have to run into a lot of errors and bugs to learn what is happening and figure out how to manage the OS and the computer. Remember: the internet is your friend (as is linuxquestions.org). Hope this helps,

JVFF

PS: After I finished typing I realized this has a lot of personal opinions and rants. It means I may have gotten a lot of things wrong, and it probably disagrees with other perspectives, so feel free to criticise. I need to learn more :)
ThanQ, JVFF (Janito Vaqueiro Ferreira Filho)
If the make command is required to install programs, I'd say it's pretty essential. Thanks for the help; I got g++ working at least.
"Think you Disco Duck, think!" Professor Farnsworth
Quote: Original post by Horatius83
If the make command is required to install programs, I'd say it's pretty essential. Thanks for the help; I got g++ working at least.


Make isn't essential to installing most programs, and so neither are the tools that go along with it, such as GCC and G++.

Also, you're talking about a huge set of software to support a tiny fraction of software that ships uncompiled, at least with the two distros you mentioned. Gentoo, as the other poster mentioned, is a self-compiled operating system, so it does install GCC (bootstrapping from a binary version it ships with, then recompiling it) and Make.

But really, all you have to do is apt-get install build-essential on Debian, and everything should be in place. If you head over to the Ubuntu forums, they can tell you specifically what all is included in that package; I can't off the top of my head, because that one command is simply how I set up my own machines if and when I ever have to (which I haven't done in over a year now).
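You can also ask APT itself what the build-essential metapackage covers:

$ apt-cache show build-essential      # description and version of the metapackage
$ apt-cache depends build-essential   # the packages it pulls in (gcc, g++, make, libc6-dev, ...)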
When I used Slackware (a few years ago) I remember installing packages the old way:

# tar -xvzf package-name-x.x.x.tar.gz
# cd package-name-x.x.x
# ./configure [options go here]
# make

// did it all go well? (0 means success)
# echo $?
0

// install system-wide...
# make install

Oh boy, did I get into version conflicts after a few days :D
Thank god there are package managers available, like apt-get (Debian, Ubuntu) and emerge (Gentoo), to check dependencies and install/update/remove/whatnot all the extra packages needed to satisfy those dependencies.
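Day to day that boils down to a handful of commands (Debian/Ubuntu shown here; "foo" is just a placeholder package name, and Gentoo's emerge has equivalents for each):

$ sudo apt-get update         # refresh the list of available packages
$ sudo apt-get install foo    # install foo (placeholder) along with everything it depends on
$ sudo apt-get upgrade        # upgrade all installed packages to the newest available versions
$ sudo apt-get remove foo     # uninstall foo again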

PS. If you only want the make utility itself, the package is simply called make (automake is a different tool, used to generate makefiles from templates).
# apt-get install make

[Edited by - pulpfist on January 26, 2006 12:35:52 AM]
Quote: Original post by ciroknight
Also, you're talking about a huge set of software to support a tiny fraction of software that ships uncompiled


Tiny fraction? I find that the *vast* majority of software I install on Linux (I've tried various flavors of Redhat, Suse, and Ubuntu) must be compiled from source. As far as I can tell, Linux is pretty much by-design binary-incompatible even with itself, and the people who control the base system like it that way because it forces people to release source, despite the immense pain it causes for non-technical users.

I give Linux an honest try every few years. I install some likely-looking distro and start trying to get actual work done. It's always the same: the install goes pretty well for the basics, and then I spend three days per non-trivial app getting software installed because of the dependency explosion and mutually incompatible libraries, window managers, etc. Every time I'm told that the latest cool package manager solves all the problems, but it never does.
-Mike
I've been using KDE on Fedora Core 4 without any major headaches for a month or so now. Mind you, I still have my Windows XP laptop next to me. I've also been through a few Linux installs in the past, so I tend to wander down fewer blind alleys.

The comment about "by-design binary-incompatible even with itself..." smacks of conspiracy. It's nothing like that. Linux software is made by whoever wants to make it, which means there's a huge range of installation choices for the developer. Any given developer might not even care whether you ever get their software working on your machine. If you don't like that, stick to software that you find easy to set up. Here you rely on forums and trial and error in lieu of the marketing messages from commercial software.

I try to take the easy options when I can these days. If a tool works, I don't worry much about whether it works _precisely_ the way I'd have built it. I just use it and focus on the job I'm trying to use that tool for.
Quote: Original post by Anon Mike
Quote: Original post by ciroknight
Also, you're talking about a huge set of software to support a tiny fraction of software that ships uncompiled


Tiny fraction? I find that the *vast* majority of software I install on Linux (I've tried various flavors of Redhat, Suse, and Ubuntu) must be compiled from source. As far as I can tell, Linux is pretty much by-design binary-incompatible even with itself, and the people who control the base system like it that way because it forces people to release source, despite the immense pain it causes for non-technical users.


Funny, I use Linux every day. I work as a developer for a major Linux distro. The only time I ever have to configure/make/make install is when I'm working on code I'm writing. And I use a lot of software.

I come from decades of Unix development where configure/make/make install and hand-editing config files was the norm (as it was on Windows up to Win95). That is no longer the case. Any worthy distro, and most of them are worthy, is in fact easier to install, configure, and use than Windows.

The only excuse people have these days for installing from code on Linux is you're trying to impress chicks. Trust me, it doesn't work. Try apt-get install personality instead.

Stephen M. Webb
Professional Free Software Developer
