
kuramayoko10

Posted 12 November 2012 - 09:20 AM

Let's add some historical facts here as well. I've read many posts in this thread from people who had no idea how those OSes were designed or why they were designed in the first place.

Warning: kind of long post

Back in the 1960s, scientists at Bell Labs (then AT&T's research arm) were working on a system called Multics. However, the system was growing too large and complex, and Bell Labs pulled out of the project.
Some of the researchers who had worked on Multics (notably Ken Thompson and Dennis Ritchie) still wanted a system that would be easier to program for and more cooperative (people could work together on shared files).

Ken Thompson found an old PDP-7 computer and, while he still had access to the Multics prototype, started rewriting a stripped-down version of the system for it. There is also a fun story that he had played a game (Space Travel) on one machine, but it wasn't available on the newer ones Bell Labs was buying, so he wanted to rewrite it for a system that could run on any machine.
Anyway, this new system he was writing was dubbed Unics, as a pun on Multics (look up the acronyms if you're curious). At first it didn't have any financial support from Bell Labs, since it had no commercial value.

Later it was renamed Unix because it could support multiple simultaneous users. They worked on porting it to the newer PDP-11 machines and on adding a better text interface. At this point they were also designing the C language, because they needed a language that was easier to program in (the first Unix was written entirely in assembly).

Now, unfortunately for Bell Labs, they couldn't sell the robust Unix system because of an old antitrust settlement they had signed with the government. So they started licensing it to universities. By this point it was the 1970s, and Unix was widely distributed among universities, where everyone was enjoying using it.
(For those who don't know the power of the shell, look for a video on YouTube about the release of Unix. The shell is what makes Linux users and programmers so powerful.)
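To give a quick feel for that power (a toy example of my own, not from any video): the shell lets you chain small single-purpose programs into pipelines that solve problems no single tool was written for. For instance, ranking the words in a file by how often they appear:

```shell
# Create a small sample file (hypothetical path), then rank its words by frequency:
printf 'unix multics unix bsd unix bsd\n' > /tmp/words.txt

# tr splits words onto separate lines, sort groups duplicates,
# uniq -c counts each group, and sort -rn ranks by the counts
tr ' ' '\n' < /tmp/words.txt | sort | uniq -c | sort -rn
```

Each tool does exactly one thing; the pipe (`|`) composes them. That composition is the design philosophy Unix introduced.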

Also around this time, people at the University of California, Berkeley were branching Unix into what we now call the Berkeley Software Distribution (BSD).
In the 1980s the Bell System was broken up, and AT&T could now commercialize the Unix product (it had improved a lot since its first version). And so it did: anyone who had the system was forbidden from using it without paying for a very expensive license. This move almost killed Unix.
Because of that, many people were angry, and one prominent hacker of the time, Richard Stallman, started the GNU project. GNU is a recursive acronym for 'GNU's Not Unix', and it aimed to create a complete OS that would be free to distribute.
Later on, Stallman created the Free Software Foundation, and the GNU project already had drivers, daemons, and other components, but no stable kernel - supposedly because he wanted the perfect microkernel (the Hurd).

In 1991, Linus Torvalds wanted to access some Unix files on the university servers, so he started working on a terminal emulator for the machine he had. Because the GNU kernel had not been released, and the free BSD derivatives (which later became FreeBSD and OpenBSD) were still in development, he started working on his own OS (just for fun).

Linus says that he always wrote his own tools; for example, he started out programming in machine code until he found out that assembly language existed and that he could write his own assembler. So he wrote his own assembler...
Anyway, he worked on a monolithic kernel, which is much easier to implement, writing it in C and assembly and compiling it with the already-released GNU GCC compiler tools.
When he released it (without any commercial intention), Stallman liked it and it was integrated with the GNU system.
Today the Linux kernel can be found on GitHub, and it is one of the biggest collaborative programming efforts in computer history, built using only free software (free to distribute).

Today, GNU/Linux is a system that uses the Linux kernel together with the GNU applications and utilities.

Also note that the Mac OS system is not a direct descendant of the original Unix. It started with NeXTSTEP, which is a mixture of BSD and the Mach kernel.
It has an integrated BSD layer that follows the Unix standards, and Apple acquired a certification that verifies its UNIX conformance.

Now, for more history on the Mac OS vs. Windows development, go watch Pirates of Silicon Valley.

tl;dr
Conclusion: Unlike DOS, Windows, and Mac OS, the Linux kernel was never aimed at commercial distribution. Linus didn't develop it so that he could run a company that would sell millions of copies and make him a billionaire. Naive or not, he did (and still does) an excellent job on the system.

