Arild Fines

Why Bitkeeper(or the linux model) Isn't Right For Free Software


Through my interest in version control tools, I came across this article*. Its primary purpose is to argue why the BitKeeper model of version control (distributed repositories, with changesets propagated up a pyramid of maintainers) is wrong for typical open source projects, and it specifically discusses why the development model used for the Linux kernel is wrong. For the Linux kernel, every patch has to be filtered through the primary maintainer for the branch in question - if the maintainer doesn't like the patch, it's out. The author is a Subversion developer, and the article is obviously biased by that, but it still raises some good points. Some of the problems the author sees as caused by flaws in the Linux kernel development model are:
For Linux, the consequences of these limitations have been slow and unpredictable release schedules, poor stability of release branches, and a lack of important standards (for instance, no consistent kernel module ABI or even API within a release branch)
I also thought this was a pretty interesting consequence of Linux's god-like-maintainer model:
For instance, the mainline Linux kernel does not contain a kernel debugger because Linus won't allow it. "I don't think kernel development should be 'easy'. I do not condone single-stepping through code to find the bug."
In contrast, the author uses the Mozilla, BSD and Apache projects as examples of successful projects that use a more open model (and they all use CVS), in which a group of committers is collectively responsible for the health of the repository and any one of them can approve and commit patches. The "no kernel debugger because I don't like them" scenario would most likely not happen in such an environment. So - what do people think? It's hard to argue with the success of Linux, but is it really a good way to do open source software?

[*] http://web.mit.edu/ghudson/thoughts/bitkeeper.whynot

--
AnkhSVN - A Visual Studio .NET Addin for the Subversion version control system. [Project site] [Blog] [RSS] [Browse the source] [IRC channel]

I don't agree with him that the ABI/API is inconsistent because of the version control system used. I think it's inconsistent simply because the developers are more interested in improving the methodology, and think of that as a more important issue than a consistent ABI/API. (I'm not trying to say this isn't an important and highly debatable development issue, or whether it's a good or bad decision, or anything on those matters.) I don't see how distributing the "load" of version control across more committers, as in the classic CVS-like usage (CVS, Subversion, et cetera), would help the matter much either.

About the kernel debugger: it actually is in Linus's repository for a couple of architectures now (as of 2.6.2). But that's mostly irrelevant to the actual topic of the example: who gets to decide what code gets merged?

In terms of getting features added faster, the classic CVS-like usage model probably would help: instead of dividing maintainership of the code by the purpose of each file, it would be divided up in a "stratified" manner, allowing developers with more finely tuned purposes (someone implementing a debugger everywhere, or someone reimplementing locking everywhere, et cetera) to get code in faster. However, having the accountability of a "master developer" is nice.

I would agree that maintaining a patch across parallel repositories probably imposes a needless burden on its maintainers. Not all features are worthy of inclusion, but something must be done about those that are, so that they're more readily usable/maintainable. It's possible that the CVS-like usage would help here because of the potential general streamlining of single-feature/multiple-realm merges.

All of that kind of raises a possibility in my mind: why not approach it both ways and have pairs of maintainers for each piece of a file? There would be the as-is system of maintainers for "realms" of the code (filesystem, network drivers, et cetera) and then another layer of maintainers who can make changes pertaining to single features across all of the realms. I'd guess that it'd be harder to manage this with the "god-like-maintainer" model that's there now, which is part of the complaint in the first place. Since this is a more oligarchical approach, the CVS-like usage would indeed seem more apt to handle it. It seems to me that'd take advantage of both the "watchful and careful head maintainer" (who'd see and manage, if not necessarily hand-commit, the changes to their realm of the repository) aspect as well as the "get little things in fast" goal. But I'm just rambling now...

Interesting article. The most interesting thing, as I see it, is that Linus Torvalds so totally dislikes debuggers and other 'helpful tools'. Torvalds states that you should be 'careful' instead of relying on a debugger to catch the logical mistakes that you make. That may be true to some extent, since you _should_ think things over before you code them.

I guess this way of thinking works if you have insane programming skills and an immense understanding of computers in general. But how many people have these skills? It may seem egotistical not to allow kernel debugging in the mainline, but on the other hand people can use debugging tools as they like in other versions of the kernel.

The remark 'I don't think kernel development should be easy' is in itself stupid and childish, even if one of the most respected software developers around has stated it. I guess there is some context in which this remark is sound, but I have yet to find that context.

Interesting article. I didn't know that Linus had absolute control over what makes it into the mainline branch and what does not. You have to wonder how much Linus is questioned -- generally, from what I read, he makes good decisions, but he is human. His "no debugger" and "kernel programming is supposed to be hard" comments reek of adolescent elitism. And it is ridiculous that there is no standard ABI, actually.

People (actually, just /.) wonder why Linux is slow to catch on. Perhaps when it takes itself less seriously, others will start taking it more seriously.

quote:
Original post by Grul
Torvalds states that you should be 'careful' instead of relying on a debugger to catch the logical mistakes that you make. That may be true to some extent, since you _should_ think things over before you code them.

I guess this way of thinking works if you have insane programming skills and an immense understanding of computers in general. But how many people have these skills? It may seem egotistical not to allow kernel debugging in the mainline, but on the other hand people can use debugging tools as they like in other versions of the kernel.


Let's be fair though; the Linux kernel is an immensely mission-critical piece of software. You really don't want substandard programmers playing around with it. Nor would he, as the guy who takes ultimate responsibility for it, want to have to sift through patches from people who don't really know what they're doing but are able to maintain a pretense of productiveness through the aid of tools. Most of the top code quality books will tell you that reliance on a debugger makes for worse code. You should be able to single-step through code in your head. After all, if you don't know what effect a line of code will have, why did you put it there?

The attitude works for kernel development, but I agree it would be arrogant if you were writing an email client or something. You need different mindsets for developing different kinds of software. A kernel is definitely not a place where democracy makes any kind of sense.

[ MSVC Fixes | STL Docs | SDL | Game AI | Sockets | C++ Faq Lite | Boost | Asking Questions | Organising code files | My stuff | Tiny XML | STLPort ]

quote:
Original post by Kylotan
A kernel is definitely not a place where democracy makes any kind of sense.

But aren't the BSDs generally held to be of higher quality than Linux?

quote:
Most of the top code quality books will tell you that reliance on a debugger makes for worse code. You should be able to single-step through code in your head. After all, if you don't know what effect a line of code will have, why did you put it there?


What a load of bollocks.

It's possible to use a debugger without becoming reliant on it. "Single-stepping in your head" is utter crap. The entire reason you might use a debugger is because the model in your head does not match -- for whatever reason (compiler bugs, misunderstanding of the hardware, anything, really) -- what's actually happening.

Particularly with re-entrant kernels and the issues that arise as a consequence, it's extremely hard to accurately single-step "in your mind". It's a complicated dynamic system with scores of degrees of freedom. A debugger allows one to examine this dynamic state.

Not using a debugger simply makes life harder for oneself than it ought to be.

As for the quality of the coders, that entire argument seems specious. It doesn't matter if the coders are "substandard"; it matters whether the code is correct. How that correctness is achieved -- mental single-stepping, use of a debugger, divine inspiration -- is irrelevant.
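To make that concrete, here is a minimal user-space sketch (my own illustration, not kernel code and not from the article) of the kind of concurrency bug where "single-stepping in your head" falls apart: each thread's code looks trivially correct read in isolation, yet the program as a whole misbehaves, and only inspecting the live, interleaved state -- exactly what a debugger is for -- shows you why.

```c
/* race.c - illustrative only; a user-space stand-in for a re-entrancy bug.
 * Build with: gcc -pthread race.c -o race
 */
#include <pthread.h>
#include <stdio.h>

#define ITERATIONS 1000000

static long counter = 0;   /* shared state, deliberately unprotected */

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERATIONS; i++) {
        /* Reads as a single step, but compiles to load/add/store;
         * the other thread can interleave between those steps and
         * silently lose an update. */
        counter++;
    }
    return NULL;
}

int main(void)
{
    pthread_t a, b;

    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);

    /* Mentally single-stepping either thread says this should print
     * 2000000; in practice the total is usually lower and varies
     * between runs. A watchpoint on `counter` shows the interleaving
     * directly. */
    printf("expected %d, got %ld\n", 2 * ITERATIONS, counter);
    return 0;
}
```

In a real kernel the equivalent state is locks held, interrupt context, per-CPU data and so on, which is the kind of dynamic state a kernel debugger such as kgdb lets you examine.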

quote:
Original post by Arild Fines
quote:
Original post by Kylotan
A kernel is definitely not a place where democracy makes any kind of sense.

But aren't the BSDs generally held to be of higher quality than Linux?

Yes, and it's probably true, too.

quote:
And it is ridiculous that there is no standard ABI, actually.

They don't want to encourage binary-only drivers.
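For anyone wondering what the lack of a stable module ABI means in practice, here is a contrived sketch (my own example; the structs are invented, not real kernel headers): a binary-only driver compiled against one version of a header bakes in that version's struct layout, so when a field is added later in the same release branch, the driver silently reads the wrong memory, with no compiler around to catch it.

```c
/* abi_drift.c - contrived illustration of why an unstable ABI breaks
 * binary-only modules; these structs are made up for the example. */
#include <stdio.h>
#include <stddef.h>

/* The header the out-of-tree driver was built against (kernel N): */
struct net_dev_v1 {
    int   mtu;
    void *priv;      /* driver's private data pointer */
};

/* The same struct later in the same release branch (kernel N+1);
 * one field was inserted, so everything after it moved. */
struct net_dev_v2 {
    int   mtu;
    void *stats;     /* new field added by the kernel developers */
    void *priv;
};

int main(void)
{
    /* A binary-only module built against the v1 header hard-codes the
     * v1 offsets. Loaded into an N+1 kernel, it would dereference
     * `stats` where it expects `priv` -- no error message, just
     * corruption. A stable ABI would forbid this kind of change;
     * the kernel developers refuse to make that promise. */
    printf("offset of priv with v1 header: %zu\n",
           offsetof(struct net_dev_v1, priv));
    printf("offset of priv with v2 header: %zu\n",
           offsetof(struct net_dev_v2, priv));
    return 0;
}
```

The usual answer is simply to rebuild the module from source against the running kernel's headers, which is exactly what binary-only vendors can't or won't do -- hence the reluctance to promise them anything.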

quote:
Original post by Arild Fines
quote:
Original post by Kylotan
A kernel is definitely not a place where democracy makes any kind of sense.

But aren't the BSDs generally held to be of higher quality than Linux?

The BSDs are generally thought to be of a higher quality, yes. However, that is just popular opinion. Better to learn for yourself than to take a vote.

