#12: Consistent Packaging

Wednesday, November 1st, 2006

A long, long time ago, packaging was an exciting idea. There were disputes over style and process, there was innovation. There were reasons to prefer .deb over .rpm over emerge and its binary packages…

Today, these differences are just a hindrance. The fact that there are so many divergent packaging systems in the free software world (and I include the various *BSDs) is a waste of time and energy. We want to focus the collective brainpower of the community on features and bugs, not on packaging. I would like to see the LSB renamed to the FSB – the “Free Software Base” – and get buy-in from the *BSDs, and then I’d like to see us define distribution-neutral packaging that suits both the source-heads and the distro-heads. Then there’d be sufficient rationale for the relevant upstreams to include that packaging framework in their revision control repositories, and distro patches would become far more exchangeable.

Ubuntu isn’t built on secret sauce in the packaging. We don’t think our patches should be hoarded – we mail them all to Debian daily anyway, and publish them as best we can on the web the instant they are uploaded, often before.

Packaging is also one area where we can definitively improve on the real user experience for most people who treat computers as a job not a passion. It’s a strategic tool in the battle between proprietary and open approaches. I often think that the proprietary software world’s way of distributing software is one of its biggest weaknesses – an Achilles Heel that we should be exploiting to the full extent possible. I’m often asked why Linux can’t make it easy to “write something like Microsoft Installer, or Installshield”. That’s the wrong rabbithole, Alice. Linux can make it so when you dream of databases, PostgreSQL or MySQL are “just there” and “just work”. That’s a much nicer experience – we should make the most of it.

130 comments:

  1. GrammarNazi says: (permalink)
    November 1st, 2006 at 10:56 am

    it’s → its
    Heal → Heel
    rpms still don’t seem to handle dependencies well…
    And I can’t preview here.

  2. kerneljack says: (permalink)
    November 1st, 2006 at 11:07 am

    I agree with you that the disputes over .deb and .rpm are sort of pointless now because you can install apt-get on Suse or urpmi on Mandrake for example, and get functionality similar to apt-get for rpm-based distros.

    Your proposal for a “packaging framework” is something that I myself have been thinking about as a way to solve the problem that you cannot simply take a package for “Gaim” and move it across distros, from Ubuntu to Suse for example. Why should it matter what the underlying packaging system is, be it rpm or deb? I think autopackage does this to some extent, but I haven’t personally tried it.

    When you say the “Installshield” method is the wrong rabbithole, I agree to some extent; probably new computer users will not immediately go looking for Installshield. You didn’t discuss the OSX approach though which I believe is much better and extremely simple. It’s something like:

    1. A website publishes a new version of “something.dmg” and it says clearly it is for OSX 10.4 Tiger, and there is usually another one for 10.3 Panther. You simply download the one you want.
    2. You double click on it, it unzips to your desktop and is ready to go! You *may* move it to your Applications folder if you feel like it; you don’t have to.
    3. If you need to move it to another computer, just zip it up, and move it and unzip it and it’s ready to go.

    I’ve been wondering why no one is pursuing this on linux because it’s the ultimate in simplicity (for me anyway).

    I do realize that to do this realistically on linux, you will have problems across distros like *glibc* is the wrong version or the package depends on something else which depends on something else. I never have this problem on OSX which I think is probably because you are simply supposed to target a single version of it, let’s say 10.4 and you immediately know which libraries, etc will be there and which won’t, so it is easy to create a package.

    I think linux needs something like this, sort of a baseline which packagers can target for their packages, and they will be guaranteed to work across distros. Your package framework and the LSB or FSB are probably required to reach this goal.

    Some people will say that having a repository is even better because if it’s not in the repository you can just grab it, but it really annoys me that if something is *not* in the repository then I have to hunt around for the *right* package (rpm or deb), then make sure the architecture is right, or maybe even edit /etc/apt/sources.list and risk breaking my system because of tons of unofficial unsupported sources.

    NOTE: I use OSX, Ubuntu and Windows and try to help out on some free software projects as well. I just used OSX as an example here because I think it handles installing / removing programs very well.

  3. Dave says: (permalink)
    November 1st, 2006 at 11:07 am

    Achilles Heel

  4. Rob... says: (permalink)
    November 1st, 2006 at 11:21 am

    The advantage of MSI/Installshield is that the user can easily upgrade one particular piece of software.

    e.g. How do I put Firefox 2 on Breezy?! or Vim 7…

  5. thebluesgnr says: (permalink)
    November 1st, 2006 at 11:40 am

    Rob, you should really complain to Mozilla. They should be providing a package called “mozilla-firefox” (so it doesn’t conflict with Debian and Ubuntu’s “firefox”). Also an rpm for SUSE and Red Hat.

    As it is, their “tar.gz” doesn’t integrate at all with the system; it doesn’t add an entry in the applications menu, it doesn’t integrate with the MIME system, and obviously it doesn’t integrate with the package system. Mozilla could solve that by providing three files (rpm, deb, tarball) that would install on most distributions.
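
    For what it’s worth, most of that menu and MIME integration boils down to shipping a freedesktop.org desktop entry alongside the binaries. A hypothetical sketch (the paths and names here are illustrative, not from any actual Mozilla package):

    ```ini
    # Hypothetical /usr/share/applications/mozilla-firefox.desktop
    # (all names and paths below are made up for illustration)
    [Desktop Entry]
    Type=Application
    Name=Mozilla Firefox
    Comment=Web Browser
    Exec=/opt/mozilla-firefox/firefox %u
    Icon=/opt/mozilla-firefox/icons/firefox.png
    Categories=Network;WebBrowser;
    MimeType=text/html;application/xhtml+xml;
    ```

    A plain tarball could perfectly well include a file like this, plus a note telling the user (or an install script) where to drop it – the pieces exist, vendors just don’t use them.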

    Anyway, if you ever installed Skype on Ubuntu you’ll see that gdebi is much better than installers, the problem is that software developers still seem to ignore that they should be providing debs and rpms.

    About Mark’s post, I think it’s quite an interesting idea, but you’re never going to convince everyone to switch to the same format. However, you can help ensure that RPM and DEB, the two most popular formats, are both well supported.

    I remember when most GNOME packages shipped with an rpm spec file and the debian/ dir, but since they were a) unmaintained and b) often conflicting with the distribution’s packages, they were removed from practically all packages. That idea didn’t work.

    Something cool related to this though, is openSUSE’s build service. Basically, it helps you distribute packages for several distributions without having to install each of them. (it’s easy to ask Mozilla to provide debs and rpms, but most Free software developers don’t have the resources to build packages for more than one system – the one they use).

    Oh, and it’s a great idea to rename the LSB to FSB, but the people behind it seem convinced that “Linux” is a complete operating system.

  6. Michael Anckaert says: (permalink)
    November 1st, 2006 at 12:14 pm

    Mark,

    Just this one post alone shows how intelligent your view on OSS and GNU/Linux is. You wrote down the words I’ve tried to explain to dozens of friends and people when they complain about the way Linux distributions handle software installations.

    Ubuntu and the community are lucky to have such a dictator as yourself ;-)

  7. Valif says: (permalink)
    November 1st, 2006 at 12:21 pm

    > The advantage of MSI/Installshield is that the user can easily upgrade one particular piece of software.

    Jeez… It’s *much* easier (as in one command or one mouse click, as you prefer) using virtually any Linux distro today.

  8. Eamon says: (permalink)
    November 1st, 2006 at 12:34 pm

    I could not agree more. However, as technical director for a mid-sized, Windows-focused organisation, I keep banging my head against the wall whenever we try to give Linux a chance. We operate a school, a head office and around 50 home workers. We currently use Windows with a range of OSS as well as proprietary payroll and accounts software (Windows lock-in).

    Our current state of Linux play – a number of dual-booting school machines, a group of old laptops with Edubuntu and two internet kiosk machines at our HQ. For the school, the lack of something like InstallShield is a nightmare. If it’s too old, too new or just not in a repository, it’s not installable by our end users. Also, not all our machines are internet connected. I love the idea of a great super-repository; however, I also want the option of one simple file that just installs a program when needed – something like autopackage.

    This and a couple of other issues stop us moving over a large number of our machines and paying you for support – simplicity of third party software installs, system lock down and configuration (complexity), dial up control (really really poor with some iffy drivers and still essential for remote workers), wine configuration/usage (rocket science), and X recovery/grub configuration when imaging Linux to disparate platforms.

    BTW I use Ubuntu Edgy on my X40 as I find it more stable, responsive and reliable than Windows (especially an old install), but that configuration is not subject to much change and, like our HQ machines, is used pretty much for web browsing only.

  9. huh says: (permalink)
    November 1st, 2006 at 12:45 pm

    dpkg -i :-s or add the repository for firefox 2 and apt-get install :-s

  10. Justyn says: (permalink)
    November 1st, 2006 at 12:54 pm

    It would be incredible if free software distributions could share packages properly as you’ve described.
    But would such a packaging system require a completely new package format? Or would an existing project, like autopackage, be sufficient to build on?

  11. Tim says: (permalink)
    November 1st, 2006 at 1:31 pm

    Hear hear. I couldn’t agree more. Packaging needs to be made both universal and easy. I’d like to see a packaging tool, perhaps an Eclipse plugin, that would automate a lot of the details of software packaging. But I agree that making it universal is the key, so that packaging effort is shared among all distributions.

  12. Florin says: (permalink)
    November 1st, 2006 at 1:32 pm

    “Achilles Heal”? Maybe “Achilles Heel”: http://en.wikipedia.org/wiki/Achilles'_heel

  13. Joseph Smidt says: (permalink)
    November 1st, 2006 at 1:33 pm

    Mark,
    You couldn’t be more correct. The open source community really needs to be united on packaging, not divided. How much time and effort is unfortunately lost because the same package has to be packaged several different ways? Even worse, when somebody builds a package with security fixes in, say, .rpm format, it isn’t immediately usable by those who need the .deb format; more time and effort is then spent putting that fix into several other formats. And new users get confused about which format to download for their setup, which makes Linux a headache for them. It is really a waste. I hope you continue to push this.

  14. Treenaks says: (permalink)
    November 1st, 2006 at 2:22 pm

    That’s also its weakness. You’ll have to check for security updates yourself. And upgrade all applications separately. Last weekend I had to upgrade about ten separate tools on my dad’s Windows machine. 9 security upgrades, too.. but because they’re 3rd party applications, they won’t be in Windows Update.

    In most Linux distributions, security-updated applications just appear.. as long as the packager knows about them (and the source).

  15. Simone Brunozzi says: (permalink)
    November 1st, 2006 at 2:54 pm

    Dear Mr. Shuttleworth,
    you are definitely on the right track here.
    In my experience, for both pro and home users, packaging is something that affects the whole computer experience.
    You could be the right person to unite all the free software world efforts on packaging.
    Good luck!

  16. Matt says: (permalink)
    November 1st, 2006 at 3:05 pm

    Absolutely right.

    I tried out Ubuntu back in its Breezy days, and switched to it as my main OS when Dapper launched. Since then I’ve tried to convert several other people, and honestly – the reason they wouldn’t is because installing sometimes/often involved compiling, which was just too difficult.

    The group we’re looking to draw from are coming from Windows, where you double click a .exe and, 3 clicks later, you have the software installed. I think a major hurdle is the fact that dealing with binary and source code is far too difficult (either in truth or in their eyes). If there could be a universal package that could be as easily installed as .debs are now in Ubuntu, that would be amazing. Another alternative, if possible, would be to build a package manager or script into the operating system that could take tarballs, identify them as binary or source, and install them for you.

    Honestly I see this as one of the biggest problems Linux faces.

  17. F. H. says: (permalink)
    November 1st, 2006 at 3:33 pm

    If a new packaging framework that truly works across the entire universe of Linux and *BSD distributions is indeed feasible, would we still need to have new “versions” of each distribution? Couldn’t we instead simply upgrade our applications as and when we need to, and have the underlying packaging system transparently upgrade all utilities, libraries, and the kernel, as necessary, on demand? I’d go even further: why have numbered versions (“5.04,” “5.10,” “6.06”…) if we could have just a single, final version of “Ubuntu for X86,” “Ubuntu for PPC,” and so on?

    If this were indeed possible, from the point of view of end-users like me it would mean that, instead of being *forced* to think, “I want Firefox 2, therefore I must upgrade to Ubuntu 6.10,” we could instead think, “I want Firefox 2, so let me install it. Period.” Ubuntu would then take care of upgrading the rest of my computer as necessary.

    Just a thought…

  18. C.J. says: (permalink)
    November 1st, 2006 at 3:39 pm

    Me, too™

    rpm(8), dpkg(8) and apt(8) should be wrappers around the same core, IMHO. The old-timers are used to their tools and will become frustrated if those don’t do what they’re expected to, but that’s not reason enough to avoid change. The front ends folks are used to can continue to be provided and hooked up to an alternate back end. Sure, there will be small discrepancies, but that’s easy enough to get over.

    Mark, you’re much better at getting something like this off the ground than I… Who should we talk to in order to get some traction here?

  19. fish says: (permalink)
    November 1st, 2006 at 3:47 pm

    Well, yes…of course…

    I know at least 15 (!) friends who tried Linux and ditched it again… why? *All* of them (I’m not kidding!) said that installing software on Linux is a pain in the ass… and yes… they are right! Not having a common packaging system after more than 15 (!!!) years of Linux is fucking evidence of incapacity… you can tell me whatever you want, but the MS/Mac ways are just the easiest… do I care if a package is linked statically or not? No, I don’t give a damn… 99% of the users who want to switch to Linux don’t even know what linking dynamically means… and they don’t care… and they are right not to care! All they want is an easy way to install software on whatever distribution, and again, tell me whatever you want, hail apt-get and urpmi and whatever as much as you like, they are not as easy as on Windows… all of them fall short! Yes… give us an easy way to install packages… or better, give the new users an easy way to do it… goddamn, it’s the 21st century and sometimes (way too often!) we still have to compile our packages ourselves… it’s like WTF? That’s just retarded, sorry…

  20. fish says: (permalink)
    November 1st, 2006 at 4:05 pm

    Hm, me again… want an example? The new Amarok… 1.4.4… NOT available for Kubuntu Dapper… go to kubuntu.org and check for it… no packages! It created huge disappointment among Dapper users in the Amarok IRC channel… I know new software versions are not part of LTS, but that is not the issue here! For instance, log on to the IRC channel and ask the Amarok devs about Linux packaging systems and their way of distributing software… they’re pissed off about it just like me, and most of them would tell you the same as I just did above… go there and ask them if you don’t believe me… argh. Anyway, back to my studies…

  21. Vincent says: (permalink)
    November 1st, 2006 at 4:47 pm

    In my opinion, the best solution would be some package management that could handle rpms and debs (and perhaps ebuilds?) next to each other. I don’t know if it is possible, but it would be the solution, because packages that already exist would not have to be replaced.

    I still don’t get the point of the Smart Package Manager (http://labix.org/smart) though…

  22. Jack Frost With A Nipple Nose says: (permalink)
    November 1st, 2006 at 5:43 pm

    In my opinion:

    Most people are so used to the “Windows way” of doing things that they equate the Windows experience with how computers should be used, and when trying anything different their fingers recoil in disgust at being forced to adapt to something new. But as [Ubuntu] Linux becomes more popular, and more happy Linux gurus are open and willing to show how easily programs like Synaptic work, things will become easier. I believe most of the problem can be overcome not by changing things radically, but by happy and gentle introductions.

    The main problems with Linux adoption, I believe, are not with pretty desktops and package management; these things are improving, but they are not at the root of the issue. I feel the main problems today are:

    1. Lack of marketing: Face it, Windows advertisements are everywhere. Where are the Linux advertisements? Where is the push to let people know they don’t have to buy their operating system anymore? Where are the attempts to redefine the meaning of the word FREE in the public’s mind? Most of them equate free with being cheap or less than a set standard of quality. We need to show them that free is good where Linux and open source is concerned.

    2. Closed tech (DirectX, for example): We need to push for open hardware developed for use with open code; that’s all there is to it. The graphics/gaming companies are locked into DirectX, and this shouldn’t be. It needs to change for real progress to spring forth in graphics/multimedia/gaming in open source. We want free and open drivers (Nvidia and ATI aren’t open currently, right?), but those big two are developing for DirectX. We need to draw companies away from DirectX towards an open solution. Hardware needs to be open. Frankly, I’m surprised Microsoft hasn’t switched from the PC to another new platform (though I read something in the news about M$ starting to make chips, and if that’s true, who knows where that will lead), which would bring new breeds of hardware and more headaches. Sure, in time specs can be found, but look how long (and how frustratingly) Winmodems in all their incarnations and variations have plagued people in Linuxland. We need open hardware just as we have open source, and we must not settle for or reward companies’ closed development any longer. When you buy an Nvidia or ATI card, who are you rewarding? They make their cards work for DirectX, so money flows to M$ through all of that, doesn’t it? So aren’t you, in some way, by buying Nvidia and/or ATI, rewarding M$ even if not directly through your purchase? Dual booting is bullshit; no one should have to use DirectX for gaming. Didn’t Bill Gates say something long ago about creating a need – making people need you and/or your products? I may be wrong, but I swear I heard a quote like that from him years ago, and it ties into my point: we need to show people that they need free and open source FREEDOM, not to RELY on a corporation’s products while wrongly feeling they have only one choice.

    2a. Closed file formats: The public needs to be informed about the free and open file formats; they need to know they have a choice and don’t have to use closed video/audio players (Real/QuickTime/M$MediaPlayer), but can author and play back in different formats without bowing to beastly solutions.

    This all just barely scratches the surface, but to you Mark I humbly submit this message.

  23. behavedave says: (permalink)
    November 1st, 2006 at 5:56 pm

    I really don’t see how the LSB is going to solve all these problems, because a complete standard base will never suit the incremental, constantly evolving nature of Linux. Imagine trying to make even small changes while keeping to an LSB: vendor disagreements would mean Linux development would stall, especially with changes every six months. The best way I see is a more robust approach, and unfortunately one more similar to the proprietary method – something like PC-BSD’s PBIs or even GoboLinux’s self-contained applications. Sure, it won’t really help with big changes like upstart, but it will sidestep disagreements over libraries, as you are allowed all different versions of libs.

  24. Roger Binns says: (permalink)
    November 1st, 2006 at 6:16 pm

    Ultimately the problem is binary packaging. Windows has excellent binary compatibility from Windows 98 through Server 2003 which is why it is possible to provide a single Windows installer. If you are going to allow for diverging Linux platforms, then that means different versions of C libraries, cpus and other installed modules (eg ldap, kerberos).

    The solution would need to be something which is source-based (that shouldn’t be a problem for open source software :-). You can still have precompiled binary packages that can be fetched by the packaging system if they are suitable; otherwise they have to be “compiled”.

    Also, bizarrely, rpm lacks some things that even the SVR4 packaging system has, such as installf/removef, which allow you to update the packaging database after installation. That is extremely useful in making a package adapt to the system. For example, I have wanted to use it to decide at install time whether man pages should be plain, gzip or bzip, and update the package database with whichever is appropriate.

  25. Rick says: (permalink)
    November 1st, 2006 at 6:33 pm

    I think you’re absolutely right about not following in the footsteps of InstallShield. Plus, the Microsoft philosophy of delivering software and updates is completely broken, i.e. Microsoft only ships updates to their own software over their infrastructure and doesn’t open it and its APIs up to third parties. This leads to each individual program having to write its own update mechanism, which is obviously madness.

    In this regard the Linux distros are much better, in that lots of (often competing) packages are managed within a distro through its packaging system, though obviously some fall through the cracks.

    I’d love to see a unified packaging/deployment system for the FLOSS OS’s, but it’s a big job and finding something that will suit everyone is going to be very hard if not impossible.

  26. Fafek says: (permalink)
    November 1st, 2006 at 7:30 pm

    I totally agree! Unfortunately, Linux has a great variety of problems like that.

    For example, the war between GTK+ and Qt, GNOME and KDE. Why does Psi integrate well only with the Kubuntu desktop? There is no GTK+ and Qt, there is only Linux! Why are GNOME users discriminated against because of their environment? Why does Firefox look ugly on KDE and not integrate with it?

    OpenOffice.org uses its own mechanism for displaying its GUI and for operations like printing. The effect is that it doesn’t look right anywhere. That’s a sick situation too. Applications should use the OS’s native widget toolkits. Cross-platform programs should use GTK+ on GNOME, Qt on KDE (I still believe this will be solved), Cocoa on Mac OS X and the Windows API on Windows.

    Next issue – many applications for the same task which aren’t compatible with each other. It should be easy to use Evolution today and Thunderbird next week with the same data, without needing to export/import anything. Currently, there is no way to import mail from Evolution into Thunderbird without IMAP servers and such crap.

    Mozilla and GNOME lack essential features. The community has asked for multiple-signature support in Thunderbird for years without any results. The best example of a tiny feature which is a big problem for the GNOME developers: the Num Lock startup state. A simple checkbox would be enough; instead you have to INSTALL the third-party numlockx. Very funny, isn’t it?

  27. JayMiller says: (permalink)
    November 1st, 2006 at 9:32 pm

    A standard package format sounds great! But I favor a format that allows standalone operation and/or incorporation into a larger network-based system, particularly if the goal is to win over the vast majority of Windoze users. The answer, then, must be a package system that supports both Installshield-style packages and repository-style package management.

    For the Empire, Windoze Update manages the core OS packages/CABs/whatever, while third parties manage their apps via Installshield et al.

    In the Ubuntu world, Synaptic is capable of managing core OS and third party apps. That’s a double-edged sword though. Certainly for Canonical it makes sense because they provide support for the apps in their repository. However, it makes more sense for third parties to simply provide one package that end users can either download and install (a little revenue for them if they decide to offer support), or that can be incorporated into a repository-style package manager by various distros.

    This helps burgeoning projects that don’t get picked up by repositories, and helps ease their inclusion into distros as their popularity increases.

  28. jose hevia says: (permalink)
    November 1st, 2006 at 10:09 pm

    You are right. But there are two problems here: one is the user perspective that you point out, the other is the developer perspective. A developer for Linux needs to know
    Gentoo packages, Debian, RPMs, autopackage (all completely different),
    so he has to do four times the work that a single format would need – but there won’t be only one.
    On Windows there is InstallShield, which makes it very easy – and graphical! – to build an install package. So if Linux wants to compete, it needs a graphical tool for dummies that can do:
    - All the hard install work, so programmers can focus on programs.
    - Generate some STANDARD intermediary project code.
    - Generate different package output automatically via plugins.

    This way programmers only have to do the work once. Thanks to the LSB this is a real possibility. My 0.02

  29. El Diablo en los Detalles | La Buena Idea del Día says: (permalink)
    November 1st, 2006 at 10:16 pm

    [...] Winner: Mark Shuttleworth « « Dr. Scrybe (Or how I learned to stop worrying and love Web 2.0) | [...]

  30. Andy Tai says: (permalink)
    November 1st, 2006 at 10:37 pm

    They have put too much investment into their ports systems, so there is little chance they will join a “Free Software Base.”

    In the context of GNU/Linux, this is probably not a relevant matter.

  31. Mike Hearn says: (permalink)
    November 1st, 2006 at 10:52 pm

    Hey, it’s too bad you left early at LUGradio Live this year. My talk was all about this, along with related topics like usability and security around software distribution. C’est la vie.

    If you want, email me or we can set up a call. We (the autopackage team) have been working on and thinking about these issues for *many* years and have long been beating the drum for distro-neutral packaging. But we never had buy-in from the distros. I’d love to see that change. BTW I don’t know your schedule but if you are in Mountain View before December 22nd let me know.

    thanks -mike

  32. Jan says: (permalink)
    November 1st, 2006 at 11:29 pm

    Guys, packaging is quite a bit more than taking a binary, zipping it up in whatever archive format you prefer and slapping it into a repository. I happen to make some packages for Mandriva, and I have packaged software for the Sharp Zaurus (a Linux-based PDA). I have also tried to build debs in the past.

    The problem is not the packaging format – they can all manage equally well. We can argue over details, but rpm or deb or whatever is the fad of the day already does the same job reasonably well: it tracks dependencies. This is something that e.g. Microsoft installers are still unable to do – the “Install->Next->Next->Finish” clickfest is just a glorified way to unpack a “zip” file into a destination, nothing more. People comparing rpm with apt-get are off the mark – rpm is equivalent to dpkg on Debian/Ubuntu, not to apt-get. That’s where you get things like urpmi (Mandriva), yum (Red Hat) or smart, which do what apt-get was designed to do.

    Where the real problems start is when I want to make an rpm or deb for several distros. Apart from the obvious issues with dynamic libraries (to Fish – your users will certainly start to care if I give them a 100MB download for a statically linked binary instead of the usual 1-10MB – e.g. Amarok, as you seem to suggest!), the main problems are with how different distros do things:

    1) How are dependencies named? E.g. Mandriva calls a development package with headers for a library ‘Foo’ libfoo-devel-… Debian/Ubuntu calls the same libfoo-dev. Redhat has yet different naming, Suse as well. What should my package which needs this dependency use? I cannot possibly check for all cases.

    2) Where to put menu entries and which menu system to use? Again, Mandriva 2007 uses XDG menu, Mandriva 2006 and older used the Debian menu system. I am not sure what Ubuntu uses now, but the older releases didn’t use XDG, AFAIK. So if I want my package to integrate with mime types and menus, I have to support all possible cases. Not to mention that category names and submenus are different on each, so if I put something in a folder which is fine on Mandriva, Ubuntu people will yell at me that the entry is wrong and vice versa.

    3) Where is e.g. KDE installed? Suse used to put it in /opt/kde*, Mandriva puts it in /usr, I guess we could find still more possibilities.

    4) What are the conventions for init scripts on the particular distro? E.g. Ubuntu now uses upstart which is completely different from the older System V init scripts. How am I supposed to be compatible with that and the other distros still using init? Mandriva has parallel init now, which is yet another case.
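
    To make the first point above concrete: here is the same dependency expressed for two packaging systems – the package names are purely illustrative, and they differ across distros in exactly the way described:

    ```
    # Debian/Ubuntu – a fragment of debian/control (illustrative names)
    Build-Depends: libfoo-dev (>= 1.2)
    Depends: libfoo0

    # Mandriva/Red Hat – a fragment of foo.spec (illustrative names)
    BuildRequires: libfoo-devel >= 1.2
    Requires: libfoo
    ```

    One upstream project, two sets of dependency names – and neither file will work unmodified on the other family of distros.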

    And many, many other issues like this. Moreover, plenty of software is not packaged straight from the upstream tarballs – it needs to be patched (e.g. for branding, for security, or just because it doesn’t build properly!) as well. And I am not yet talking about systems which compile their binaries from source – e.g. Gentoo or the BSDs – nor about internationalization issues (some of which is provided by third parties) or QA issues: not everybody has a nice patron as Ubuntu has, and some have to make ends meet by other means – e.g. by selling a well-polished system, which implies heavy QA that will not let you integrate the latest-and-greatest easily.

    If you want to see what sort of issues are involved and what has to be considered, have a look at Mandriva packaging guidelines:
    http://qa.mandriva.com/twiki/bin/view/Main/PackagingIndex

    Things like autopackage are not really a solution – that is like chasing the wind, because the distros are a moving target. In the end, it will work with just a few major distros and the rest will still be where they were (or worse off). As for zipping everything up and delivering self-contained archives like Mac OS .dmg-s – we have this already: Klik (http://klik.atekon.de/). However, that will not give you the same integration as natively packaged applications.

    I think that the only solution for portability of packages between various distributions is to
    get LSB to do what it was supposed to do – standardize. Agree on a common set of libraries, tools, locations which should be present on a compliant system. Then make distros follow it (they would, they did with the former LSB – it has advantages). There is some progress being made on this front, but do not expect to be able to take a Mandriva RPM and install it on Ubuntu any time soon.

    However, if you standardize, then you also have a drawback – e.g. if the standard says that you have to ship KDE 3.3 compatible libraries, you may have major difficulties shipping your distro with e.g. a brand new KDE 3.5, as these are not really installable in parallel without major kludges … So I am not sure whether people clamoring for the “latest and greatest” applications could have their cake and eat it too …

    Remember, things are not just black and white and the obvious and simple answer is rarely the right one.

    Regards,

    Jan

  33. Jan says: (permalink)
    November 1st, 2006 at 11:51 pm

    Oh, and before somebody brings up the favorite argument that there should be just one “standard” Linux – think of a situation when that “standard” is not the distro/setup you are using but e.g. something like Slackware or Linspire or something similar you have never used and doesn’t suit your taste (both Slackware and Linspire are fine systems, BTW). And you have no other choice, because there is only one Linux managed by e.g. a large committee or some foundation where you, the user, have little say. How would you like that?

    This is the case with both Windows and Mac – either you are happy with what you were given by the vendor or tough luck – you cannot really change anything. Of course, in such situations it is easy to provide “simple” installers which work – Windows XP didn’t really change in 5 years and Apple has one release per year. You cannot compare this with a typical Linux distribution, which has a 6-month release cycle that is still not fast enough for some people.

    Jan

  34. JJS says: (permalink)
    November 1st, 2006 at 11:51 pm

    OK fine. What are your specs? Who do you expect to create the packages, distros or project developers? If you want to leave it up to project developers, how do you handle unresolved dependencies (ie. there is no guarantee that the right graphics lib will be on all systems). Are you going to distribute source or binaries? Who is responsible for packaging and testing updates (most projects do not have the resources for this)?

    There are a lot of pieces to the puzzle, which is why the distros exist in the first place. And the distros came up with the packaging schemes. However, this is not the first post about the issue, and in fact there are several projects on SourceForge to try and produce a common Linux installer. If a major distro put its weight behind one of them, there is a possibility that others might join as well.

    However, if there is only 1 package management system that handles all of the dependency issues, what value does a distro offer?

  35. Well says: (permalink)
    November 2nd, 2006 at 12:13 am

    Well, sorry to point that out, but pkgsrc, originating from the NetBSD project, does exactly what you want. And it is available on Linux, and the other BSDs, etc… And hey, it’s NetBSD – these guys don’t do anything but write portable code, code that provides the same services on a huge number of architectures, etc… I don’t understand the problem here. You cite the BSDs two times in your article… and you say “various” BSDs… I think there are way more Linux distributions spreading the chaos that makes things like the LSB necessary. On the other side, there are 4 big BSDs that just work together; they focus on particular things and don’t just make something different using the same kernel because… hey, it’s nicer that way. Instead of putting one deb over one rpm, and then introducing apt… why can’t you just get it right the first time? And instead of crying out “we need a FSB” without even knowing that the problems you cited don’t even exist in the BSD world, you really should look at pkgsrc and so on – it seems to me that you would like to impose a new standard without noticing that you could use existing stuff: non-GPL stuff, unbloated stuff, extremely portable stuff, old-school Unix-style stuff that fits in the original architecture, coherent stuff. But well – since the Linux guys think that BSD guys are elitist morons with big egos, I understand that you try to sell your “I do it for the community” image, but the community already has the solutions you obviously don’t see.
    Sorry for offending you. ;)

  36. Chase Venters says: (permalink)
    November 2nd, 2006 at 12:14 am

    Absolutely!

    I’ve been thinking about this lately. There ought to be a standard ‘Software Package’ XML format, with intelligent use of namespaces (like the Dublin Core extensions) to describe packages.

    Think about it – you could publish a _signed_ XML package descriptor on your website, that has references to the various available package tarballs (some binary and a source). A distribution could basically take that XML package description and use it to fashion their own package from it – the XML description should be able to automatically convert to an ‘rpm’ or ‘deb’ for example.

    There should be hyperlinks in the XML descriptor to XML descriptors of dependencies, and those hyperlinks should include SHA-256 checksums of the target dependency XML descriptor for accuracy verification.

    Just some of my random thoughts…
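    The checksum-verification step in that scheme might be sketched like this (the descriptor content and function names are hypothetical, purely to illustrate the idea of hyperlinks carrying SHA-256 digests of their targets):

    ```python
    import hashlib

    def descriptor_sha256(descriptor_bytes: bytes) -> str:
        """SHA-256 hex digest of a (hypothetical) XML package descriptor."""
        return hashlib.sha256(descriptor_bytes).hexdigest()

    def verify_dependency(fetched_bytes: bytes, expected_digest: str) -> bool:
        """Compare a fetched dependency descriptor against the digest that the
        referring descriptor embedded alongside its hyperlink."""
        return descriptor_sha256(fetched_bytes) == expected_digest

    # The parent descriptor would carry expected_digest next to the hyperlink;
    # a tampered or stale dependency descriptor then fails verification.
    dep = b"<package name='libexample' version='1.0'/>"
    good = descriptor_sha256(dep)
    assert verify_dependency(dep, good)
    assert not verify_dependency(dep + b" ", good)
    ```

    Signing the top-level descriptor then transitively protects the whole dependency graph, since each link pins the exact bytes of the next descriptor.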

  37. Robert Smit says: (permalink)
    November 2nd, 2006 at 1:14 am

    I would suggest that there are a couple of different levels of install needed:-

    1) Whole package in a single directory/file (e.g. for a trial install); there are already tools available for this, running as an almost self-contained virtual system. This would be good for trying the software before installing it (e.g. a new version of an application before replacing the current version). It would not affect any other parts of your system, it would be secure, having no write access to anything but itself, and it should run straight from the file for convenience. It could also contain dependencies not part of the current LSB to increase the likelihood of it just running straight from the file.

    2) Install from file; this is covered by RPM and is needed for systems not connected to the internet/network, or for software not available in a repository.

    3) apt-style internet repository; this would be the most convenient. It should be easy to set up repositories on a private network and/or CD/DVD. If it were easy enough to set up repository files on a LAN or local drive, it could do the job of point 2 above.

    These different levels of install should be designed to work together. For example, tools for packaging an application should generate everything needed for all 3 install levels. The ideal package would contain everything needed for all 3 levels of install, but this may not be practical.

    Imagine being able to run a program straight from the package and then when happy with committing a proper install. Imagine being able to do this regardless of whether you downloaded the file your self or through apt-get/synaptic.

    The really hard part would be making it possible to do all this with source files as well as pre-compiled binaries.

    This is a tall order but it would cover just about everybody’s needs and beat any packaging system out there.

  38. Pascal Bleser says: (permalink)
    November 2nd, 2006 at 2:23 am

    Sorry Mark and everyone who posted up to now, but this is just an extreme oversimplification and futile discussion without going into some real-world technical details.

    There are different things that have to be considered, not just the package format (that’s actually the easiest part, by a large margin).

    Linux distributions do differ in many regards when it comes to integration
    - incompatible init profile schemes: chkconfig and /etc/init.d/rc*.d (LSB) e.g. on SUSE and Redhat, outdated /etc/rc*.d and update-rc.d (non-standard) on Debian/Ubuntu, …
    - incompatible init scripts (in /etc/init.d/*): e.g. both SUSE and Redhat use LSB, but the scripts differ a lot (different tools: service on Redhat, startproc/killproc on SUSE)
    - LSB not implemented on each distribution (e.g. no “lsb_release” binary by default on Ubuntu)
    - different paths: /opt/gnome and /opt/kde3 on SUSE, /opt/kde3 on Mandriva, /usr for KDE+GNOME on most of the others; /var/www for the Apache DocumentRoot on Debian/Ubuntu, /srv/www/htdocs for the same on SUSE, …
    - different package names: huge differences between, amongst many others, the typical SUSE/Fedora/Redhat scheme (zlib/zlib-devel), the Debian/Ubuntu scheme (e.g. libz/libz-dev), the Mandriva scheme (include major version numbers in library packages), etc…; and that’s a major issue for specifying dependencies in packages (e.g. RPM’s Requires/BuildRequires)
    - different tools for system tasks: different init implementations, different inetd implementations (though almost every distro should be using xinetd by now), different syslogs (syslog, syslog-ng, …), …
    - package versions differ greatly: latest GNOME on latest Ubuntu, GNOME 2.12.0 on SUSE 10.1, … (although that’s probably rather an issue with GNOME itself, where almost every GNOME application requires the very latest bleeding-edge release of GNOME libs)

    wrt that, the package format and the package managers are pretty easy to solve from a technical point of view
    But who’s going to abandon their specifics here? Would Debian/Ubuntu switch to RPM? Would SUSE/Fedora/Redhat/Mandriva switch to dpkg? I just don’t see that happening.
    Develop a new format and a new toolchain? That’s probably the worst approach, as the current ones have been proven and tested for many, many years. And it’s far from trivial – don’t forget you also need a toolchain to create those packages. Package management is really the most critical subsystem of a Linux or *BSD distribution.

    As with the MacOSX dmg approach, there’s something very similar for Linux (klik) but both have shortcomings in the light of different Linux distributions:
    - the “base system” (a set of packages that e.g. klik “assumes” is installed) differs a lot and is hard to determine (do you draw the red line before or after, say, the KDE and GNOME libraries ?)
    - as written in one of the comments: take the .dmg, put it “anywhere” and just click… how does that work for server software like Apache, MySQL, PostgreSQL, etc… ? Where is the configuration ? And how would you realistically want to help people on mailing-lists and IRC ? Where’s the path to that binary, where’s the path to that config file ? “anywhere” ? inside a compressed dmg/cmg/whatever image ? no chance
    - those .dmg images are rather easy to build and distribute because each is made for *one* specific version of MacOSX; this is just totally not comparable to Linux, with its lots of different distributions (and all the differences I wrote about, above); how is that an improvement in end-user experience over this: you go to a website, see an RPM file for e.g. SUSE Linux 10.1, click on it, some MIME handler passes it to the package manager of the distribution and installs the RPM? (or deb or whatever ;-))

    Frankly, this is a very complex topic that involves standardizing a lot more than just the package format.
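    The package-naming divergence alone gives a feel for the scale of the problem: any cross-distro dependency spec needs a per-distro translation table for every single library. A toy sketch, using only the naming schemes cited in the comment above (the table and function are hypothetical illustrations, not an authoritative database):

    ```python
    # Hypothetical translation table for one logical dependency,
    # "zlib development files", per the distro naming schemes mentioned above.
    ZLIB_DEV_PACKAGE = {
        "suse": "zlib-devel",
        "fedora": "zlib-devel",
        "redhat": "zlib-devel",
        "debian": "libz-dev",
        "ubuntu": "libz-dev",
    }

    def build_requires(distro: str) -> str:
        """Resolve the logical 'zlib dev' dependency to a distro package name.
        A real cross-distro tool would need such a table for every library."""
        try:
            return ZLIB_DEV_PACKAGE[distro]
        except KeyError:
            raise ValueError(f"no package-name mapping for distro {distro!r}")
    ```

    Multiply one such table by thousands of libraries and a dozen distros, and the size of the standardization effort hiding behind "just agree on a format" becomes clear.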

  39. Briquet says: (permalink)
    November 2nd, 2006 at 3:51 am

    The problem is to make the people behind Debian, Red Hat, Suse, Gentoo, Mandriva, Arch, etc. understand a few things: that having many packaging formats that do basically the same thing is not an advantage (some of them call it the “advantage of many alternatives”); that Linux cannot reach the huge mass of common desktop users while keeping the learning curve steep even for simple things (like installing some software that is not in the repositories); that many companies will not help Linux until they see a well-organized operating system and community (a good example could be drivers), and that giving their support just once should be enough to reach all Linux users (not just Debian’s or just Suse’s and so on); and that they are fighting over a little share of users – little compared with the hundreds of millions that just one company, Microsoft, has, not because it is better but because it understands its business.

    I agree with you that it’s time to push Linux in the same direction, but for this the distribution leaders (you already know this, so you’re not included) have to understand that everybody has to stand together instead of going their own way – losing a little bit at the beginning, like their “precious” package system (and with it having to reorganize their directories), to gain much more at the end.
    We need points of coincidence instead of excuses to continue splitting the same way Linux is split right now.

    If we solve this problem the rest is gonna be easy; the coding power of Linux developers is huge and doing this will not be a big deal IF everybody works under the same project or towards the same goals.

    Thanks
    PS: Two suggestions. First, the packaging system can be neither .deb nor .rpm – not because those are bad, but because from the beginning we need something “neutral” between the two biggest ones. Second, a good package system could be Pacman from Arch. I use Ubuntu, but it is not hard to see that although their community is rather small, they manage to have a lot of packages, all of them pretty up to date, and I think their package system is kind of easy to use.

  40. st_lim says: (permalink)
    November 2nd, 2006 at 4:33 am

    While we are at it, let’s also have a /usr/Program Files directory where you can have /usr/Program Files/firefox or /usr/Program Files/thunderbird. And while you are at it doing that, let’s throw away the BSDs and Linux since we really only need one flavor.

    This discussion merely points everything down a slippery path. Having multiple packaging methods is great not only because of choice but because natural selection ensures that the best one wins. Although in the case of package management in Linux, I don’t see certain package management software improving just because there is competition.

    Trying to dictate a package management standard is IMHO a waste of time. Let’s just spend the time getting programmers doing what they like doing, and while they are doing that, get the features we need.

    I do not think the Achilles heel for Linux at the moment is anywhere near package management. It’s just pure and simple marketing. To see what business development and marketing can do, look at M$. The IT folks know that their opinions don’t matter to the IT giant, but M$ has people who sit and nod their heads sympathetically, listening to the complaints, writing them down, and then proceeding to sell the “fix” to the problem.

    Now that is what OSS needs, not yet another fix to the so many package management problem.

  41. atdigg says: (permalink)
    November 2nd, 2006 at 5:14 am

    Shuttleworth should be the first person to know why things have to be allowed to become incompatible… he should remember why Ubuntu is not compatible with Debian and why installing .deb Debian files in Ubuntu is a risky business (yes, I know, he talks about packaging not about the content but nevertheless I can install RPM packages in Debian but I could probably not install Ubuntu deb packages)

    Breaking compatibility is necessary for progress – same thing with formats. Debian could have just adopted the LSB “standard” RPM files, but they didn’t, and it was a good thing to do, because now apt-get + deb packages perform better than all the RPM systems; there are even RPM distros that adopted apt-get (PCLinuxOS). You can’t innovate if you don’t do something different. Also there are different needs and preferences: portage works differently than YaST or urpmi or apt-get for a reason.

    Anyway, my point was: even distros that use the same packaging are not compatible – see Debian and, big surprise, Ubuntu – so what exactly would be the benefit of having similar packaging if the content is different? Wouldn’t that be MORE confusing? See the RPMs for Mandriva, SUSE, Fedora, RedHat, PCLinuxOS and the different versions of each of those distros.

  42. Cory Pollard says: (permalink)
    November 2nd, 2006 at 5:16 am

    Hi Mark,

    There already is a part-solution to the problem. I’m sure you’ve seen .sh shell scripts for various apps and games around the net. These are dead easy to install and pop up a graphical installation process – very intuitive too. I think we should explore this option a bit more.
    I believe we should stay away from the “already made executable” that has helped Windows viruses a massive amount. Maybe we will see a wave of *nix viruses flooding through in the future if the install procedures are too easy.

  43. Greg DeKoenigsberg says: (permalink)
    November 2nd, 2006 at 7:16 am

    So Mark, I understand the lament — but what’s the proposal?

    I don’t think it’s at all controversial to say that the “choice” between dpkg and rpm is, essentially, a false choice. I don’t think you’d get a lot of disagreement from folks in the RPM world. Sure, there will be people who advocate for yum versus smartpm versus apt versus rpm versus dpkg, but in the end, there are more commonalities than differences in these tools.

    The critical question is obviously the question of install base. Red Hat, for instance, is not about to move away from RPM unless there’s a *compelling* reason to do so. Why? Because Red Hat has developed an entire business around building, validating, deploying, managing, upgrading, downgrading, and sidegrading RPMs. Same for Novell, same for Mandriva. And, now, same for (ahem) Oracle. And Debian and Ubuntu face the same issues with the large install base of debs. Right?

    The cost of moving from (rpm|dpkg) to (any other packaging system) will be very high — sufficiently high that the reward for the first mover must be *extremely* clear. And let’s face it: packaging, even though it’s a bit of a pain in the arse, is nevertheless *not* rocket science for a sufficiently motivated community. Packaging is work that sysadmins can do when they don’t necessarily have the time or inclination to be upstream developers. Sure, there’s some duplication of effort — but there’s also healthy competition. And as the packaging tools improve for the different communities, the package universes for each distro will continue to expand, even if the package formats themselves never actually converge.

    Perhaps the grand unified theory of Linux packaging will emerge soon. I suspect that if it does, it’ll come from some really bright fellow who just *has* to maintain packages for 7 different distributions, and who therefore solves the problem in pure self-defense, in order to keep his sanity. But I don’t see any compelling reason to force a solution.

    So in theory, I agree that a common packaging format would be beneficial. But that’s the thing about theory and practice: in theory, theory and practice are the same. But in practice, they’re different. :)

  44. Lesley Clayton says: (permalink)
    November 2nd, 2006 at 7:16 am

    “I HAVE A DREAM”….. that one day every valley shall be made high, every hill and mountain shall be made low, the rough places will be made plain and the crooked places will be made straight and the glory of UBUNTU shall be revealed and all people shall SEE it together! (Martin Luther King – slightly changed)

    I dream of experiencing a format that works and is easy. I would love to be able to write code with my granny and my future kids in South Africa using something like visual grammar which is the language of visual spatial thinkers! The kids in South Africa are the apple of my eye and I want them to catch up and unleash their potential!

  45. Tassilo says: (permalink)
    November 2nd, 2006 at 9:21 am

    Yesterday, there was a posting on the site about pkgsrc – IMHO it was a constructive posting and not a rant. And you deleted it. Sorry to say that, but fuck you big time for censoring the opinions you don’t like and that may contradict you. Of course… you will delete this posting too, I think.

  46. crowmag says: (permalink)
    November 2nd, 2006 at 10:36 am

    More intelligent words on Linux/BSD packaging systems have never been spoken. The fragmentation in this area is a cancer that needs to be dealt with before it gets any further out of control than it already is.

  47. datcracka says: (permalink)
    November 2nd, 2006 at 10:54 am

    OK, so some developer of a packaging system should make a big deal about cross-package compatibility. “Our shiz will parse the fudge out of your momma!” and then the open source race is on. Everyone and their large, obese step-brother will be up in arms tryina’ do it. Elbows deep.

    Just my opinion.

  48. pd says: (permalink)
    November 2nd, 2006 at 10:58 am

    I’m a Windows user who is rapidly getting desperate to switch to Linux. The lack of progress in Linux running Windows apps is the biggest hurdle but packaging certainly is not far behind.

    At present not only are there just a couple of average competitors in most application markets on Linux, but the sort of preferential treatment Microsoft gave its own middleware in order to kill competition is inadvertently being mirrored by Linux: packaging is limited through default choices and repositories, and terrible packaging makes it a nightmare to whack in a freshly downloaded alternative app of your choice from a website.

    Whilst on the topic, what makes this worse is that even the very latest Ubuntu release produces an error claiming that removing one of the default apps will break your “ubuntu-desktop”.

    Who the hell is going to choose a better application on a foreign platform when they are told they are going to break that platform when doing so?

    Revolutionising packaging is one area Linux could be leading from the front.

    Where 3rd party applications on Windows must generally be updated manually, individually – on top of Windows itself, its middleware and its Office suite – Linux has the repository concept, which is great for mass version/update management but crap in that it locks people into a subset of applications that is distro dependent!

    How can I put this simply (obviously I struggled above)?

    Microsoft and co have given the world the example of how simple unified packaging can be. OK, it’s not perfect, but it’s good.

    Penguins have given the world the idea of centralised software version/update via repository.

    Someone wake up and merge the two ideas and you will have a competitive advantage over Windows!

    Stop squabbling and take your opportunities to be clearly better than Windows instead of slowly waiting for Linux to ‘be ready for the desktop’.

  49. James says: (permalink)
    November 2nd, 2006 at 11:02 am

    Hi,
    I use Gentoo and fail to see the difficulty in typing ‘emerge ’, which some Windows users complain about. It is much easier than finding the download, downloading it, clicking on it and going through the install process. Also, if you’re not connected to the internet, you can download the distfiles on another computer, usb-stick them over and then run emerge. Works fantastically. Part of the reason I left Fedora was that rpm’s dependency checking was rubbish, and getting running and productive on an Ubuntu machine means delving about turning on the multiverse.

  50. Tonetheman says: (permalink)
    November 2nd, 2006 at 11:43 am

    and for the same reasons browsers should have the same name across linux distributions, no matter how arrogant the developers for said OS are… if you want consistent packaging, then consistent naming should follow suit.

  51. Janne Kaasalainen says: (permalink)
    November 2nd, 2006 at 11:56 am

    I’m sure me taking part in the commenting is not the most beneficial use of my time, but just to show my support for the efforts once again…

    Just the other week I updated to Edgy and got to install some proprietary software from RPMs – you know, stuff the repos have nothing against. In any case: alien them to debs, some magic lore at the command line, and there you have it. Almost – some software does not run anymore. How to solve those? Go manually delete some *.so from the application dir.

    Now folks come and tell me this is easy? That a normal human being would do this on her/his own? Of course, Dapper -> Edgy was a rather major upgrade, but even still, such things do happen. People tend to look for software outside of the repos, for one reason or another. A developer does not make the right package, you need something commercial, grandma wears pink socks… There is a multitude of reasons, each right and wrong in its own respects.

    Software installation and development across distros is a major concern, and I’d be happy to have it solved. Even better if there was decent backwards support in the installation phase. Ubuntu has come a long way towards the ‘normal’ people, so hear my greatest thanks for doing it – even if there is still a lot more to do.

    Btw, I agree with the artsy initiative as well, from some days back.

  52. Gabriel says: (permalink)
    November 2nd, 2006 at 11:56 am

    The key to good packaging is integration with the build system. Everybody has tarballs because the tarball is a developer tool. Developers should do some packaging. Packaging must be a useful tool for the developer, not only for the distros. Perl has custom packaging, Java has Maven, and there are some others. The difference here is that developers use them.

    “make install” is the only tool that works everywhere (with Perl and Java, not anymore) because it’s the tool the developer used.

    “developers,developers,developers,developers,developers,developers,developers,developers,…”

  53. Marc Fearby says: (permalink)
    November 2nd, 2006 at 12:00 pm

    A common packaging format for Linux would be nice, and might make a switch to that OS more appealing, but don’t be fooled into thinking that it’s all beer and skittles on the Windows side thanks to InstallShield. There is a plethora of “packaging” systems for Windows (well, installers really, but “packaging” would be the closest match) that makes administration a complete nightmare for anybody other than a home user.

    You’ve also got the Nullsoft Scriptable Install System, Wise, Microsoft Installer (MSI), InstallShield installers that contain MSIs, InstallShield installers that need to install the InstallScript MSI before the main setup will even work, InstallAnywhere, Inno Setup, as well as various other “who knows?” installers that just don’t conform to any of the above installers’ command-line switches – such as Firefox (great browser, btw!), which uses a “-ms” switch for an unattended install instead of the relatively more common “/qb” or “/s” or “/verysilent” switches used by the name-brand installers.

    Sometimes I envy the relative predictability of having just RPM or DEB, so trust me, whilst having a common packaging format would be really nice, you don’t have it all bad!

  54. ernst says: (permalink)
    November 2nd, 2006 at 12:20 pm

    Ubuntu should make auto-updating even simpler. No more “click here to update your system” but updating without notice or involvement of the user (maybe during install one should select ‘complete auto-update’, ‘security auto-update only’, or ‘no auto-update’). That is what normal users want and what Windows gives them. My non-nerdy friends don’t even see the tray icon, but when they do, they say “Why update?”, or “It’s always there, I never click on it”, or “Even when I update, I have to update again in two days – it’s annoying.”

  55. Alex Hudson says: (permalink)
    November 2nd, 2006 at 12:39 pm

    I kinda agree, but if there is a standard, it needs to avoid the mess that AutoPackage is. At the end of the day, current systems relieve a lot more pain than they cause me, and alternatives don’t seem to work as well.

    I don’t think there’s much value in a standard set of binary packaging conventions: at the end of the day, that really limits you or you buy some other pain (ABI versioning, for example). Packaging needs to be totally distribution-native.

    Source packaging – I could very much buy a standard there. If there was a way of packaging up code which you could feed into a tool, and get *good* native binary packages out – that would rock. Perhaps a system like pbuilder, where you can feed a source package to a variety of distributions, and have native binaries come out for each one: with the correct dependencies.

    You’d need high-level primitives for things like init scripts, figuring out locations, configuring software correctly, distribution-specific patches, arch-specific patches, that kind of thing – you’d need to be able to express that kind of thing without writing distro-specific scripts (is my /bin/sh actually bash?), as well as speccing which development libraries (etc.) you need (so, maybe some kind of API -> packaging mapping?).

    But I definitely see that as the problem. Getting the software built right for the platform in the first place is where the value is – all the other stuff is basically just alien.

  56. Scott Rubin says: (permalink)
    November 2nd, 2006 at 12:43 pm

    I can agree that as far as binary packaging is concerned it would be nice if there was just one to rule them all. Disk space is cheap these days, but time is scarce. If there were something for *nix akin to Apple’s universal binary, we’d be all set. Put the 64-bit and the 32-bit binary in one package format and have it work on every binary distro.

    However, I disagree that there is no difference between the binary package and something like Gentoo’s portage. Source-based distributions are an entirely different beast. It would be nice if a source based distro was capable of also installing binary packages of a standard format, but emerge needs to stay. If anything is evidence enough, my roommate has an old Athlon 1Ghz running Gentoo and a Sempron 2500+ running Dapper. The performance is equal. That’s the power of source-based distributions. No universal packaging system, however nice, should replace it.

  57. Clemens Wehrmann says: (permalink)
    November 2nd, 2006 at 12:47 pm

    I’m sceptical that the problem described is one of package formats. As Mark has pointed out, the difference between package systems on a feature level is decreasing towards triviality. Automatic conversion is mostly solved. The problem that remains is one of differences in packaging granularity, libraries, config file locations and integration with distribution policy. I wouldn’t want to install DAG RPMS on Ubuntu (or even on Fedora sometimes) not because they’re RPMs, but because they make different assumptions and policy decisions than the distribution packagers.

    What might help independent third-party installations is more explicit work through the L(F?)SB specifying third-party packaging:
    * package format (already done, LSB specifies rpm which shouldn’t be a problem for Debian/Ubuntu)
    * static-compilation
    * installing everything in /opt so it’s easy to disentangle
    * a capabilities registry (libc version, desktop package) that is hopefully granular enough so that
    1) third party projects could keep the number of binaries they must produce minimal
    2) third party release notes could specify these capabilities as prerequisites and every distribution would have a documented way to implement them
    * a way for native packages to detect and react to third-party installs (be they debs, aliened RPMS, FF plugins, python eggs, ruby gems, C{P,T}AN, etc.)

    So much would already be helped if third party packages would leave /usr alone…

    A commenter above is upset about being unable to install Firefox 2. This is a curious example, as FF2 happened to ship with Ubuntu only days after its upstream release, so what many people seem to want is more timely access to beta software. That is quite difficult, because beta software generally depends on the latest versions of all its dependencies, which requires either updating lots of system libraries (compromising stability), sandboxing, auto-building improvements, or static compilation to the extent possible.

  58. peroyvind says: (permalink)
    November 2nd, 2006 at 12:51 pm

    Uh, nice..

    You’re just expressing what thousands of other people have said before; there is nothing original about this post, yet your disciples go on hailing you for your brilliant thoughts..

    Anyway, following the LSB & FHS takes care of most of the concerns of third-party (~proprietary) packaging. Beyond that, you don’t want to tie distros down to ever more detailed standards just to be able to run whatever package you want on whatever distro you want, unless you want everything statically linked to avoid ABI incompatibilities, everyone doing everything the exact same way, and so on..

    Oh, and by the way, if you want to follow up on such a topic, why not stop continuously reinventing the wheel?
    It seems you want to unite everyone under the same thing and avoid diversity, yet Ubuntu starts projects like a new (non-LSB-compliant) init script project instead of contributing to others, to mention just one example (you know this falls under “diversity”, and is even directly related to packaging consistency too, actually)..

    Diversity is one of *NIX’s strengths, not just a big weakness, and consistent packaging is really not such a big issue. Sorry, but what you want is not possible to fulfill for real in the open source universe without forcing things upon others. Standards are great, even better if people follow them, but beyond that you cannot expect everyone to think and feel the same about software. Even if you manage to pull off what several others have tried before without ultimate success, you will still have people who won’t follow and who don’t agree on all of these aspects.

    So far the FHS & LSB do the trick for most; maybe you can succeed with something beyond that, but I doubt it. There’s a reason it is the LSB and not the FSB, just as there are reasons *BSD is not Linux. Sorry, you cannot unite/assimilate everything under one banner, and one should respect that..

    Sorry, I don’t mean to be rude or to imply any evil motivations on your part, but it all seems a little too ambitious/naive. Time for a reality and/or history check..

  59. Maimon Mons says: (permalink)
    November 2nd, 2006 at 1:15 pm

    There will always be people who download packages off the internet (and always people who publish just .tar.gz files).

    Having yum, apt, rpm, etc. all use the same backend is great and should be done. Better still would be an application that understands what to do with a .tar.gz file and installs everything as needed.

    Maybe a parser could read a .tar.gz, see what is needed (does it have src and require make? If so, install build-essentials), and then install it. If installation fails, the error log should be parsed and another attempt at installation made after the build environment is changed. If no changes to the build environment can be made, the user should get a clear message about what the problem is. Whether build-essentials is kept on the system after the install completes should be an option.

  60. joebobubuntuzealot says: (permalink)
    November 2nd, 2006 at 1:21 pm

    All hail to you Mark!
    You will be the one bringing the open source world together, uniting them all, this is a good battle you can jump into and win!

    Ubuntu will bring peace and unity in the open source community just as George Bush brought peace to the middle east!

    You’re both my true heroes..

  61. Linuxfanboissuck says: (permalink)
    November 2nd, 2006 at 1:24 pm

    You all must be n00bs. WTF are you talking about, there being disparate package systems in BSD? NetBSD created pkgsrc and it’s compatible with all of the BSDs. The Linux kiddies are the only idiots who refuse to create one system that is guaranteed to work across platforms. Not to mention there are over 300 Linux-based OSes in total (although “only” about a hundred are mainstream distros: http://www.linux.org/dist/list.html). Stop bitching about the problem and help come up with a solution if it bugs you so much. And in case you’re wondering, I use Gentoo and FreeBSD for my *nix needs.

  62. Salin says: (permalink)
    November 2nd, 2006 at 1:36 pm

    As a user (or previous user) of all the aforementioned packaging systems (Debian’s apt, Red Hat’s rpm, and Gentoo’s emerge) I’d like to make a few observations.

    1. Each system has its advantages and disadvantages. Just as there are different window managers (KDE, GNOME, *box, Enlightenment, etc.), there will be different packaging systems. Some will work better than others for what you need.

    2. Binary packages are nice if you shoot straight down the middle of the course (in terms of functionality and optionality) and can wait for someone to build them for your architecture. Source packages allow somewhat more flexibility but still need to be customized to the architecture. Source itself allows you the greatest flexibility, but comes only with the software developer’s guarantee of working with your distribution.

    3. Unifying the distributions (or breaking down the partitions between them) would be nice, but given the current state of rabid fanaticism between the different distributions, I suspect it will be a while before the entire FLOSS community sits down and standardizes across all of them.

  63. Steve says: (permalink)
    November 2nd, 2006 at 1:49 pm

    One problem with “universal packages” is that, although there is great similarity across Linuxes and other *nixes, they all have slight differences in directory layout. Debian has certain ways of organizing /etc that differ from Fedora, and there are larger differences with *BSD and of course big differences with OSX.
    Another reason is that they use different versions of libraries, or even different naming schemes for the same libraries.
    Obviously there are reasons for these differences — they aren’t just random choices. So getting everyone to agree and modify their distros to a common standard is probably barking up the wrong tree.
    Instead, a package could carry a better description of its contents. It could contain a universal, agreed-upon description file: “these are executables”, “these are libraries”, “these are resources”, “these are documentation”… Each distro could have its own package manager, as long as it knows how to interpret this description. Additionally, I suppose, binary apps would have to be compiled in such a way that they can locate their resources in a standard way; it would require only a single common function call, “WhereIsMyResourceFolder()”.
    Linux could put the right files in /usr/share, /usr/lib, etc. OS X could copy the files into a .app bundle.
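
A toy illustration of that idea. The manifest format, category names, and placement tables below are all hypothetical, invented for this sketch; nothing here is a real standard:

```python
# Hypothetical universal manifest: the package declares what each
# file *is*, and each platform decides where such files *go*.
MANIFEST = {
    "bin/myapp":       "executable",
    "lib/libmyapp.so": "library",
    "share/icon.png":  "resource",
    "doc/README":      "documentation",
}

# Per-platform placement policy (illustrative paths only).
PLACEMENT = {
    "linux": {
        "executable":    "/usr/bin",
        "library":       "/usr/lib",
        "resource":      "/usr/share/myapp",
        "documentation": "/usr/share/doc/myapp",
    },
    "osx": {
        "executable":    "/Applications/MyApp.app/Contents/MacOS",
        "library":       "/Applications/MyApp.app/Contents/Frameworks",
        "resource":      "/Applications/MyApp.app/Contents/Resources",
        "documentation": "/Applications/MyApp.app/Contents/Resources/doc",
    },
}

def install_plan(platform):
    """Map each packaged file to its destination on this platform."""
    policy = PLACEMENT[platform]
    return {
        src: policy[category] + "/" + src.rsplit("/", 1)[-1]
        for src, category in MANIFEST.items()
    }
```

The point of the design is that the package never hard-codes destinations; each distro’s package manager supplies its own policy table.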

  64. massysett says: (permalink)
    November 2nd, 2006 at 2:24 pm

    This has been debated for years, but has never gotten anywhere. My general impression of why it has gotten nowhere is because “standard packaging” sounds nice when it’s in the pie-in-the-sky phase, but when someone actually sits down with pen and paper to come up with a concrete scheme, you have to face the fact that there are many political and technical hurdles to “standard packaging” and, furthermore, that “standard packaging” is mostly a solution that is looking for a problem.

    That said, why not mention autopackage? It seems to do at least some of what you may be looking for.

    Also, if standard packaging is so great, why doesn’t Ubuntu standardize on Debian packages? Many Ubuntu users say that one cannot take a Debian package and install it on Ubuntu without causing breakage. Why isn’t Ubuntu leading the way?

  65. superzaxxon says: (permalink)
    November 2nd, 2006 at 2:39 pm

    I think that the ‘Gobolinux’ approach is the best basis for the new “FSB”: getting rid of package managers and the legacy filesystem structure, which is another important issue that confuses a lot of ‘newcomers’. Why is it so complicated when it can be simplified?

    http://www.gobolinux.org/index.php?page=at_a_glance

    I would like to have a group examine 0install, klik, autopackage, Gobolinux, etc. in depth and just take the best of each.

  66. vk says: (permalink)
    November 2nd, 2006 at 2:46 pm

    Personally, my eyes have turned upon something like alien as a possible solution, that has the ability to convert packages into a different format. It’s a small utility and I don’t *think* it would be too much of a problem to add its functionality on to every major package manager and make it more GUI friendly. No one may be able to agree on packages or package managers, but we could at least make it possible to install packages that don’t match what our system normally uses. Alien doesn’t *always* work, but in my experience it has in the majority of cases.

    http://kitenet.net/~joey/code/alien.html

  67. Steve says: (permalink)
    November 2nd, 2006 at 3:14 pm

    Someone NEEDS to create an Ubuntu-like distro based on 0install. http://zero-install.sourceforge.net/

  68. penguin42 says: (permalink)
    November 2nd, 2006 at 3:17 pm

    I think one of the problems is over-constraining packages: often you pick up debs that were built on a particular Debian derivative or version and have dependencies on that particular version of libc and other packages, when in most cases they will happily work with an older version.
    The difficulty, of course, is testing, since you tend to test against only one of them; that makes me wonder whether a ‘desired’ version and a ‘required’ version would be useful.
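
The ‘desired’ versus ‘required’ distinction could be modelled roughly like this. It is only a sketch: no real package manager uses these exact fields, and the dotted-version parser is deliberately crude:

```python
def version_tuple(v):
    """Crude dotted-version parse, e.g. "2.3.6" -> (2, 3, 6)."""
    return tuple(int(part) for part in v.split("."))

def check_dependency(installed, required, desired):
    """Return one of "refuse", "warn", or "ok".

    required: the minimum version the package is believed to work with.
    desired:  the version the package was actually built and tested against.
    """
    if version_tuple(installed) < version_tuple(required):
        return "refuse"   # known not to work at all
    if version_tuple(installed) < version_tuple(desired):
        return "warn"     # probably works, but is an untested combination
    return "ok"           # matches the tested configuration
```

A package manager could then install on “warn” rather than refusing outright, which is exactly the relaxation the comment is asking for.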

    Another generic problem with packaging is installation outside the system hierarchy: most packaging systems really don’t gracefully handle installing a package in, say, your home directory. This should be possible, to let users try things without needing root.

  69. Alex says: (permalink)
    November 2nd, 2006 at 3:30 pm

    I know this might be a little naive, but that’s the problem with Linux ‘distributions’: they muddle up applications with the operating system. If Linux distros concentrated on providing a stable, useful OS with all the libraries and features application providers need, then a unified package management system would work, and it would be good news for users.

    The idea of distro-specific package repositories as the way to get applications is never going to be the right ‘solution’, in my opinion. Yes, they are a great method of updating and supplementing OS functionality. And yes, they have a part to play in creating the base on top of which applications can be installed. But they should NOT be used to distribute the applications themselves.

    In my mind you should be able to go to the source of the application and download it, then install it on your system. The installer may need to ask the OS to update itself with required libs in order to facilitate the install, but there needs to be a clear division as to what’s the responsibility of the OS and the responsibility of the app vendor/provider.

    Every time you install a Linux distro you always run into problems when you want to install an application that’s not in the ‘stable’ repositories. Either you go for something in the unsupported repositories, or maybe download the source distribution from the provider and try and make it fit your distro. Either way if it doesn’t work you’re left in the dark with few avenues to explore when you run into problems, as you inevitably will. This situation is totally unacceptable for business users and more than a little irritating for home users.

  70. Marko Macek says: (permalink)
    November 2nd, 2006 at 4:18 pm

    The solution:

    1. agree on a ‘future’ packaging standard (rpm or deb or ?)

    2. push the packaging upstream (or at least into a common ‘linux’ repository shared between distros).

    As long as distros are maintaining packages separately, there will not be any unification.

    This should be part of requirements of LSB, etc…

  71. WantToDivorceWinXP says: (permalink)
    November 2nd, 2006 at 4:23 pm

    Hello everyone,

    Hey, I’m the “typical” WinXP end-user. I don’t know diddly about packages, apts, sudo, root, boot, filesystems, etc., and yes, as an end user I really don’t care to learn. Windows has taught me one thing: I can get my work done without knowing the “nitty gritty” about the OS. I’m sure I echo many voices when I say I want to divorce Windows, but I can’t until Linux is dumbed down like Windows… I give huge kudos to all the folks in the Linux world who have brought my love affair with Linux closer with each year and each new revision. Especially so to Ubuntu, for making the first Linux I’ve ever wanted to bring home and introduce to my parents!

    I know dumbing it down is not a real solution, and I should probably educate myself better on Linux, but if you want people like myself to make the switch (and stay switched!), Linux needs to make it easy for newbies to get up and running with equal or greater functionality to what we enjoy today. I like being able to go to a website and find the link for Firefox that says: Firefox for Windows, click here! No confusion, no mess… a double click and a few “Next->Next->Finish”es later and I’m using it. If I want to uninstall it because FF2 broke my favorite plugins or whatever, no problem there either… I just click on the shortcut for “Uninstall FF”. I don’t have to think about which library I have or don’t. I don’t need to worry about where it’s installed, because it defaults to a single location, the “Program Files” folder. There’s a simplicity about the dozen or so things I need to know in order to make Windows “work for me”. I don’t have to think about opening a command prompt to “complete” an install, or download and compile source. It just runs (albeit buggy sometimes).

    I didn’t learn stuff about Windows overnight, but rather over the course of many long and painful years. The same will be true for Linux, but the first step to getting onto the Linux bandwagon is to remove some of the larger obstacles so I can stay productive while I ramp up. The last time I tried to set up an Ubuntu environment that mirrored the functionality I’ve come to take for granted in Windows, I stopped trying Linux… simply because it got to be too much: reading how-tos, entering commands as a special user in a command prompt, library versions not correct, etc. This is one of the first and largest hurdles idiot end-users such as myself face, and also the one that keeps a lot of us from giving MS the “Big Goodbye!” I love the idea of being able to go to one main repository (website/ftp/whatever), selecting the programs I want installed, and then just having it work! If I want to uninstall something, I just double-click the shortcut that removes it. (Similarly, a filesystem structure that makes sense across distros is a close 2nd hurdle… but that’s a different issue altogether.) That would be the key feature to convince folks like me to switch… keep me productive… and still keep me interested in learning more, as I did with Windows…

    Everyone in the Ubuntu world…please keep up the excellent work! Please don’t be dismayed or offended by my comments, just wanted to offer my $.02 in the hopes that all you Linux guys can help people like myself divorce ourselves from our Windows dependency (and soon to come, extortionist licensing costs of Vista).

    Sincerely,

    WorldlyBedouin

  72. Hendronicus says: (permalink)
    November 2nd, 2006 at 4:31 pm

    I know this might sound naive, but what about statically linked binaries? You could have the base OS, and all of the installable components could be linked statically. Yes, they would take up more space, but not that much more if GCC were fixed to work better at relocating routines. If this were the de facto standard on Linux, like it is on Windows, you would have a much easier time moving software in and out of a system. I don’t know of any Linux systems like this already; maybe Ubuntu could be the first.

  73. Mariano says: (permalink)
    November 2nd, 2006 at 4:41 pm

    Hello,

    How come no one mentioned the PBI system used in PC-BSD?

    I don’t know what might be wrong with it, from a technical point of view, but it just works.

    Perhaps use that in all the Linuxes and BSDs?

  74. gianni says: (permalink)
    November 2nd, 2006 at 4:57 pm

    jose hevia got it right. You have to target developers first. Then you will be able to focus on end users.

    Plus, don’t be so interested in new customers. You have to keep the ones using it today; if you fail at that, you will not be able to attract new users.

    I strongly suspect you wrote this because of the Dapper > Edgy nightmare some users experienced. If so, that’s the right way to run a distro: by doing everything possible to keep your user base.

    Keep going !

  75. RMX says: (permalink)
    November 2nd, 2006 at 6:10 pm

    I disagree.

    None of the existing package systems is perfect, and the competition among them lets them evolve through forking in the way that F/OSS does best.

    In the near term, I wonder if we’d be better off seeing *more* packaging alternatives rather than fewer, in the hope that one of the new ones will handle some of the things the current ones don’t do gracefully.

    Among the main features I’d like to see a packaging system do well is UNDOing an install that did more harm than good (like every upgrade that breaks X, or vmware, or …).

  76. ljb says: (permalink)
    November 2nd, 2006 at 6:31 pm

    Amen. I couldn’t move to Linux until Ubuntu arrived, due to dependency hell. What beginner can understand that? Package management is so much better than on Windows if you do anything semi-advanced (like programming).

    The problem is that, IMHO, everything other than synaptic is junk. You’d have to get Red Hat and Novell and all the others to scrap what they’ve got and use Debian’s package management.

    And yes, don’t even think about an installer like Windows. Why on earth use that when we’ve got synaptic and gdebi and so on? Anyone who thinks software installation is better on Windows is dizzy. Package management is one of the main reasons I stick with Ubuntu over Windows.

  77. Mateusz Lange says: (permalink)
    November 2nd, 2006 at 6:58 pm

    Hi!

    I hope that you will send me regular mail by post, because I have never received mail from a guy who was in space…,

    This is important to me because I’m an enthusiast about space. I’m a student at ETL, the European Flying Technician School :) Maybe some day I will also go to space. My name is below and my address is

    Mikolajczyka 2/13
    03984 Warszawa, Poland

    Hope to hear from you.
    Mateusz Lange

  78. Amirouche A. says: (permalink)
    November 2nd, 2006 at 7:28 pm

    You could create a new working group, like freedesktop.org, to deal with such issues. I don’t really know the impact of fd.o on the desktop, but I appreciate some standards, like an easy way to build menus on my *wm box. Is it not your job? Or do you fear a flamewar about your involvement in the Free/Open World? Don’t worry, go ahead.

  79. Matthew C. Tedder says: (permalink)
    November 2nd, 2006 at 7:32 pm

    I have been screaming about this for YEARS. A nearly ideal solution presented itself in the form of Autopackage. Autopackages carry the installer and the needed dependencies together in one package, to make them painlessly installable on any (GNU/Linux) distribution. It checks whether a dependency is there and installs it as necessary.

    However, dependency libraries may or may not work depending on what make-options they were compiled with, and there is no way of knowing automatically. Furthermore, dependency chains can sometimes be unrealistically large (e.g. kdecore, which then requires a long string of others).

    I have long been proposing a solution, which can now be implemented with these amendments to Autopackage:

    (1) Add the make-options into the file names of .so files, like so: “packagename-1.0.2-makeop1-makeop2-makeop3.so”. Otherwise you cannot guarantee the stability of the software you install.

    (2) Establish a base library standard that includes the most commonly used library dependencies. This could be a package in and of itself. But the object is simply to reduce the required size of subsequently installed Autopackages.

    (3) Create an on-demand, remote Autopackage composer. A composer-client would check your local system dependencies required for the package, and tell the composer-server what dependencies it needs to put into the package prior to downloading and installation.

    (4) Build a master, BSD-like Ports system for source-code maintenance. This would greatly improve code-quality and interoperability of any participating projects. It could also work toward resolving mutually-exclusive dependency issues, whereby one piece of software cannot be installed simultaneously with another. In fact, a source-based meta-distribution could be constructed for the automatic production of Autopackages that meet all four of these concepts. And a distribution construction facility making it easy for anyone to compose their own distributions could be easily built to act as a catalyst for the proliferation of this system.

    If any of the above four ideas are put into practice, the GNU/Linux systems will become substantially easier for end-users. If all of them are adopted, then virtually every package will be easily installable on any GNU/Linux distribution–and the market suddenly becomes very large and profitable for ISVs, as well. No other concepts could possibly do more to bring GNU/Linux to the masses, particularly so on the desktop.
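
Point (1) amounts to making the build configuration machine-readable from the filename alone. A minimal sketch of such an encode/decode pair, using the naming scheme from the comment (the scheme is hypothetical, not something any real loader understands):

```python
def encode_lib_name(package, version, make_options):
    """Encode build options into the filename, per the scheme above.

    Assumes package names contain no dashes; a real scheme would need
    a less ambiguous separator.
    """
    parts = [package + "-" + version] + sorted(make_options)
    return "-".join(parts) + ".so"

def decode_lib_name(filename):
    """Recover (package, version, options) from an encoded filename."""
    stem = filename[:-len(".so")]
    package, version, *options = stem.split("-")
    return package, version, set(options)

def satisfies(filename, wanted_options):
    """Would this installed library match a dependency's build needs?"""
    _, _, options = decode_lib_name(filename)
    return wanted_options <= options
```

An installer could then scan the library directory and decide, without guessing, whether an existing copy was built with the options the new package requires.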

  80. Matthew C. Tedder says: (permalink)
    November 2nd, 2006 at 7:38 pm

    Oh yes, and the best part is that this doesn’t require the participation of any distribution in particular. It works in spite of them. It just needs people to do perhaps roughly 175 man-hours of development work and then let the system start pumping out Autopackages automatically. The world is saved.

  81. mario says: (permalink)
    November 2nd, 2006 at 8:13 pm

    I beg people to stop recommending autopackage, because it is a real non-solution. Unlike typical Windows installers, autopackage archives DO NOT contain their dependencies and instead try to download them from the net. Autopackage installers are therefore no better than .debs or .rpms with their dependencies, and don’t allow for easy installation on computers without an online connection.

  82. Worth a Shuttle says: (permalink)
    November 2nd, 2006 at 8:48 pm

    Geez, this Mark dude is a genius. A cross-distro consistent packaging standard! Why didn’t anyone think of that before? Why?

    So, Mark, how long have you been into this Linux thing?

  83. stelt says: (permalink)
    November 2nd, 2006 at 9:44 pm

    On a different type of packaging (always having THE installer at hand), some thoughts on spreading open software:

    What about a minimal, general open-software installer that auto-updates, has smart caching, and generally “just works”? I call it “the freedom drive”.

    For example: since ‘everybody’ knows the “freedom drive”, I ask to borrow the USB stick of the next guy, stick it in my computer, choose my favorite distro to install or just run live (or answer some questions to determine which one I need), and GO! The menu from which to choose is filled dynamically, depending on what is available in the cache or over a connection.

    For it to really work it should also have free software for Windows, “fix your broken Windows” functionality (Knoppix-like), and non-free packages.

    Good branding can make all the difference.

  84. I do what I can… » The problem about “consistent packaging” says: (permalink)
    November 2nd, 2006 at 9:58 pm

    [...] I was reading Mark Shuttleworth’s comment about Consistent Packaging, and although I’ve not yet read all the comments on his post, I have my own concerns that I wish to express here (so I won’t pollute his mailbox with redundant comments, if that’s the case). [...]

  85. nate says: (permalink)
    November 2nd, 2006 at 10:58 pm

    You dipshit.

    You’re great and Ubuntu is wonderful and all that. I appreciate what you’re doing and all that.

    BUT SERIOUSLY.

    One of the major criticisms of Ubuntu early on was that you guys didn’t plan on remaining compatible with Debian Stable and Stable’s packaging scheme.

    You guys said ‘Oh, no, we don’t need to join the Debian consortium and make sure that packages are compatible. That isn’t a problem for Ubuntu!’ and some such bullshit.

    And now you come back and say that we need more consistent packaging?!

    I can’t believe this! UBUNTU IS PART OF THE PROBLEM. YOU COULD HAVE REMAINED COMPATIBLE, BUT YOU SPECIFICALLY CHOSE NOT TO!!

    I mean, seriously. Choose one way or the other.

    Remember the DCC?
    http://www.dccalliance.org/
    Remember Componentized Linux?
    http://componentizedlinux.org/index.php/Main_Page

    You remember other Debian-based operating systems working together on making sure that software remained compatible across packages and systems?

    People working on finding practical solutions to software compatibility problems in Linux…

    Now do you remember COMPLETELY BLOWING THEM OFF?

    “The Premise. The vision behind DCC, which is indeed compelling, is that it would provide a common platform for certification, and that the distros that make up the DCC would all ship exactly that same core. But it strikes me that this approach has never worked in the past. In fact, every distro ALWAYS modifies elements of the core, and with good reason. And while we would love that not to be the case, the truth is that the reasons to specialise outweigh the benefits of homogeneity.”

    YOUR QUOTE.

    Now people are treating this ‘Consistent Packaging’ blog entry like you’re a freaking genius or something.

    YOU CAN’T HAVE CONSISTENT PACKAGING WHEN OPERATING SYSTEMS PURPOSELY BREAK COMPATIBILITY.

    How the hell are you going to ensure that software remains compatible? Even if all the software uses the same freaking package management scheme, it’s still not going to fucking work, because the files are going to be in different places and the library version differences will ensure that software remains incompatible.

  86. Netmaven says: (permalink)
    November 2nd, 2006 at 11:12 pm

    A GOOD EXAMPLE — My son at 4 YRS OLD can easily install programs on MS Windows.

    1. Double click the program
    2. Click the “yes/next” buttons
    3. Click the new icon on my desktop to play my new game.

    It really is that easy for him. Why can’t Linux / Ubuntu be this easy? My son is now 6 YRS OLD and still can’t install Ubuntu programs to save his life!

    My point, Mark, is that I love Ubuntu… But until installing programs on Ubuntu is THIS easy (4 YR OLD easy), people will keep using Windows :(

    People (the general population) are a lot like my son: they just want software so easy a 4 YR OLD could do it!

    Give them that Mark and you will rule the world :) – Best Wishes.

  87. Alex says: (permalink)
    November 2nd, 2006 at 11:29 pm

    Well, just my $0.02 here… As a USER of an OS, all I want to do is INSTALL software and then use it, preferably without a reboot… I DON’T want to know anything about “packages”, “apt-get” crap, “synaptic”, etc.… I WANT to download something, click it, and ‘voila’, it’s on my machine… That’s what Linux developers MUST understand if they really want this OS to be widely used… But I believe all most of them really want is to look cool and smart: “wow, I am a Linux guru blah-blah-blah…” That is why a commercial product is forced to adapt to what users want, so they keep buying it, right? Computers are for people, not for geeks who live with their computers 24/7.

    I think I know a bit about computers, and I started playing with Ubuntu recently as a second OS on my laptop… But seriously, if I have to google for an hour to figure out that my WiFi with WPA2 works only with extra software that I have to install by typing 5-6 commands into a terminal window, then I will think, “well, that sucks!”… Or if I want to install something and it keeps telling me about packages having the wrong versions and so on, I will be upset and will decide to switch back to Windows, where I can just run an .EXE file and get the same software working… And seriously, COMPILING stuff??? Are you out of your mind??? Even I, a developer with many years of experience, DON’T want to compile somebody else’s crap code! I’m sick of it at work and in my home projects! As poster ‘fish’ said: it’s the 21st century, get up to speed and make your system USABLE!

    P.S. Mark, I think Ubuntu is the best distro out there, and I understand that you alone will not be able to change the way things are done. I appreciate your involvement and contribution, but I am more and more pessimistic about the large-scale success of free software, because there are way too many arrogant people involved in “free software” development who are not going to change… Like the story about Firefox and their name and logo… And it’s a bit different when there is money involved, wouldn’t you agree? It’s a shame…

    P.P.S. And I cannot stand it when people say something like one of the posters here said: “dpkg -i :-s or add the repository for firefox 2 and apt-get install :-s”… THIS IS DUMB! Clicking on an EXE is EASIER! That’s what it boils down to for 99% of people out there!

  88. Artem Vakhitov says: (permalink)
    November 2nd, 2006 at 11:43 pm

    You should seriously consider Autopackage. It’s a great packaging framework for application level (non-core) software that provides real distro independence by resolving C++ and GLIBC ABI incompatibility problems and providing universal binary relocatability. The only technical reason currently preventing its takeoff is the fact that it’s forced to use /usr for multiuser install – due to the poor support for /usr/local and /opt prefixes in distros – and therefore can cause conflicts with the native package manager. (The specific list of distro-related problems can be found here: http://plan99.net/autopackage/What_can_distributions_do_to_fix_broken/usr/local_support ). However, this is easy to overcome if standardization on these issues takes place.

  89. elvenmuse says: (permalink)
    November 2nd, 2006 at 11:59 pm

    I agree. We should standardize on the package manager and format. Using .tar and extending XML? Using URLs as IDs in tar/info.xml?

    I’m interested, and I organize my time so that I always have some free time. I can help with ideas and some coding if necessary.

  90. macewan says: (permalink)
    November 3rd, 2006 at 12:42 am

    For people new to Lin*, I’d prefer the way .dmg’s are installed on my iBook: drag the program to your ‘Applications’ folder, input your password in the popup window, and it installs.

  91. jobezone says: (permalink)
    November 3rd, 2006 at 2:00 am

    I don’t want to start spamming the comments with links, but there is an interesting project at http://www.getdeb.net which packages software which is not in ubuntu’s repositories.

  92. jsc says: (permalink)
    November 3rd, 2006 at 2:57 am

    From the user’s perspective the Mac way of installing is by far the best. It is just so easy: start a download, and as the download is finishing the OS automatically recognizes it as an app and unzips it, and then you drop it into your Applications folder. Finished. Now with universal binaries it is one download for both Intel and PowerPC systems. Software provided by Apple can be updated with one weekly check. A lot of third-party software checks automatically for updates and then asks the user whether they want to update.

    Obviously you don’t want server software to work this way, but for desktop users it really is great.

  93. Shuttleworth: Linux distros need consistent packaging « This too was Dugg by the Reverend Deacon Nikolai says: (permalink)
    November 3rd, 2006 at 4:17 am

    [...] read more | digg story [...]

  94. Steven Wagner says: (permalink)
    November 3rd, 2006 at 5:13 am

    Fragmented packaging systems are a problem, and fragmentation is the problem in general. The two largest channels on freenode are #ubuntu and #gentoo. Find a way to unite Gentoo’s source-based distribution (emerge) with a binary package system (apt-get), and the two largest Linux user bases in existence will be able to merge, since they have the same type of community and the same common goal. Everyone would like to see this happen.

  95. Ken Joy says: (permalink)
    November 3rd, 2006 at 6:58 am

    Linux packaging should be united; I will continue to focus on this topic. Mark, new ideas welcome.

  96. Joseph says: (permalink)
    November 3rd, 2006 at 2:09 pm

    Hello Mark,

    You are right on target on the issue of cooperation in the FOSS community. I have often dreamed of the same inter-community cooperative spirit for FOSS’s future. Together we can overcome some serious obstacles to its adoption. Having been in the BSD, Unix and Linux communities, I have seen and felt the tension often. These childish notions are drastically perpetuated even within our own Linux communities. Truth be told, none of us would have FOSS had it not been for the amazing work of the projects that came before us, from all ends of the FOSS community. We should recognize that our collective ends are the same and start working toward them.

    Mark you are uniquely positioned to be a pioneer here once again. =) Rest assured many of us are here waiting for the day when we can see a cooperative FOSS development model as the norm. Count me in! Long live the Free and the Open!!!

  97. Links for 2006-11-03 at 爱晚尚明 says: (permalink)
    November 3rd, 2006 at 3:50 pm

    [...] November 3rd, 2006 by J linux, software, web, windows, 推荐阅读#12: Consistent Packaging [...]

  98. Wearyman says: (permalink)
    November 3rd, 2006 at 7:21 pm

    I agree with Mark on this one. This is one of the things that has kept me from swapping over to Linux full time. I’m sick and tired of wanting to install something and having to frigging compile it myself, or running into dependency hell. Ubuntu has brought this a long way with its use of Synaptic, but I think Linus Torvalds needs to take a stand on this one and select a single installer package for the Linux kernel, and do everything he can to get all the various distros to use it.

    I know he doesn’t “run” the Linux kernel like BG runs MS, but what he says carries a lot of weight, and people listen to him. If he were to stand up and say “We need to pick one package management system for all distributions, and my choice would be X,” then most vendors would likely follow. Within a few years we could get everybody swapped over to a single system if we worked together.

    Alternately, we could have a consortium of distro heads get together and collaboratively design a system that would not only have its own package setup, but would be backwards compatible with the current formats. This would make switching easier for all concerned, and nobody would lose face.

    No matter what we do, we need to do it quickly. This is 2006; we should have fixed this by now.

  99. António Meireles (aka doniphon) says: (permalink)
    November 3rd, 2006 at 7:34 pm

    The first part can be summed up in one word: *Demagogy*. No more, no less. It is true that from an end-user point of view the underlying package manager doesn’t matter too much, but that is simply false for developers. Religious questions apart, neither rpm nor apt (to cite just the big ones) scales, nor are they simple enough. They have a big side effect (some may say it is an advantage) in that they tend to force big and monolithic distros (the many derivative distros that come from RH, Ubuntu or Debian are not a signal of the strength of rpm or apt, but evidence that **by definition** neither rpm nor apt is suitable to scale).

    There is of course an alternative, a damn good one: conary (http://wiki.rpath.com/wiki/Conary), which even works… The trouble with conary (from the big players’ point of view) is simple: it changes forever the economics of Linux-based OSes, in the sense that it changes the paradigm. So, in short, they say packaging doesn’t matter. It matters a lot. Linux isn’t supposed to be simply simpler for users; it should and can be simple and productive for devs, and conary allows that, really.

    P.S. conary already talks (s)rpm, which is natural since the people who are developing it wrote… (yes!) rpm. With minimal effort it could understand debs. If Mark is really serious, then why not be bold? (Or does Mark suffer from the not-invented-here syndrome too, like others?)

  100. JanC says: (permalink)
    November 3rd, 2006 at 9:24 pm

    macewan: you can double-click a *.deb to install it, which has almost the same “complexity” as the *.dmg drag’n’drop…

  101. cantormath says: (permalink)
    November 3rd, 2006 at 10:30 pm

    People are too hasty to install packages that are not always stable. And the thing is, the folks who write the packages will tell you whether a particular item is stable or not so stable. People need to stick to the stable versions, especially when doing a complete conversion from Windows to Linux. My department wanted to switch everything over to Ubuntu and had decided to go with Edgy. I told the sysadmin this was not such a good idea and that Edgy is called Edgy for a reason. The same thing happened when everyone started to switch over to Dapper: they all expected stability during the beta period and would come onto the boards, angry that particular packages were not working.

    just a thought….

  102. macewan says: (permalink)
    November 4th, 2006 at 12:08 am

    @Wearyman, what does he have to do with this? Stallman would carry more weight in a situation like this than Linus.

  103. Harlem says: (permalink)
    November 4th, 2006 at 4:41 am

    I am inclined to agree with you. A friend of mine at work knows that I do a podcast about Ubuntu, and I tout open source operating systems all the time. He is of course a Microsoft user and he states that he always will be. When asked why, he simply states that it is difficult to install software on Linux-based distros. I get the usual statements like “I have to also download all the dependencies” or “why can’t it be like Windows with one-click installations?”. Yes, I have told him about the Synaptic package manager and tools like it, but he remains unconvinced. Standards in packaging will help users of all operating systems see that Linux-based distros are not just a bunch of wayward revolutionists.

  104. Lesley Clayton says: (permalink)
    November 4th, 2006 at 9:59 am

    Mark, you know what?
    I am really enjoying your blog space. I have been learning a lot of new things from what you and others post! If I am learning then I am growing, and that’s all the good stuff!
    So actually what I am saying is that I really appreciate it!!

  105. olorin_ says: (permalink)
    November 4th, 2006 at 12:18 pm

    “So, Mark, how long have you been into this Linux thing?”

    A lot longer than you I guess and I’ll bet development of Debian doesn’t show on your CV.

  106. Don Ray says: (permalink)
    November 4th, 2006 at 2:18 pm

    With the Microsoft and SuSE announcement, do you see this as being good for the entire Linux community?

  107. Amirouche A. says: (permalink)
    November 4th, 2006 at 4:35 pm

    Lesley Clayton: Mark is far from being a god, or anything else that might match that. You may learn tons of other things in other blogs; stop being stupid and open your mind.

  108. Lesley Clayton says: (permalink)
    November 5th, 2006 at 8:00 am

    Amirouche A: Now, talking about an OPEN MIND, you have no idea what I am actually watching and learning. Talk about NARROW MINDED: you could probably “look through a keyhole with both your eyes!”

  109. Shuttleworth: Linux distros need consistent packaging » KOKYUNAGE NEWS » Not the usual mix. says: (permalink)
    November 5th, 2006 at 8:39 am

    [...] A long, long time ago, packaging was an exciting idea. There were disputes over style and process, there was innovation. There were reasons to prefer .deb over .rpm over emerge and its binary packages… Today, these differences are just a hindrance. read more | digg story [...]

  110. patpi says: (permalink)
    November 5th, 2006 at 11:13 am

    http://autopackage.org/ui-vision.html
    The User Interface vision by Mike Hearn

  111. Shuttleworth: Linux distros need consistent packaging « This too was Dugg by the Reverend Deacon Nikolai says: (permalink)
    November 5th, 2006 at 10:24 pm

    [...] read more | digg story [...]

  112. Brent Royal-Gordon says: (permalink)
    November 6th, 2006 at 7:06 am

    I’ve been thinking about this problem all day. I think that we could create a Universal Package somewhat similar to Apple’s Universal Binary.

    Essentially: Build a version of the package for each system (distribution/version combination) you want to support. Go into these native packages and remove any compression (like the two files in a deb). Designate one of the native packages as the “base version” and take binary diffs between it and each of the others. Now package up the base version and all the diffs into a .up file, compress it, and ship it out.

    To install a Universal Package, you decompress it, extract the base version and the diff for your system, apply the latter to the former, re-apply whatever compression you removed to create the diffs, and voila–you have a native package appropriate for your OS. Just install it.

    There’s nothing in Universal Package to handle dependencies, but that’s okay–the native package should contain that information.

    I think this will produce fairly reasonable sizes because a lot of the stuff in a package doesn’t really vary between distributions. Resources and documentation may change locations, but their contents are pretty much unchanged; heavily algorithmic code shouldn’t require much alteration either. And really, how big are the differences between a Debian and an Ubuntu binary, especially considering that so many Debian binaries run perfectly on Ubuntu?

    Incidentally, you could use this to create multi-platform packages, although the diffs would be much larger in such cases. (You might want to have a sort of multi-level system with this where amd64-ubuntu-6.10 packages are created by starting with the i386-debian-sarge base, applying the amd64-debian-sarge diff, and then applying the amd64-ubuntu-6.10 diff.)

    (Hmm…it’s a shame that Summer of Code is now six months away–this might make a good project for it…)
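
Brent’s diff-based scheme can be sketched concretely. The following Python is only an illustration: it uses difflib as a naive stand-in for a real binary-diff tool such as bsdiff, and the package payloads, target names, and the `.up` layout are all invented for the sketch.

```python
import difflib

def make_diff(base: bytes, target: bytes):
    """Naive binary diff: a list of opcodes turning `base` into `target`.
    A real Universal Package would use something like bsdiff instead."""
    sm = difflib.SequenceMatcher(None, base, target, autojunk=False)
    return [(op, i1, i2, target[j1:j2])
            for op, i1, i2, j1, j2 in sm.get_opcodes()]

def apply_diff(base: bytes, ops) -> bytes:
    """Rebuild the target package from the base version plus its diff."""
    out = bytearray()
    for op, i1, i2, repl in ops:
        out += base[i1:i2] if op == "equal" else repl
    return bytes(out)

# One hypothetical .up file: a designated base package plus per-target diffs.
base_deb = b"payload built for debian-sarge ..."
ubuntu_deb = b"payload built for ubuntu-6.10 ..."
up_file = {"base": base_deb,
           "diffs": {"ubuntu-6.10": make_diff(base_deb, ubuntu_deb)}}

# Installing on Ubuntu: apply the matching diff to recover the native package,
# then hand the result to the native package manager.
rebuilt = apply_diff(up_file["base"], up_file["diffs"]["ubuntu-6.10"])
assert rebuilt == ubuntu_deb
```

As Brent notes, the closer two distros’ binaries are, the smaller the stored diff, which is what would keep a multi-target `.up` file at a reasonable size.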

  113. Maimon Mons says: (permalink)
    November 6th, 2006 at 12:28 pm

    The one thing about a “universal” repository: Dependencies will have to be “real” and not just set to the latest versions of a package. This is actually a problem with Ubuntu as well.

    When running Dapper, I saw a package or two in Edgy that I wanted to install. I downloaded the .debs from the Edgy repository and tried installing them in Dapper. It said that I didn’t meet a lot of dependencies that were just version bumps of packages I already had installed. Now, if those new package versions were absolutely necessary, that’s one thing, but I could have just compiled the package from source and installed it on my system, suggesting that it’s not the new versions I needed at all.
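
The versioned-dependency distinction Maimon describes lives in each package’s control metadata. A hypothetical Debian control fragment (package names and versions invented for illustration):

```
Package: someapp
Version: 1.4-1
Depends: libfoo1 (>= 1.2), libbar2 (>= 2.0)
```

A minimum-version constraint like `libfoo1 (>= 1.2)` records what the program actually needs; a package that instead depends on whatever exact build happens to be current in one release’s archive is what makes a deb from Edgy refuse to install on Dapper even when the older libraries would have worked.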

  114. Reasons Why Ubuntu Fails at Not Usable says: (permalink)
    November 6th, 2006 at 2:27 pm

    [...] Installation and removal of software: After a couple of months of using Ubuntu I am still not entirely sure how exactly software gets installed. Using Synapsis is by far the easiest way I have ever installed anything but what if what I want isn’t available through Synapsis? Same goes for removing something. Just way too much hassle. Example: Blog entry about removing Amarok. Mark Shuttleworth is apperently also not blind to some of the faults as his post Consistent Packaging suggests. [...]

  115. Mark “Ubuntu” Shuttleworths Welt auf F!XMBR says: (permalink)
    November 6th, 2006 at 3:17 pm

    [...] Shuttleworths Blog [...]

  116. Miere Teixeira says: (permalink)
    November 6th, 2006 at 5:53 pm

    This is a good question… but everyone knows the answer: we have to standardize the packaging system urgently!!! Too many people here have talked about unresolved dependencies and the Windows and Mac ways of installing an application… That’s the idea!!! We have to talk with all of the distro “bosses” and reach a consensus…

    Linux is a good tool for me; I know the Linux system and have used it for about 4 years… But some new users don’t want to know what a Debian package or an RPM package is… I really discard the option of compiling applications… Linux will confront Microsoft only when it becomes easier to use an application: when I could install an application and show it to a friend who uses Linux (and that Linux need not be the same one, right??). That is the wish of a user…

    A great mirror full of packages is not enough… The internet is a good way to install and share applications, I know… But it takes time to download them before installing them… It would be better if you had an “installer” on a pen drive, a (DV/C)D(R/RW…), a floppy (slow as a download from the internet…), (…). I want this… Many people here want this… And everyone else in the world wants this…

    Use GNOME, KDE, Xfce, EDE? It does not matter… People have different preferences for how to do/view/organize their things, documents (…). The important thing is that anyone on Linux could do the same thing as easily as on a Windows or Mac system…

    Am I a dreamer?!!?? I think almost everything you’ve said is right, Mark! Ubuntu has grown because of your “freedom thing” way!

  117. Oliver says: (permalink)
    November 6th, 2006 at 11:01 pm

    It might be a good idea to unify the package formats (which are an essential part of most Linux distros). But, regarding the comments, here are some things that you might want to consider before developing the next great package format :-)

    - though desktop users may want to have the latest bleeding-edge packages, companies (esp. software development companies) want a stable version (like Dapper). Commercial Linux software is often built for a specific combination of commercial third-party libraries, commercial in-house-developed libs, and the distro (most customers just say: “it has to run on RHEL4”, so it is built and tested only on RHEL4)
    - a distro is more than just lots of independent packages; even though the binary compatibility might be given, it is also important that the packages work together for things like start menu entries, not overwriting their files, interprocess communication…
    - take into account that many sysadmins not only use rpm/apt for package management, but also build their own packages (at least at our company they do :), so it’s not a simple solution to “make apt/rpm/yum just a frontend to the new unified package system”; you also need to provide ways to convert old packages and old knowledge to the new system.
    - if you use software packages like Apples .dmg, you have many versions of the same library on your system. In Ubuntu, a security hole in imlib means that I have to update _one_ package; under Windows and Apple, the libs are strewn all over the disk (in .dmgs or in program folders). Updating these many lib versions is difficult.
    - for the people favouring Apple’s easy software installation (“just one package for 10.4, and one for 10.3 – where’s the problem?”): Have a look at Opera’s Linux packages! There are a lot more Linux distros than MacOsX versions – so there are a lot more Opera packages for Linux; but indeed they offer packages for really many distros (you just select Debian -> Sarge -> Download and get a .deb for Sarge; or Ubuntu -> Edgy -> Download, and get a .deb for Edgy; same for .rpm and many other formats). Btw. there’s a repo at deb.opera.com, so it nicely integrates into Synaptic etc.
    - basically, if you don’t want to provide so many different packages, you have to reduce the number of different distros :-) so that you end up with “Linux 2007”, which is then in line with “Windows 2007” and “Mac OS X 2007”. And then you’d have your unified Linux, and no more choice… Ouch…
    - For satisfying bleeding-edge users, it might be nice to have easily installable source packages, so that you can use the latest source release and install it as easily as a .deb. Not sure, but doesn’t Gentoo do it like this? Also, if you want the latest version, just use “Ubuntu Unstable” (i.e. the development version), but don’t complain that it’s unstable. If you manage to build a stable (!), bug-free distro with 8-hours-from-CVS-to-desktop release speed, you can apply for the Nobel prize for Computer Science ;-)
    - before I forget it: InstallShield etc. suck big, compared to .deb and .rpm :-D
    - a question for all Linux fans here: what’s your goal when you promote Linux to friends/relatives/co-workers? What do you hope for with that? IMHO the great thing about Linux is the freedom to hack it, to combine different parts, and generally not being hindered by the rigid framework that Windows and Mac OS X have. So, I guess even if Linux succeeded in becoming a system with > 50% desktop market share, it would by then have lost its free structure and be Just Another OS.

    So, about the packaging: I think it would be nice to have some unification, esp. if it could mean that upstream devs would do the packaging (for _one_ format). But the current system _does_ have its advantages, and it would be important to preserve these advantages in a new system.

  118. Jon says: (permalink)
    November 7th, 2006 at 12:42 am

    I had a Citroen GS car many years ago. Had all kinds of groundbreaking technology including hydropneumatic suspension and great aerodynamics. It was soon evident that the designers had done a great job on all the difficult stuff, but the more mundane things had not troubled the attention of the designers. As a result, some routine maintenance jobs were near impossible. “A great car for the skilled mechanic” would have been an accurate but probably rather unsuccessful advertising slogan.
    Cars have moved on from the days when every driver had to be skilled at repairs because they broke down on most trips. Now some cars don’t even let their owners go near the motor, but that doesn’t matter because they don’t need to.
    The majority of computer users don’t expect to need to go ‘under the hood’. Windows ‘simplicity’ may be illusory, but the illusion is sustained well enough. Most users expect it to break occasionally, but will then get someone else to fix it.
    What this suggests to me is that the first Linux distro to crack the packaging issue will be the one that gets a huge user base. It probably won’t appeal to the mechanics, but for the masses who just want to use a computer to get work done, getting from A to B without dirty hands is the key thing.

  119. Trev says: (permalink)
    November 7th, 2006 at 1:15 am

    Not relevant but!

    I would like to congratulate all the people making such a concerted effort to promote Ubuntu.

    Unfortunately the Linux industry seems to have an inability to answer questions in plain language.

    A young man of 20 goes to his lecturer and says “Can you tell me how to get to the library”

    The lecturer answers “Verlaten draai. Ga onderaan de zaal en neem de eerste deur op het recht. Ga onderaan drie vluchten van stairs en het is de derde deur op het recht.”

    The young man answers “Why are you speaking Dutch to me?”

    The lecturer responds “It is the language we speak now. We have millions of people working on it, to make it the next language. All the world will be speaking it soon”

    The young man says “How can I know what you are saying if you don’t tell me in English first?”

    The lecturer says “But if I don’t tell you in Dutch how will you learn it?”

    But, says the young man. “I am not an idiot with languages; I have been speaking one for the last 20yrs. I only need you to tell me what you mean in mine, so that I can understand the new one”

    To which the lecturer says “Turn Left. Go down the hall and take the first door on the right. Go down three flights of stairs and it is the third door on the right”

    This, in essence, is what is holding Linux back: the inability to explain in simple language what things mean. The program is there; competition between programmers is already there. What people need is English teachers. Let me give you an example.

    To set up dual monitors on your Windows machine.

    On the Desktop (the first screen that opens when you start your computer) you will see an icon called “My Computer”
    Right click on the “My Computer” icon and click on Properties.
    On the window that now opens you will see (along the top) tabs. Click on the one that is marked Settings. Etc; etc; etc.

    Now that is English. Why why why is that so hard? Let’s have some visuals, screenshots or even Flash.

    You will get more people to make the transition with good support than with good programs. Strange comment, but true. Please promote the people who can teach, and not system engineers who think everyone knows what a .tar is. The program does not have to be simple (Windows proved that) but the understanding has to be.

    My rant over, as is my shot at Linux (for now) for the frustration of the week has got to me.

    Trev

  120. [Not Usable] Reasons why Ubuntu Fails « Rising Sun says: (permalink)
    November 8th, 2006 at 12:07 pm

    [...] Installation and removal of software: After a couple of months of using Ubuntu I am still not entirely sure how exactly software gets installed. Using Synaptic is by far the easiest way I have ever installed anything but what if what I want isn’t available through Synaptic? Same goes for removing something. Just way too much hassle. Example: Blog entry about removing Amarok. Mark Shuttleworth is apperently also not blind to some of the faults as his post Consistent Packaging suggests. [...]

  121. Ahmed (ihavenoname) says: (permalink)
    November 14th, 2006 at 10:05 pm

    Excellent point, Mark. Here’s the real question though: what are we going to do about it? Is Ubuntu going to help with autopackage and move in that direction? How do you get hundreds of distros to switch over to one packaging standard, especially after so much work has gone into getting each distro fully integrated with its package manager? Take Fedora and yum (pup, pirut, anaconda now using yum). Even Ubuntu and gdebi and synaptic. I think Smart is one way to go: if all distros adopt Smart then at least we are all resolving dependencies the same way, and we are one step closer to the desired goal. At the same time we can also try to get distros to name packages the same way. From there it will just be a matter of merging packaging standards for both rpms and debs. Of course there are flaws to this method, but at this point it is only a suggestion. The important thing is that we keep pushing towards this goal. I do believe this is one of the main reasons many companies stray from releasing Linux versions of their software or drivers or what have you. While this may not be the only reason, it is a pretty important factor. Let’s hope this ‘issue’ gets resolved. Mark, I wish you the best of luck in this venture. I’ll help if I can.
    Good day
    AhmedG. (aka Ihavenoname)

  122. Zolookas says: (permalink)
    November 15th, 2006 at 2:38 pm

    I’m sorry to say it, but it’ll never happen. If the Ubuntu team makes some changes, other distros will ignore them (distros almost always ignore each other). If you took a part from every distro you would be able to make the ultimate one. Another thing is dependencies: if a program requires library X, and distro Y has it but distro Z doesn’t, distro Y will probably try to make a package without that dependency, and everything starts from the beginning…

  123. alexxx says: (permalink)
    November 16th, 2006 at 8:48 pm

    Hi Mark,
    I’m a happy Ubuntu user.
    I had an idea: keep the package standard (deb) but make it distributed and updated via RSS.
    To make it an actual system we need to:
    - define a standard package foundation; I propose every package depend on “ubuntu-desktop.deb” et al.
    - certify software through the community, to prevent security trouble by avoiding spyware and other malware
    - distribute this software with all its exotic libraries, like a metapackage (libraries are usually very light)
    - restrict what dpkg can do; it should execute some standard tasks, like creating new files and updating others, and not modify iptables if that is not what it was created for.
    This will work if developers and their communities want to provide packages directly. This should improve the commercial appeal of Ubuntu, because in this way we can offer a standard system for commercial software too.
    These are just ideas; excuse me if they are not what you are searching for.
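
The RSS-update part of this idea is easy to prototype. Here is a minimal sketch in Python, with an invented feed layout (package name and version in the item title, the .deb URL in the link); the feed URL, package names, and versions are all hypothetical:

```python
import xml.etree.ElementTree as ET

# A hypothetical release feed for one package (layout invented for this sketch).
FEED = """<rss version="2.0"><channel>
  <item><title>someapp 1.2.0</title>
        <link>http://example.org/pool/someapp_1.2.0_i386.deb</link></item>
  <item><title>someapp 1.1.0</title>
        <link>http://example.org/pool/someapp_1.1.0_i386.deb</link></item>
</channel></rss>"""

def latest_release(feed_xml):
    """Return (version, url) of the newest release advertised in the feed."""
    releases = []
    for item in ET.fromstring(feed_xml).iter("item"):
        _name, version = item.findtext("title").rsplit(" ", 1)
        key = tuple(int(part) for part in version.split("."))
        releases.append((key, version, item.findtext("link")))
    _key, version, url = max(releases)
    return version, url

version, url = latest_release(FEED)
# An updater would compare `version` against the installed version and,
# if newer, fetch `url` and hand the .deb to dpkg for installation.
```

This is only the discovery half; the community-certification step alexxx proposes would additionally need the feed entries (or the debs themselves) to be signed.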

  124. fish says: (permalink)
    November 17th, 2006 at 2:44 pm

    I wonder why we are even discussing this…

    In 5 years I’ll try linux again, and it will be like now, 5 years ago, 10 years ago…

    Still no consistent packaging and a huge waste of time in the linux community!

    Nothing will ever change in this area, distros are way too narrow minded for that, including Ubuntu.

    Also, I wonder what the point of this blog is when Mark never replies to any of the comments anyway.

    Maybe one day you guys will realize how big a chance for the Linux desktop you are wasting here…

  125. Mike’s Journal » Blog Archive » the ubuntu devconf says: (permalink)
    November 20th, 2006 at 5:35 am

    [...] This sort of conversation makes me very cynical about the relevance of the OSDL and LSB. It is easy to pick on Ubuntu here but in practice I know this kind of thinking is endemic to the distributions … it would have been exactly the same at a Fedora conference. It was a shame that Mark Shuttleworth didn’t show up – his blog entry is the only reason I went along, as I’ve had similar conversations via email before so knew what to expect. Unfortunately it sounded like he had a lot of pushback on those plans and they are now being watered down to simply being “it’d be nice if source compiled the same on every distro”, which is basically where we are today, sort of, except because there are no ISVs there’s no real incentive to keep it that way and sometimes it breaks. [...]

  126. FelixR says: (permalink)
    November 20th, 2006 at 10:05 pm

    It would be great if the LSB put out a new release every 6 months, and defined the FHS more concretely, e.g. where KDE, GNOME etc. have to be installed. With that combination, distributors would be able to release a new distro every 6 months under a new LSB version. Then you would only need LSB-compatible packages for the different LSB versions, which would run on all distributions compatible with those LSB versions:

    app-xy-0.95-lsb-2.0.deb
    app-xy-0.95-lsb-2.1.deb
    app-xy-0.95-lsb-3.0.deb

    That would decrease the number of packages needed for the different distributions. If they used only the deb or rpm format, there would be only 3 different packages over a period of 1.5 years, which would run on all LSB-compatible Linux distributions.

  127. flabdablet says: (permalink)
    December 5th, 2006 at 1:09 pm

    Thanks to Mark for opening up this discussion, and thanks to Steve for the link to http://zero-install.sourceforge.net/ – I will be installing this on my Ubuntu Edgy system very soon and doing my best to contribute to the project’s success, because it looks to me as if the concepts behind the zero-install project are the Right Way Forward.

    It’s even easier to use than the Windows app installation process, because you don’t need separate download-installer, run-installer and run-application steps; having found a zero-install app on the web, you just run it.

    Everything else – downloading the code, downloading dependencies, applying updates – happens automagically. You don’t need admin rights just to install an app, you don’t need to mess with a repository list, and distro developers don’t need to maintain repositories. Software developers are in charge of specifying their own dependencies, and any number of these can coexist, even if they provide similar services; if multiple apps share dependencies, what they have in common only gets downloaded once.

    As a site admin, all I’d need to do to set up my workstations would be to provide a base OS with a bunch of links to zero-install apps, and a web proxy on my LAN. What’s not to like?

    I use Ubuntu because it’s nicely integrated, and gets more so with each release; every six months, more things Just Work. It strikes me that Ubuntu getting behind the zero-install project and making it the standard way for Ubuntu users to acquire apps may well be a good way to make that experience scale.

    As far as I can see there’s nothing stopping Synaptic and zero-install from coexisting in the same distro. As releases go by, though, I can also see no fundamental reason why zero-install couldn’t progressively take over from apt. Comparison here:

    http://zero-install.sourceforge.net/compare.html

    The Ubuntu way to persuade other distros to get on board seems to me to be by doing something compelling enough that what the other guys are doing becomes obviously deficient by comparison. At first glance, zero-install strikes me as exactly that kind of advance.

  128. Ankur Gupta says: (permalink)
    April 18th, 2007 at 4:49 am

    I am pursuing a B.Tech at Amity, but I am not happy here. I am looking for an online computer-language test on C and C++, for the money factor only; whether it is national or international level doesn’t matter to me, so I am waiting for your reply…

  129. Mighty Linuxz » Shuttleworth: Linux distros need consistent packaging says: (permalink)
    November 2nd, 2007 at 12:09 pm

    [...] read more | digg story [...]

  130. Ubuntu + Gentoo = Nirvana? « Inkless Paper says: (permalink)
    December 14th, 2007 at 3:11 am

    [...] I was skimming through some of the older posts on Mark Shuttleworth’s blog when I came across this one that struck a chord. While I do like the whole Debian apt packaging system, for the most part, [...]