I had the opportunity to present at the Linux Symposium on Friday, and talked further about my hope that we can improve the coordination and cadence of the entire free software stack. I tried to present both the obvious benefits and the controversies the idea has thrown up.

Afterwards, a number of people came up to talk about it further, with generally positive feedback.

Christopher Curtis, for example, emailed to say that the idea of economic clustering in the motor car industry goes far further than the location of car dealerships. He writes:

Firstly, every car maker releases their new models at about the same time. Each car maker has similar products – economy, sedan, light truck. They copy each other prolifically. Eventually, they all adopt a certain baseline – seatbelts, bumpers, airbags, anti-lock brakes. Yet they compete fiercely (OnStar from GM; Microsoft Sync from Ford) and people remain brand loyal. This isn’t going to change in the Linux world. Even better, relations like Debian->Ubuntu match car maker relations like Toyota->Lexus.

I agree with him wholeheartedly. Linux distributions and car manufacturers are very similar: we’re selling products that reach the same basic audience (there are niche specialists in real-time or embedded or regional markets) with a similar range (desktop, workstation, server, mobile), and we use many of the same components just as the motor industry uses common suppliers. That commonality and coordination benefits the motor industry, and yet individual brands and products retain their identity.

Let’s do a small thought experiment. Can you name, for the last major enterprise release of your favourite distribution, the specific major versions of kernel, gcc, X, GNOME, KDE, OpenOffice.org or Mozilla that were shipped? And can you say whether those major versions were the same as, or different from, the enterprise releases of Ubuntu, SLES, Debian, or RHEL which shipped at roughly the same time? I’m willing to bet that any particular customer would say that they can’t remember either which versions were involved, or how those stacked up against the competition, and don’t care either. So looking backwards, differences in versions weren’t a customer-differentiating item. We can do the same thought experiment looking forwards. WHAT IF you knew that the next long-term supported releases of Ubuntu, Debian, Red Hat and Novell Linux would all have the same major versions of kernel, GCC, X, GNOME, KDE, OO.o and Mozilla? Would that make a major difference for you? I’m willing to bet not – that from a customer view, folks who prefer X will still prefer X. A person who prefers Red Hat will stick with Red Hat. But from a developer view, would that make it easier to collaborate? Dramatically so.

Another member of the audience came up to talk about the fashion industry. That’s also converged on a highly coordinated model – fabrics and technologies “release” first, then designers introduce their work simultaneously at fashion shows around the world. “Spring 2009” sees new collections from all the major houses, many re-using similar ideas or components. That hasn’t hurt their industry, rather it helps to build awareness amongst the potential audience.

The ultimate laboratory, nature, has also adopted release coordination. Anil Somayaji, who was in the audience for the keynote, subsequently emailed this:

Basically, trees of a given species will synchronize their seed releases in time and in amount, potentially to overwhelm predators and to coordinate with environmental conditions. In effect, synchronized seed releases are a strategy for competitors to work together to ensure that they all have the best chance of succeeding. In a similar fashion, if free software were to “release its seeds” in a synchronized fashion (with similar types of software or distributions having coordinated schedules, but software in different niches having different schedules), it might maximize the chances of all of their survival and prosperity.

There’s no doubt in my mind that the stronger the “pulse” we are able to create, by coordinating the freezes and releases of major pieces of the free software stack, the stronger our impact on the global software market will be, and the better for all companies – from MySQL to Alfresco, from Zimbra to OBM, from Red Hat to Ubuntu.

54 Responses to “Economic clustering and Free Software release coordination”

  1. Joey Wilson Says:

    I couldn’t agree with the idea of release synchronization more. It would be amazing if the open source and Linux community timed their releases. It would make major news and the entire world would wait with anticipation for the next big “release day.” Is there something like a “release alliance” similar to what Google has done with the Open Handset Alliance? Maybe it’s too hard to get everyone on board, but if some do, others will possibly join later.

  2. Tom Says:

    I totally agree.
    IMHO the only thing that might work against a more coordinated development/release timeframe is ego (developer or CEO etc., it doesn’t really matter.)
    The things you mentioned are economic or evolution based. The human ego does not always work by a set of rules like the free market or evolution ;( .. so eventually we will have something like this .. but there has to be a lot of persuasion/explaining going on for this to happen.
    Not in this decade is my estimate.


    – Tom –

    PS. You have to extend your copyright notice at the bottom of this page to 2008 ( Or I will steal all your content 😛 )

  3. Peng’s posts for Monday, 28 July « I’m Just an Avatar Says:

    […] Shuttleworth: Economic clustering and Free Software release coordination. Back in April our favorite spaceman suggested that open source devs should release most of their […]

  4. Vadim P. Says:

    Caught me on the “Can you name, …, the specific major versions …?” part definitely. Most I know is the Ubuntu number.

  5. man88soccer Says:

    I definitely agree that the distributions need to coordinate releases and ship the same versions of the software. I think the Linux community would also benefit a lot from merging many of the distributions serving the same niche. Also, distribution-specific patches should be kept to a minimum. Send the patches upstream. That way all the releases can benefit, not just one distribution, and software testing across different distributions can be more comparable. Today, bugs that manifest on an Ubuntu kernel may not even be relevant on a Fedora kernel. Why is that? Too much time is spent duplicating effort in fixing bugs that have already been fixed in another distribution, because a) the patch wasn’t sent upstream and b) the patch cannot be applied to this specific distribution because of too much divergence from the upstream project.
    If all this is achieved, I think we might be able to see many ISVs developing for Linux, and see Linux and its free software on many commercial devices.

  6. Srinath Madhavan Says:

    Interesting comparisons. Could we get a video of the talk somewhere?

  7. Alan Pope Says:

    At LUGRadio Live last weekend during the (speaking slowly and clearly) Mass Debate (phew), there was a discussion about this topic – syncing releases/freezes. One point that was raised by the Red Hat community guy was basically that the only entity that benefits from synced releases is us (the Ubuntu community). The argument goes that Red Hat is the clear market leader when it comes to Linux on the corporate server, and as such if they bowed to sync to your cycle (or close to it) this would mean you’d gain a competitive advantage over them. So why should they do it?

  8. Ben Collins Says:

    Nice piece Mark. A good thing to note is how this will help the developers upstream of all the distributions. If these projects are able to target every distro just by coordinating on timelines, then it makes it much more appealing for them to do this. It also makes it easier for their maintenance and bug triaging when they have fewer “variants” in flight at one time.

    Smaller projects will see the shift, and work to synchronize with it.

  9. Roger Lancefield Says:

    With reference to Alan Pope’s report above of the LUGRadio debate, wouldn’t the best projects to sync with be those which didn’t regard themselves as being in “competition” with anyone (allowing for “friendly rivalry”, of course)?

  10. Alan Pope Says:

    Are there any distros we aren’t in competition (for mind/market share) with?

  11. Roger Lancefield Says:

    Hi Alan,

    I guess it depends on one’s definition of “competition” ;-). I’m an avid Ubuntu fan, but I don’t feel that Ubuntu is somehow threatened or diminished by another distro’s success, and I sincerely hope that Ubuntu never acts purely to prevent another project from increasing its popularity. That would be ominous.

    Maybe this sounds naive and idealistic, but to me “competition” suggests things done purely to beat another project (or organisation, or product, or company). Ubuntu doesn’t need to worry about being better than others, it just needs to keep doing the stuff that has made it so popular. If it does that, the “market share” will take care of itself, IMO.

  12. Phil Housley Says:

    I assume the driving idea behind this is that competition between distros isn’t significantly a matter of release schedules. That would mean the only major outcome would be an increased sense of organisation throughout the Linux world, which could easily be projected to the outside.

    I don’t know what argument people use to say you would suffer from agreeing on a shared schedule, but I can’t imagine a good one either. Actually adopting someone else’s would raise questions, but that’s really quite distinct from reaching a consensus.

    Alan, you mention that some people think there would be a disadvantage to a market leader, do you know why that is? It seems like there would be offsets to most potential issues; for example, you might get users saying “it’s upgrade time, maybe I’ll change distros as well, while there are new releases around,” but against that, whenever the competition releases, you have something new to fight against it with.

    As if I haven’t made it clear anyway, my view is that synchronisation can only help. New MS releases are staged as a big event, which can be used to build awareness in a big way. Distro releases just trickle out one at a time. If there was a major release from everyone once or twice a year, that could be used to let everyone know what progress was being made, and what they can now do that they couldn’t last time. That’s what people need to decide to switch, an event where something changes.

    Combine such events with improved interoperability, and you get regular opportunities for non Linux users to move to Linux, and know that they aren’t locked in indefinitely, or to any company in particular. The effect of that will be to increase the Linux market in general, and there are still lots of ways for each distro to try and be the one who benefits most from that.

  13. Christoffer Brodd-Reijer Says:

    Hard to see the real pros here; it’s much easier to see the cons, since it will be such a big project and will require some hard work. But the benefits? They don’t shine enough to make it worth it, IMO. Maybe there are long-term benefits, but even that’s no guarantee.

    It sounds risky and more like some sort of experiment. The lack of obvious, short-term benefits for all involved is the biggest drawback, right now.

  14. Tom Says:

    I don’t buy this competition argument. If open source companies start to duplicate efforts just to have a competitive edge over other open source companies, then I am worried.
    If, by working together, each distro gets better and evolves faster, and the Linux ecosystem as a whole becomes more competitive with other offerings, it will be good for every distro .. even RHEL.

    IMNSHO it should be about making the whole pie bigger, not just about your own piece.

    And I think Red Hat at its core is an open source company, and they will be OK with this, because in the end it will provide a better RHEL, and that is what they should really care about. Even if the benefits might not be evenly distributed among distros.
    I also think the market leader would benefit the most from a bigger pie 🙂

    Thinking in terms of “competitive advantages” is so 1990s, and all those people with MBAs are wrong. FLOSS at its best works in fundamentally different ways.

  15. Ian S Says:

    Synced freeze/release cycles should have either no impact or a positive impact on the competition between projects. The things that make one project different from the rest will stay the same, regardless of release cycles.
    The overall competition between projects would remain intact, which is a good thing.

  16. Mike Says:

    Hi Mark,

    I have a website for you. This guy rants in a very nasty way (I find it entertaining, though) BUT he has very good points (it’s worth reading all of his articles, seriously), and the Linux community, especially its leaders, shouldn’t ignore those valid points.

    There you go: http://linuxhaters.blogspot.com

    Mike (who still likes Linux, but admits it has some serious flaws)

  17. A. Peon Says:

    I appreciate the concern, and still appreciate the dream, but if I may be blunt, it can’t happen (potentially for equally “ecological” reasons). The stability of Linux itself does not come in any actual rhythmical cycle, as the lifecycles of each component vary on individual developers’ whims, and whenever it is getting “too stable” Linus will allow in some new API refactoring or ain’t-it-cool feature (schedulers?) to shake it up.

    Meanwhile, whenever anyone puts his or her hand up and shouts “Stop! We need to stabilize this!,” competitors will see an advantage and dive in. It happened with RedHat (gcc and glibc versions), it happened with Ubuntu (Debian taking too long, enough major components releasing regularly that its niche was born). Eventually, hopefully, all this thrashing morass of activity moves the entire FOSS codebase forward. (That seems to be the goal of Launchpad; it certainly doesn’t have any “synchronization” features visible to the end-users, let alone changelogs or easy cross-references to known-good versions or anything like that. Everyone splashes around, and hopefully, *in the next version,* bugs get fixed.)

    Ubuntu’s goal of regular releases is a laudable one — *someone* has to build the “desktop stack” as it stands and see what’s working. But distribution, especially when no one “cares enough” to go back and solve every single problem (trying to do so did Debian in, and a lot of Ubuntu developers care but have to work in finite time spans), is a lot more like photography than forestry. Tree DNA has had a billion years or so to “stabilize” into the impressive harmony you observe, and nature only rarely attempts new versions, most of which die. Packaging software, on the other hand, especially software that’s under constant development over which you can’t exert control, requires careful “snapshotting,” and the quality of the initial snapshot determines what you can get out of the image in postprocessing.

    Since Ubuntu wants to release in regular cycles, and not get bogged down in the patching-the-patches minutiae that hobbled Debian, maybe it makes sense to take a cue from photographic workflows: Professionals don’t often take one exposure and pray; they shoot a roll and cherry-pick their best work. If we can agree that Hardy was a bit of a misfire (perhaps it is supportable, on the three pieces of hardware it may have properly supported at release), maybe it’s better to cherry-pick the LTS out of the last year’s releases, and *then* commit to supporting the one that worked right, had the fewest showstopping kernel bugs, etc. [Quantifying this could be an interesting experiment; if the basic goals are to release something that can, say, boot to Gnome, access network and ATA devices, and print a kid’s homework without malfunction, surely there’s some way to instrument for that? Launchpad needs a Spaceprobe.]

    That, in itself, would help people calm down and “synchronize;” rather than expecting the Next Big Thing to solve all their problems, they’d have to plan to be backwards- and forwards-compatible to play nicely with the eventual winner. Meanwhile, users actually expecting (or tasked with implementing) reliability would have familiarity from before the “LTS stamp” was applied, and the known outstanding issues needing patches to reach “five nines of satisfaction” could be attacked (and maybe just backported) by the release team.

    Sure, this would force FOSS development to slow down a little, and LTS releases wouldn’t be (gasp) “cutting-edge.” But they’d still come out on a regular schedule, and if those trees can take a million years (and lose a couple “mutants”) to get it right, why shouldn’t we wait an average of 6 months? Instead of looking forward while standing still, keep moving but drop off a “landing party” to go back and haul in a good build.

    [My unfortunate luck with Hardy colors this response, but that means I’m *really* motivated to help improve the process!]

  18. Peeter P. Mõtsküla Says:

    Competition changes over time. Like it or not, all Linux distros currently compete against Microsoft and Apple for their share of the desktop market. Linuxes are small, the market is huge, and the “enemies” are strong, so it pays to team up. But as soon as Linux becomes mainstream, there will be fierce competition between different distributions.

  19. Luc Says:

    If Ubuntu wants this to work, they can sync with Red Hat or SLED; I don’t see any reason why this couldn’t be done if they wanted it sincerely. It will obviously not happen the other way around, for competitive reasons.

    Mark Shuttleworth says: Luc, if Ubuntu felt the most important anchor in the ecosystem was SLED or RHEL, we could align with them very easily, as CentOS does. Instead, we felt it important to align with upstream. We chose GNOME, because of their predictable time-based releases. And since Fedora recently also aligned with GNOME, we are in the happy situation of being able to exchange patches around GNOME with Fedora much more easily. At the six-month-release level, Ubuntu and Fedora are well aligned. The question is whether we can achieve similar coordination at the level of our LTS releases, and Red Hat’s RHEL. You seem to suggest that Red Hat would not do this, for purely competitive reasons. I tend to think that getting alignment is good, not bad, for everyone. So the question then is “who should align with who”. In that regard, I have said I don’t really mind where we end up – if we have a conversation about this across upstreams and distributions we will probably figure out something sensible for everyone. That might involve shifting Ubuntu’s plans, and we’d be happy to do that if it would reflect consensus within the ecosystem. We’re not trying to get people to shift to accommodate us, we’re trying to catalyze a conversation across the whole ecosystem, and let things settle where they do.

  20. Sander Says:

    “Would that make a major difference for you? I’m willing to bet not – that from a customer view, folks who prefer X will still prefer X. A person who prefers Red Hat will stick with Red Hat.”

    Actually, I believe it makes a difference for the customer, especially for software that is mission-critical to them. If this software is synchronized, that creates a large amount of trust in it, which will make it easier for the customer to upgrade to new versions (deployment, training for the new version, testing, and so forth). This added trust makes the software more valuable to the customer, and hence they may choose it over alternatives that are not synchronized.

    In the long run, massive synchronization may also lead commercial software vendors, hardware manufacturers, and other industry players to join the synchronization cycle. Imagine these things released at the same moment:
    1) Linux distributions
    2) Linux kernel and new X.org release with a new hardware driver feature
    3) ATI graphics card that uses this new hardware driver feature and that is fully supported on the new kernel
    4) Asus laptop with this new ATI graphics card and a new Linux distribution
    5) new game release by Ubisoft Entertainment that uses the new feature in the ATI graphics card for better game graphics

    As you can see, synchronized releases have the potential to transform Linux into the #1 gaming platform! B-)

  21. Frederik Says:

    Competition as a model and dogma for human society, because “it’s efficient and part of nature”, is as questionable as promoting cannibalism (common among ants and crabs and justified by biologists). Modern models have shifted from Darwinism to collaborationism, which turns out to be far more common and efficient than was ever thought. The focus should remain on progress and excellence, and on collaboration as the best means to that end.
    Recommended reading: “The Evolution of Cooperation” (R. Axelrod), based on experimental computer gaming.

  22. Mike Says:

    @Tom: “If open source companies start to duplicate efforts just to have a competitive edge over other open source companies, then I am worried.” Where are you living? EXACTLY that is happening right now. Freedom of choice does not ultimately lead to better software. There are way too many forks and new creations around the Linux ecosystem, and far too little cooperation… every developer thinks he has a better idea or will solve it better. How many one-man shows exist out there? Cooperation has still not arrived (with very few exceptions). The Linux ecosystem is still far too developer-driven, meaning many projects don’t have the user in their focus, but rather a selfish master plan to create something better (from their perspective). That’s wrong and has to change, as does the attitude of “Then Linux is not for you” or “Fix it yourself” and so on (it’s better than 10 years ago, but those dinosaurs are still out there).

  23. Sameer Verma Says:

    The way I see it, there are three different “streams” that need to be addressed. 1) Inbound projects like GNOME and KDE conforming to a time-certain release schedule. GNOME is released once every six months, but KDE is not. 2) Distros produced by companies or communities being released in a time-certain fashion. Ubuntu is released once every six months (except for 6.06), but Debian is not. 3) Organizations (both community and corporate) that release distros agreeing to release them in a similar time frame (month or quarter).

    Item 1 will make decisions easier for item 2, which *may* lead to item 3. The only major drawback I see is that time-certain release schedules will put pressure on the quality of the product, and if the producers are not used to the concept of combining time and quality, the product will be sub-standard. Thus far, I haven’t seen any major glitches in the quality of Ubuntu, in spite of the time pressure. The only slip-up I’ve noticed was with 6.06, which was supposed to be 6.04 but was pushed back by two months to improve its LTS characteristics (AFAIK). In a sense, quality is (and should be) the first constraint, but now we are adding a second constraint: time.

    As far as competition is concerned, remember that the competitive space contains other players, including non-FOSS ones. Competing with those entities will require some amount of certainty. Additionally, in that competitive space, it’s not just the distro that matters; it’s also the marketing, sales, and service that make the difference. For those who simply use the distro without support (like me), we couldn’t care less. But for those customers who do require 24/7 support service, having that certainty is critical.

    PS: I don’t have an MBA 🙂

  24. Sander Says:

    @A. Peon: The idea you suggested in your comment to pick the LTS release at the end of the year is definitely bad. Instead of creating certainty and trust for end users, businesses, and coders, as synchronized releases do, picking an LTS release at the end of the year creates uncertainty and this is not good of course. Certainty and trust add value for the end user; it makes the software more interesting for them. Uncertainty does the opposite.

    Besides that, there seems to be a misunderstanding about the concept “LTS”. An LTS release is not meant to be more or less stable/secure/sexy/whatever than any other regular Ubuntu release. It only differs in the fact that you get support for much longer. There can be more bugs and flaws in an LTS release and that is no problem! The advantage of an LTS release over regular releases is that you can get support for several years instead of several months (I don’t exactly know how long); you don’t get more quality with an LTS release. Summary: LTS releases suck as hard as any other Ubuntu release!

    @Mark Shuttleworth: Maybe you should make clear in a blog posting that an LTS release is of the same statistical quality as any other Ubuntu release? Only the length of support differs. It seems many people have problems with this concept.

  25. MaryBeth Says:

    While it is difficult to hit a release date on-time and with few bugs, I think it’s an important goal. Since Gnash is already on a 3-month release cycle (or goal, as I might put it more realistically), we are making our best attempts to sync with Ubuntu’s freeze schedule. Although the inherent freedom of a FLOSS project can hamper delivery of an on-time release, it is my belief that, if more upstreams put in the effort, we will all benefit from frequently tested, up-to-date, stable distros.

  26. A. Peon Says:

    @Sander: Your argument makes sense from a business perspective, if all you care about is selling support.

    “There can be more bugs and flaws in an LTS release and that is no problem!” — Yes, in fact, introduce as many bugs and flaws as you can, then you’ll sell that many more support contracts for the longer term!

    I don’t think that’s really Canonical or Ubuntu’s goal, though. Bugs and flaws drive customers away, especially if other distributors’ “snapshots” (taken at different points in the flurry of FOSS development cycles) don’t exhibit the problems. That hurts the case for cooperation that Mark’s trying so hard to make here. Then, “supporting” flaws takes talent and resources that could better be occupied elsewhere, and there’s the catch-22: if it didn’t work in the original release, how much “support” are you entitled to for your issue, whether now or three years from now?

    I’d pay someone for .debs I could drop in to solve the ATAPI compatibility bugs and floppy automounting quirks in Hardy. But I don’t think anyone’s making that proposition, and it’s going to be fixed in Intrepid anyway, right? [The ATAPI issues, where some *-ROM drives were rendered suddenly incompatible, seem poorly understood but also seem to be fixed in 2.6.25. The floppy automounting, where the Gnome stack seems to have begun looking for MS Logical Disk Manager partitions and nothing else, is probably related to GVFS churn, so patching it might require backporting Gnome components that you’d hope shouldn’t be touched in a LTS release.]

    Satisfaction and morale among both the supporters and supported is improved the less supporters have to tell supportees “That’s broken.”

    Now, in defense of my proposal:
    “Instead of creating certainty and trust for end users, businesses, and coders, as synchronized releases do, picking an LTS release at the end of the year creates uncertainty and this is not good of course.”
    Putting up with surprises in Hardy hasn’t created certainty or trust for me. If you study usability, and indeed just users’ general reactions, familiarity can be an extremely important factor in implementing systems that can be successfully used. Not making gambles on uncertain promises — that an unreleased version is going to be particularly “supportable” by project staff or anyone else — would get rid of a lot of uncertainty. This would save a lot of stress and heartache, with the side effect of forcing people to evaluate each new version properly because it might become the LTS — catching more issues and achieving more polish before the blessing comes down from on high.

    There may be no guarantees, but people are going to take the closest thing they can get to a guarantee — the LTS label — if there are no guarantees! So why not save the supporters the burden of committing to support something unknown? It’s not about slowing down the release cycle, it’s just about putting LTS on “tape-delay” so hindsight could have an effect. (If both of the past year’s releases were troubled, they’d still have to pick one and support it as least-worst. But executive decisions, like which kernel version to ‘LTS’ with, would be far less painful to make.)

  27. A. Peon Says:

    A separate thought: Pardon the pun (heck, who am I kidding?), but are we missing the trees for the forest here? Ubuntu is one of the few distros jumping off a cliff every six months (and N years), per Gnome, and it’d be nice to have company. Meanwhile, cooperation is nice, and in the words of some great thinkers, “we all do it when we can.” But this is FOSS-land, where great ideas, like the miracle gaming feature Sander mentions, have to win on their merits — and where, we hope, winning on the merits keeps us relatively honest and our software relatively stable.

    So… Mark, everyone here can put up an argument as to the pros and cons of synchronization, but how do you propose it really should work? If [RedHat|Novell|The FreeBSD Foundation…] were to agree to cooperate more fully, and pay more attention to sticking with identical versions so patches could flow more freely, how do those versions get picked?

    For Linux itself, Linus still has his say — and I certainly expect he’ll continue to accept curveballs that throw off even the best-intentioned release cycles! But for everything else, how do you propose consensus gets reached at the particular point in time? Does the calendar itself rule, and everyone just takes the last-declared-release of a package without regard to its developers’ idea of what “release” means, driving the responsibility upstream? Is there democracy, or veto power if a necessary component is particularly broken?

    If you propose a process, people will accept, reject, or critique the process on its merits, rather than reacting to the concept of synchronization in general. And if the rabble does shoot it down, you’ll have learned something for Process 2.0. 🙂

  28. Sander Says:

    A. Peon: People don’t want to wait until the end of the year to learn which release is the LTS. This is a big problem, especially for businesses planning a big deployment. Also, the chances are very high that the first release of the year will be the most stable by the end of it, simply because it has already seen more patches over the year.

    Anyway, Ubuntu should not change its release policy. It only needs to bring down the variation in quality. Statistically, the quality of Ubuntu releases follows a normal distribution, as can be seen at http://en.wikipedia.org/wiki/Normal_distribution . What Ubuntu needs to do is bring down the variation in quality, so that the quality of every release is as similar as possible. E.g., they should move from the red curve in the first figure on that Wikipedia page to the blue one. That is what trust is about.

    Improving quality is what any industry does. Actually, the idea you propose is like a car company launching 2 different cars each year and then, at the end of the year, looking at which car caused the fewest accidents to decide which one to keep on the market for the next few years. Luckily, no (car) company works like this 😉
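    To make the point about variance concrete, here is a small Python sketch (purely illustrative numbers, not real release data) of two release processes with the same average quality but different spread:

```python
import random
import statistics

random.seed(0)  # reproducible illustration

def release_quality(mean=7.0, sigma=2.0, n=1000):
    """Simulate quality scores for n hypothetical releases,
    drawn from a normal distribution with the given spread."""
    return [random.gauss(mean, sigma) for _ in range(n)]

high_var = release_quality(sigma=2.0)  # today: same average, wide spread
low_var = release_quality(sigma=0.5)   # goal: same average, narrow spread

print("high variance:", round(statistics.mean(high_var), 1),
      "+/-", round(statistics.pstdev(high_var), 2))
print("low variance: ", round(statistics.mean(low_var), 1),
      "+/-", round(statistics.pstdev(low_var), 2))
```

    Both processes score about the same on average; only the low-variance one is something a customer can actually plan around.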

  29. Craig Says:

    I do care that I’m using a version of Open Office that will open the files that others send me. And I wouldn’t mind having a newer kernel to fix issues… although having a newer video driver means I’ve lost features. So do I care what version I’m using — no, assuming everything works 🙂

    I like the idea of synchronization, insofar as it reduces effort all around and hopefully improves quality. I appreciate that you (Mark S.) are taking a leadership role, as expected, but if everyone isn’t on board I’m not sure it’s a problem. Get as many people on board as possible, and those who choose not to be, well – they will see whether they made the right choice.

    I think a significant issue with cadence, as mentioned already, is the tendency to re-invent the wheel. Without a detailed analysis it’s difficult to know when that is generating new ideas and healthy competition and when it is just wasting resources. [Personally, I think of the multiple – yet horribly incomplete or incompatible – hardware reporting databases out there as something everyone has a significant interest in getting together to fix, yet there seems to be no truly active project, and I’m not quite sure why.]

  30. alex_mayorga Says:

    Mr. Shuttleworth, why not tackle the “packaging coordination” first and work on the calendars later?
    My 2¢

  31. A. Peon Says:

    I still hate to choke the thread with this discussion, and I wish I’d thought of this back when the LTS post was current… oh well.

    @Sander: A couple rebuttals:

    You say that “people don’t want to wait” and that businesses should plan deployments around LTS releases, yet you also say LTSes shouldn’t be anything special, so presumably it would be foolish for a business to plan around LTS releases. (Hand up, I was a fool!) Note that, under my proposal, LTSes would still be announced at a fixed time, so I fail to see how this disrupts business planning.

    I work for a small business, and we got bit; now we’ll probably move to Intrepid (after evaluating it much more carefully) because there’s a good chance much more of it will just work. Hardy seemed to go through a half-dozen kernels in the weeks before release and there just wasn’t time to evaluate and file reports for each and every one of them, not to mention that there wasn’t time for everyone to triage valid reports given the flood coming in.

    “Also, the chances are very high the first release that year will be the most stable at the end of the year. Simply because it already saw more patches over the year.” What’s wrong with that, if the goal is quality? But if that first release was a lemon for some reason (which, of course, is coupled to the timing issues Mark complains about — Firefox wasn’t out of beta, 2.6.25 was still in rc, but with the LKML’s attention on 2.6.25, 2.6.24 obviously missed a lot of love from everyone not on Ubuntu’s cycle…) it might be that the newer release has already absorbed more “patches” from upstream, and skipping the one that requires a lot of backporting to fix will make everyone happier on the whole.

    Quality doesn’t come out of a vacuum. Redrawing a graph is great, but the question is *how* to redraw the graph — you can make it policy to “bring down the variation in quality,” but actually doing that is another story. I fully understand that Ubuntu’s goal is to release regularly and often, and not to get crippled by conservatism. But since most of the advantage of conservatism is in the hindsight and familiarity, doing things this way would give you the benefit of a “stable” fork without the cost of actually maintaining one. (Do you realize how much effort is going to have to be put into Hardy over the next N years, just because it was the LTS release? What if those developers could be tackling problems that weren’t already fixed upstream?)

    The man-hours that went into Hardy were entirely necessary (and much appreciated) to keep Ubuntu current and ensure the greatness of future releases, but does it make sense to commit to rubbing those same developers’ noses in it ahead of time? Especially when they were working with what they had from upstream? When it turned out that, as became obvious when it really got down to the wire, upstream was about to abandon major issues in some of it to focus on upstream’s N+1s? And if upstream’s N+1s, combined in the next release, turn out to have clearly fewer issues overall?

    The auto industry does not drop its newest, most experimental powertrains into its sedate family sedans every year. It looks at what’s been working over recent history, then puts in what seems most likely to be “supportable” for their more risk-averse customers. (I’ve heard that Oldsmobile, for much of its later existence, was basically the “less-supported” proving ground for what would become Cadillac’s engines and transmissions.) Sometimes newer products have such significant advantages that they wind up in a Buick or Camry; other times, more testing is required.

    Because of the schedule and Ubuntu’s specific goals, I’d like to think of the regular releases as “concept cars” (here put out for show every 6 months — and through the miracle of FOSS, anyone who doesn’t want to wait can pick them up for daily use) and the LTS as the models that get blessed for “production” (for the risk-averse and corporate customers). This seems like a healthy perspective; thinking of them as separate “models” can lead to forked trees and duplication of effort, but if you promise perfection (or disclaim perfection, but then tell people it’s “LTS” anyway) ahead of time, every once in a while you’re going to end up with an Edsel.

    Also re: quality, not telling *developers* that they’re working on an LTS means that they have to approach every release as an LTS. How’s that for smoothing out the graph?

    I just hope the next LTS comes out with a kernel that’s had more than (what seemed like) 12 hours in beta. 😉 That way I, for my risk-averse applications, would have a clear idea of what issues would exist and deserve workarounds. [For the silly issue mentioned, the ATAPI compatibility thing… I *could* invest hundreds replacing every ODD in the office, but it’s still not clear if there’s a list of which models *work.* That’s the kind of information I’d expect to get if asking for “support.” Is that information in the top-secret Canonical-only section of Launchpad? I don’t know, but I doubt it, since I assume Canonical is working from the same playbook as everyone else; should I pay to find out? Seriously, this is the kind of information support personnel need to have, and they need to observe a release “in the wild” to acquire it.]

  32. 451 CAOS Theory » 451 CAOS Links - 2008.07.30 Says:

    […] Economic clustering and Free Software release coordination, Here Be Dragon, Mark Shuttleworth (Blog) […]

  33. Sander Says:

    “not telling *developers* that they’re working on an LTS means that they have to approach every release as a LTS.”

    That’s what they already (should) do. If they don’t, it’s really time for a blog post from Mark Shuttleworth to clarify the misunderstanding…

    “I just hope the next LTS comes out with a kernel that’s had more than (what seemed like) 12 hours in beta. 😉 That way I, for my risk-averse applications, would have a clear idea of what issues would exist and deserve workarounds.”

    You can already do this: schedule your deployment of the next LTS like this: (LTS release date) + (12 hours) = (your deployment date). If severe bugs were found in the first hours after the release, they may already have been fixed. Of course, most companies will count not in hours but in months. So, for instance, the next LTS lands in April 2010 and the company deploys it in June. The more risk-averse you are, the more you delay your deployment of the LTS release. Note that Dell already did it this way: 8.04 was released in April and the actual deployment came in July ( http://arstechnica.com/journals/linux.ars/2008/07/19/dell-begins-rolling-out-ubuntu-8-04-adds-media-codecs )
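    [Editor’s note: Sander’s rule of thumb is trivial to write down. A toy sketch in Python – the exact Hardy release date and the three-month offset are taken from the Dell example above and used purely for illustration:]

```python
from datetime import date, timedelta

def deployment_date(lts_release: date, caution_months: int) -> date:
    """The more risk-averse you are, the more months you wait after the LTS."""
    return lts_release + timedelta(days=30 * caution_months)

hardy = date(2008, 4, 24)  # Ubuntu 8.04 LTS shipped in April 2008
print(deployment_date(hardy, 3))  # roughly when Dell began its rollout: 2008-07-23
```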

  34. A. Peon Says:

    Huh, didn’t realize you could do indented quotes here, what’s the trick for that?

    >> “not telling *developers* that they’re working on an LTS means that they have to approach every release as a LTS.”

    > That’s what they already (should) do. If they don’t; it’s really time for a blog post from Mark Shuttleworth to clarify this misunderstanding…

    Fair enough, but out in reality, there were some discussions about what would/wouldn’t go into 8.04 as LTS because it was going to have to be supported for so many years. That’s also why Firefox 3 and so on went in there in the first place — it was going to be the LTS and the LTS shouldn’t be stuck with packages that were known close-to-EOL ahead of time. [FF3 hasn’t been much trouble, what if another package had been? What if the “final,” “stable” version of a beta package requires new dependencies?]

    That could become a real problem someday, seriously. Ubuntu (Original Recipe) is a Gnome distro and, conceptually, Gnome asks people to take advantage of it as an environment and link against it. If people actually sign onto that, it’s going to get harder and harder to backport fixes without backporting huge chunks of Gnome. [Or, of course, you end up “supporting” an issue by telling people it was fixed upstream and you can’t run upstream without leaving LTS.]

    >> “I just hope the next LTS comes out with a kernel that’s had more than (what seemed like) 12 hours in beta. 😉 That way I, for my risk-averse applications, would have a clear idea of what issues would exist and deserve workarounds.”

    > You can already do this: schedule your deployment of next LTS like this: (LTS release date)+(12 hours)=(you deployment date).
    > Note that Dell already did it like this: 8.04 released in April and actual deployment July ( http://arstechnica.com/journals/linux.ars/2008/07/19/dell-begins-rolling-out-ubuntu-8-04-adds-media-codecs )

    Lessons learned here, certainly (I’m supposed to know better!). But, y’know, announcing that something is available creates demand, and users who want to “trust” the project may assume it’s had enough QA to… y’know, work. A truly huge OEM like Dell is going to take the time to evaluate anyway, but notice they still don’t offer 8.04 on many of their products, presumably because it still doesn’t work properly on them. Their customers are probably disappointed (“8.04 has been out for months!”), and they’re probably frustrated too. (Speaking of cooperation: I’m sure they’d love to help, but how much should they invest in patching 8.04 if 8.10 just doesn’t have the issues?)

    You can say ‘that’s how it is, get used to it,’ but if the LTS stamp got applied in February or August, the Dells of the world would probably have some very useful feedback. (“Hey guys, when you pick the LTS… just to let you know, 2038.04 still isn’t working on half our hardware, but 2038.10 has been pretty smooth. Looks like the Quantum-Entangled USB 64.0 support was broken in kernel 18.4.3 and caused deadlocks with all the major Septium 8 and Decathlon chipsets, and that patch depends on the new Uncertainty Principle API in 18.5.1. Which, if you remember, was 16TB of patches that rely on the new scheduler hooks linus’s_head_in_a_jar@kernel.org applied in June. Even with our seven parallel pocket universes, it’s going to take a few hundred years to backport all that.”) Which helps all the users who *don’t* have the resources of Dell, but will still need to upgrade sooner or later.

  35. Sander Says:

    No trick, it’s called HTML markup code 😉 <blockquote>something to quote</blockquote>

  36. Guanglin Du Says:

    I completely agree with what Mark has suggested here. We can imagine what this will bring to the whole IT industry, just as the regular release cycle has to the Ubuntu community.

  37. CyberCod Says:

    This is off topic, but I’d like to see Ubuntu come pre-installed on PlayStation 3s. Of course, this would only work well if Sony were to allow 3D acceleration from within Ubuntu. I think it would be mutually beneficial to both Sony and Canonical. ::think about this for at least a full minute before reaching for your flame throwers::

  38. ed whymandesign.com Says:

    Hi Mark

    I love the work that you are doing, and I hope it is possible to talk to someone regarding the inquiries below (based around sustainable funding for OS development and philanthropic projects)…

    If you know of anyone who is involved with open source business models and is interested in philanthropic business models such as http://www.Traidmark.org, I would love to meet them.

    I am also researching games and charity work that link up young people, and am trying to get in contact with organisations that either link up kids or educate them (especially ones that combine the linking power of the web with real-life activity). This is because an international broadcaster is making a game that will operate in this area and is looking for partners. Do any spring to mind?


  39. Sachin Says:

    Obviously the benefits of coordination outweigh its cons. Moreover, it will bring more discipline and better management to the free software world.

  40. A. Peon Says:

    @ CyberCod: This is pretty much everyone’s dream, but the big game console manufacturers all have the same problem: They sell the consoles at or near a loss (supposedly) and make revenue from SDKs and licensing deals. Putting a “real OS” on the hardware risks developers targeting their games to it, and so an impasse has been reached. IMHO the workaround, if we want to preserve their business models, would have to be a Habeas-type solution (the spam-filter-EZPass haiku people), where hitting the hardware requires use of a token that they can litigate over if it gets used for gaming purposes. But that gets hairy for obvious reasons: you probably couldn’t do a shared libmesa, etc. Plus, if they ship it, they actually have to support it.

    The Amiga Research Operating System actually landed on an STB a while back because the manufacturer said “why not?” Similarly, Linux has a lot of wins on hardware like the Eee because it’s enabling and has nothing but benefits for the makers. If you want Linux on a console, look for the up-and-comers… and write some decent game-development libs, so building a next-gen Indrema will be a no-brainer for the sort of people who’d make a Vii.

  41. Blog de Bernard Opic » Archives du Blog » Regroupement économique et coordination de la production des Logiciels Libres Says:

    […] française de l’article “Economic clustering and Free Software release coordination“. Auteur : Mark Shuttleworth – Traducteur : Bernard […]

  42. CyberCod Says:

    Honestly, this idea doesn’t sway me one way or the other… but I would like to point out that the 6-month release cycle seems to be hurting Ubuntu. Hardy has been the most problematic release I’ve used yet, and I would have expected the opposite, since it is an LTS release.

    While I admire your dedication to Open Source, Mr. Shuttleworth, I really feel like you’re going to burn out your developers with this artificially accelerated release cycle. I’d rather wait 9 months between releases and have everything work properly. I’ve personally found PulseAudio to be atrocious, and Firefox 3 beta 5 was very irritating until it was out of beta. I’ve gone back to Gutsy for the sole reason that I want to be able to record audio, and I just couldn’t accomplish this with PulseAudio. Removing Pulse left me with a severely scrambled system that was pretty much unusable from an audio standpoint.

    While I’m sure this sounds like I’m just ranting about Pulse, it is truly only the current iteration. Pushing yourselves to the wall to get things released on time only creates opportunities for disaster. It’s much easier to destroy a reputation than it is to build one.

    What flaws will be introduced with Ibex? Could they be avoided with a few more months of development time?

    If EVERY distro adopts the same release cycle, there may be some benefit. BUT ONLY if the release cycle is a sane one that all the distros can reasonably accommodate.

    Given that Linux is generally as stable as an operating system can get, your release cycle is short enough that I could feasibly never reboot except during version upgrades.

    I don’t need a new version every six months. I don’t know anyone that needs that.

    Calm down man, pace yourself. Until you do, you’ll never get the other companies to agree to synced cycles.

    Thanks for listening to one man’s opinion.

  43. CyberCod Says:

    @ A. Peon

    I don’t agree. Having a gaming console as powerful as the PS3 with a full-fledged OS on it would be awesome both for the distro used AND for Sony. Right now, gamers don’t feel that Sony is offering anything above and beyond what they can get from the competition. And in the case of online content management, the dashboard and whatnot, there’s a severe perception that Sony is slowly hanging itself.

    Putting a fully working OS on the PS3 would entice more people to buy one. Once the PS3 is owned, it opens up Blu-ray usage as well as all the proprietary games that just are not available on Linux. While there are a few really great games available for Linux, there really aren’t any that compare to current-day console titles. There’s also the problem of storage: a downloaded and installed Linux game would take a large chunk of the system’s space, so one couldn’t feasibly load up on a bunch of heavy titles. One could emulate games from previous consoles, though (so long as you own the title in question), which would enable the PS3 to rival the other consoles as far as classic gaming goes.

    People would still buy the regular PS3 titles because that’s where they’d find the good gaming.

    For years now the console makers have been touting the idea that gaming consoles would become the information center of the household. This is only possible with a full OS. With support for hi-def televisions as well as USB, a PS3 equipped with Ubuntu could actually be the only computer a household needed. Maybe not for some hardcore users, but for many it would be fine. It would sell a lot more PS3s, which would in turn sell a lot more PS3 games and Blu-ray discs.

    Also, keep in mind that Microsoft is their competitor as well, not just ours. Banding together with Linux seems like it would be a no-brainer: the enemy of my enemy is my friend. Not only would Sony better compete with the Xbox 360, they would also be putting up a large burning sign to the world that there is an alternative to Windows. Microsoft wants you to buy two machines: a computer running Vista and an Xbox 360. Sony could accomplish both with one machine – one that is extremely powerful and amazingly versatile.

    This idea doesn’t need naysayers. It needs someone with clout to pitch it to Sony.

  44. zelrik Says:

    Hi there,

    I think coordination between different GNU/Linux distributions is a very good idea. It will show the rest of the world that the GNU/Linux community knows what it is doing. Something I am wondering about is why Ubuntu has this 6-month gap between releases. It’s nice in some ways, but I see bad effects too – I am still recovering from my upgrade to Hardy Heron :/ . Is there a way to make this more flexible? For example, variable cycles ranging from two releases per year to one, with the choice between the two decided based on the size of the ongoing projects.

  45. Aesop’s fable on Standards, RMS and Selling Free Software, Release Coordination: links 25-08-2008 | Commercial Open Source Software Says:

    […] Economic clustering and Free Software release coordination – Mark Shuttleworth believes that for a stronger impact of open source on the global software market we need to coordinate the releases of major pieces of the free software stack. […]

  46. Socceroos Says:

    Mark, Ubuntu’s art team needs attention. I appreciate the work they’re putting in, but the end result is pretty stale. Nothing has changed that much since Dapper Drake.

    I’m not sure if it’s a lack of talent, manpower or leadership – but if you meant what you said in your keynote the other week, then something needs to be done about this.

  47. mehdi Says:

    Hi Mark,

    At LTS release time, I think it would be better for Ubuntu to make two releases: first the LTS, based on a bug-fixed version of the previous release, and second the normal release. That way the LTS would have fewer bugs and be more robust and stable… For example, in April 2008 you could have had two releases: 1. Ubuntu 8.04 LTS (based on Ubuntu 7.10) and 2. Ubuntu 8.04…

  48. Eduardo Willians Says:

    When will you write more about markets and macroeconomics, like that excellent post titled “Economic oversteering”? Thanks.

  49. Drew Kwashnak Says:

    One advantage to distributions not releasing at the same time is the Marketing buzz!

    It builds with each distro’s release, as people talk about “this” or “that” advantage (or not) that the newly released distribution has over the others.

    A staggered release cycle allows one distribution’s new release to create marketing buzz while another is working on its next release.

    On the technical side, it also allows a distribution with a particular focus (the desktop environment, KDE, the kernel, interoperability, etc.) to show off its achievements in a real-use scenario, from which the other distributions can take or leave the parts that interest them.

    I do, though, find the comparisons with the car industry and with tree seeding very interesting, and I think the idea is still worth looking into.


  50. Ricky Says:

    There’s no doubt in my mind that the stronger the “pulse” we are able to create, by coordinating the freezes and releases of major pieces of the free software stack, the stronger our impact on the global software market will be, and the better for all companies – from MySQL to Alfresco, from Zimbra to OBM, from Red Hat to Ubuntu.