Archive for April, 2007
One of the very interesting issues du jour is the interaction between the three “legs” of “intellectual property”. Traditionally, those three are copyrights, patents and trademarks, and each comes with quite different laws and contractual precedents.
Recently, however, I’ve observed an increase in the cross-talk between them.
Classically, “software freedom” was about the copyright license associated with the code. But patents and trademarks are now being brought into the mix. For example, the discussion around Mozilla’s trademark policy directly linked the concept of “freedom” to trademark policy as much as to the code’s copyright license. And much of the very hard debate in the GPLv3 process is about linkages between the copyright license and relevant patents. Like it or not, the GPL is widely considered the reference implementation of freedom, so GPLv3’s approach will be, for many, definitive on the subject.
In the Ubuntu community we’ve recently gone through a process to agree a trademark policy. This was recently approved by the Community Council, and the final draft is here:
We’ve tried to strike a balance that keeps the trademarks of Ubuntu meaningful (i.e. if it says Ubuntu, it really is Ubuntu) but also recognizes the fact that Ubuntu is a shared work, in which many different participants of our community make a personal investment, and which they should have the right to share. So we’ve made explicit the idea of a remix – a reworking of Ubuntu that addresses the needs of a specific community (could be national, could be an industry like medical or educational) but preserves the key things that people would expect from Ubuntu, like hardware support and certification.
I’m sure this isn’t the last word on the subject, but I hope it’s a useful contribution to the debate, and would welcome other projects adopting similar licenses. For that reason, our trademark license is published under the Creative Commons Attribution-ShareAlike license (CC-BY-SA).
Ubuntu members, it’s time to expand the Community Council! After much discussion, we have five candidates for the CC-2007. In my capacity as project BDFL I’m nominating each of them for a two-year term, and need your approval to get them onto the council.
There are 5 separate polls, one for each nomination, and all Ubuntu Members (developers, advocates, artists, forums contributors, whatever their contribution) get an equal say.
I very much hope all of these candidates meet your approval! We will be able to cover more time zones and respond more quickly to community issues with this expanded governance board. That said, the CC is the most important body and you should feel empowered to vote accordingly if you feel a candidate is not suitable. It’s a secret ballot.
The doors are finally open to register for Ubuntu Live, our first global Ubuntu user conference. It is being hosted by O’Reilly Conferences in the prelude to OSCon in the same venue, and exists “to provide a meeting place for Ubuntu users, contributors, and partners – and the Ubuntu-curious”.
The list of sessions is already impressive (we had a phenomenal set of papers and presentations proposed, but had to whittle it down to this initial list). I’m sure there will be some additional sessions too. But the thing I’m most excited about is the list of speakers, so you will find me in the front row for every keynote on those dates.
Let’s meet up in Portland!
I’ve long believed there’s a general phenomenon that underlies the free software movement. It’s “volunteer-driven, internet-powered collaboration”. I think it will ultimately touch every industry that has any digital workflow. Let’s face it, that’s pretty much every industry.
The phenomenon has three key elements:
- Freedom-driven licensing. If you want the magic, you have to set it free, because it’s the possibility of doing things for themselves that motivates people to build on your work. Just exposing the “source” (whether that’s code or other content) isn’t as interesting. Microsoft will show you the source to Windows these days, but they won’t give you the freedom to remix it.
- Community. The net allows us to build a community of eyeballs and fingers based on personal interest rather than personal geography. It used to be that companies always had to do the best they could with local talent – or fly people in and deal with visa issues (that’s why Microsoft is a big proponent of greater H-1B visa allocations). Today we can find the best talent wherever it is, talent that is really personally interested in the underlying issue. And we call that talent pool “community”.
- Revision control. I’m much happier to give you read AND write access to my stuff, if I can know who changed what, when, and easily revert it. And if that revision control allows cheap branching, then there can be multiple, parallel efforts to solve a particular problem.
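The revision-control point can be made concrete with a small sketch. This is a hypothetical bzr session (it assumes bzr is installed; “main-project” and “experiment” are illustrative names, not from any real project):

```shell
# Cheap branching: spin up a local branch, with full history, to try an idea.
bzr branch main-project experiment
cd experiment
# ...hack on the idea...
bzr commit -m "Try an alternative approach"
bzr log --short          # see who changed what, and when

# Back in the main line, fold the work in - or throw it away cleanly.
cd ../main-project
bzr merge ../experiment
bzr revert               # undo the merge if the experiment didn't pan out
```

Because branching is cheap, several of these experiments can run in parallel against the same problem, and only the best one needs to be merged.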
Consider Wikipedia in this light: it clearly meets all three criteria. Its content has a license that gives you genuine freedom. There is a big community that takes a personal interest in that content (actually, multiple communities: one, which I call “the librarians”, wants to make sure the institution itself is healthy; the others form around specific content, given Wikipedia’s nature as a repository of knowledge). And of course every change is logged with some level of identity associated with it.
The Linux kernel is the same, as are most of the components we associate with a GNU OS.
But why stop at just code and knowledge? I’m a big fan of the work of Creative Commons, because they have taken to heart the idea of generalizing the licensing problem. And conferences like the Digital Freedom Expo in South Africa this week, which TSF has agreed to sponsor, are forums for discussing the ways in which these principles can apply to other domains. I would love to be part of the exploration of this phenomenon at all levels, but Ubuntu is plenty of work for one lifetime. Nevertheless I think there are real opportunities, both social and commercial, in this idea.
Incidentally, one of the reasons I picked the Bazaar revision control project for use in our infrastructure, and why I sponsor it, is that I think it will be great to have a revision control system which can be adapted to manage LOTS of different kinds of content, not just code. And the Bazaar guys abstract things to an appropriate level to be able to do just this. I’d like to be able to see a house I like, “bzr branch” the plans to that house, then share my modifications together with my experiences of living in that house, so that others can merge the ideas they think worked best. All we need is bzr embedded in an architectural drawing application.
A number of folks have asked about the new “radical freedom” flavour of Ubuntu that I hinted at in the announcement of work on Gutsy Gibbon.
Part of that initiative is focused on code freedom – going further than anybody else, though, beyond the CPU down to the level of the code running in firmware on your peripherals. We want to highlight the good work of hardware vendors who have completely embraced that idea. Of course, if you REALLY want freedom then you need to run that flavour on a Sun SPARC chip in an FPGA, in which case you would have the freedom to modify even the CPU itself, and everything running on it. Raising the profile of genuinely free hardware is one way I hope we can reach the point where we no longer choose to include any binary drivers in vanilla Ubuntu.
But a broader part of this “radical freedom” thrust is to explore freedom in other domains. If we ship a PDF, do we ship the source document? If we ship a JPG, do we ship the source artwork? If we ship a nicely edited video, do we ship the original, unedited recording so you can really remix it? If we ship music, do we ship the samples and the separated tracks?
Potent medicine indeed. I’m looking forward to seeing how far we can push the concept, just inside the Ubuntu project.
…to Debian on the release of Etch!
There are some ideas that are broken, but attractive enough to some people that they are doomed to be tried again and again.
DRM is one of them.
I was thrilled to see recently that the processing key for *all* HD discs produced to date has been discovered and published. I expect this to lead to the complete unraveling of the Blu-Ray and HD-DVD content protection schemes before even 1% of the potential market for those players has been reached. Good news indeed, because it may inspire the people who set up such schemes to reconsider.
We’ve been here before. The DVD-CSS encryption system was cracked very quickly – stylishly and legally so. Content owners – Hollywood Inc – were outraged and pursued anybody who even referred to the free software which could perform the trivial decryption process. They used the DMCA as a way to extend the laws of copyright well beyond their original intent. They behaved like a deer in the headlights – blinded by the perceived oncoming doom of a world where their content flows quickly and efficiently, unable to see potential routes to safety while those headlights approach. Their market was changing, facing new opportunities and new threats, and they wanted to slow down the pace of change.
Content owners think that DRM can slow down the natural evolution of a marketplace.
In the case of movies, a big driver of DRM adoption was the unwillingness of the industry to get out of the analog era. Movies are typically distributed to theaters on celluloid film, great big reels of it. It costs a lot to print those films and distribute them to the cinemas that will show them. So the realities of physical distribution have come to define the release strategy of most movies. Companies print a certain number of films, and ship those to cinemas in a few countries. When the movie run is finished there, those same films are shipped to new countries. This is why a movie is typically released at different times in different countries. It’s purely a physical constraint on the logistics of moving chunks of celluloid, and has no place in today’s era of instant, global, digital distribution.
Of course, when DVDs came along, content owners did not want people to buy the DVD in the USA, then ship it to Australia before the film was showing in cinemas there. Hence the brain damage that we call region encoding – the content owners designed DVD-CSS so that it was not only encrypted, but contained a region marker that is supposed to prevent it from being played anywhere other than the market for which it was released. If you live outside the US, and have ever tried to buy a small-run por^W documentary movie from the US, you’ll know what I mean by brain damage: it doesn’t play outside the US, and the demand in your region is not sufficient to justify a print run with your region coding, so sorry for you.
The truth is that survival in any market depends on your ability to keep up with what is possible. The movie owners need to push hard for global digital distribution – that will let them get movies into cinemas globally on the same day (modulo translation), the same way that you and I can see everything on YouTube the day it is uploaded.
The truth is also that, as the landscape changes, different business models come and go in their viability. Those folks who try to impose analog rules on digital content will find themselves on the wrong side of the tidal wave. Sorry for you. It’s necessary to innovate (again, sometimes!) and stay ahead of the curve, perhaps even being willing to cannibalize your own existing business – though to be honest cannibalizing someone else’s is so much more appealing.
Right now the content owners need to be thinking about how they turn this networked world to their advantage, not fight the tide, and also how to restructure the costs inherent in their own businesses to make them more in line with the sorts of revenues that are possible in a totally digital world.
Here are some reality bites:
- Any DRM that involves offline key storage will be broken. It doesn’t matter if that key is mostly stored on protected hardware, either, because sooner or later one of those gets broken too. And if you want your content to be viewable on most PCs you will have software viewers. They get broken even faster. So, even if you try to protect every single analog pathway (my favourite is the push for encrypted channels between the hi-fi and the speakers!), someone, somewhere will get raw access to your content. All you are doing is driving up the cost of your infrastructure – I wonder what the cost of all the crypto associated with HD DVD/Blu-Ray is, when you factor in the complexity, the design, and the incremental cost of IP, hardware and software for every single HD-capable device out there.
- The alternative to offline key storage is streaming-only access, and that is equally unprotectable. The classic streaming system, TV broadcast, was hacked when the VCR came out, and that was blessed as fair use. Today we see one of the digital satellite radio companies (Sirius or XM, I think) being sued by content owners for their support of a device which records their CD-quality broadcasts to MP3 players. Web streaming services that don’t allow you to save the content locally are a useless form of protection, easily and regularly subverted. And of course not everyone wants to be online when they are watching your content.
- It only takes one crack. For any given piece of content, all it takes is one unprotected copy, and you have to assume that anyone who wants it will get it. Whether it is software off a warez site, or music from an MP3 download service in Russia, or a file sharing system, you cannot plug all the holes. Face it, people either want to pay you for your content, or they don’t, and your best strategy is to make it as easy as possible for people who want to comply with the law to do so. That does not translate into suing grannies and schoolkids, it translates into effective delivery systems that allow everyone to do the right thing, easily.
- Someone will find a business model that doesn’t depend on the old way of thinking, and if it is not you, then they will eat you alive. You will probably sue them, but this will be nothing but a defensive action as the industry reforms around their new business model, without you. And by the industry I don’t mean your competitors – they will likely be in the same hole – but your suppliers and your customers. The distributors of content are the ones at risk here, not the creators or the consumers.
The music industry’s fear of Napster led them down the DRM rabbit-hole. Microsoft, Apple, SONY and others all developed DRM systems and pitched those to the music industry as a “sane” approach to online music distribution. It was a nice pitch: “All the distribution benefits of download, all the economic benefits of vinyl”, in a nutshell.
Of these contenders, SONY was clearly ruled out because they are a content owner and there’s no way the rest of the industry would pay a technology tax to a competitor (much as Nokia’s Symbian never gained much traction with the other biggies, because it was too tied to Nokia). Microsoft was a non-starter, because they are too obviously powerful and the music industry could see a hostile takeover coming a mile away. But cute, cuddly Apple wouldn’t harm anyone! So iTunes and AAC were roundly and widely embraced, and Apple succeeded in turning the distribution and playing of legal digital music into a virtual monopoly. Apple played a masterful game, and took full advantage of the music industry’s fear.
The joyful irony in this, of course, is Steve Jobs’ recent call for the music industry to adopt DRM-free distribution, giving Apple the moral high ground. Very, very nicely played indeed!
A few years back I was in Davos, at the World Economic Forum. It was perhaps 2002 or 2003, a few years after the dot-com bust. It was the early days of the iPaq, everyone at the conference had been loaned one. I remember clearly sitting in on a session that was more or less a CEO confessional, a sort of absolution-by-admission-of-stupidity gig. One by one, some well known figures stood up and told horror stories about how they’d let the inmates run the asylum, and allowed twenty-something year olds to tell them how to spend their shareholder capital on dot-com projects. This was really interesting to me, as I’d spent the dot-com period telling big companies NOT to over-invest, and to focus on improving their relationships with existing customers and partners using the net, not taking over the world overnight.
But the real kicker came at the very end, when the head of SONY USA, also responsible for its music division, Sir Stringer, stood up to make his peace. He gloated at length about how SONY had NOT invested in the dot-com, and thus how he felt he must be the only person in the room who had not been taken in by the kids. It was a very funny, very witty speech that earned a round of applause and laughter. I was left wondering whether he had any clue whatsoever how many songs would fit on the iPaq in his pocket, or how long it would take to download them. I suspected not. Of all the CEOs who had spoken that day, I thought he was the one most likely to be hit hard, and soon, by the digital train.
Sir Stringer is now CEO of SONY worldwide. Funny, then, that the SONY PS3 should have been delayed so that work could be completed on its DRM system.
Some bad ideas are just too attractive to die, once and for all.
Well done to the guys working on Bazaar, the distributed version control system, for their work on the latest release. This one includes a new working tree format that radically cuts the time of “bzr status” for larger trees. After installing the release I was prompted to type “bzr upgrade” whenever I used a tree in the old format, and the upgrade was smooth (glad for the backup it makes, but I’ve started deleting those since it all seems rock solid).
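For what it’s worth, the upgrade flow looked roughly like this on my trees (a sketch, assuming bzr is on your path; “~/src/myproject” is an illustrative path, not a real one):

```shell
cd ~/src/myproject
bzr upgrade    # converts the tree to the new format; leaves a backup directory behind
bzr status     # noticeably faster on large trees once the new format is in place
```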
There’s a page which shows the relative performance of different releases of Bazaar since 0.8, and I’m impressed that they have cut the time of “bzr status” by two-thirds since 0.8, which I think was about a year ago.
We picked Bazaar for Launchpad because of its excellent cross-platform support and robust handling of renames (even in extreme cases – renaming files inside directories that other people renamed, and merging frequently between branches of people who are radically restructuring a big tree). It’s never lost data for me, or blown up in a surprising way, and we use it heavily in a team of about 20 developers.
For the past year I’ve been urging the team to focus on performance, and these numbers suggest good results. Robert Collins tells me there’s about another 40% of low-hanging fruit on “bzr status”, but for 0.16 Martin Pool says the focus is almost entirely on the smart server, so that network operations (push to a remote repo, or merge from a remote branch, or commit to a remote branch) are much more efficient, especially for people on high-latency links. Looking forward to it!
Congratulations to the Launchpad team on today’s public beta!
Launchpad is built for Ubuntu, but it’s great to see other projects adopting it too, most notably in recent weeks Zope and Silva. I hope those projects find it much easier to collaborate with one another, and with other projects too.
If you’re curious about Launchpad, this new guide to Launchpad’s feature highlights is a quick read (with pictures) and gives you a good sense of how best to use it. It covers all the major applications that make up the service – translation, community support, planning, bug tracking, code hosting and team management.
And if you’re keen to test new features and willing to file bugs, you can join the Launchpad Beta Testers team, and live on the bleeding edge with nightly builds of the system. We’ll reopen the beta server (beta.launchpad.net) later in the week to test some additional new functionality.
My latest (small) contribution to the code was to help Salgado with the system that lets you set a custom logo for every page in the system that belongs to you or your project. Have fun decorating! Here’s the official announcement of the beta.