Comment: Fragments of win
by Dj Walker-Morgan
Fragmentation is this month's word of the day, whether in Canonical's plan to develop and launch its own Mir display server, fragmenting the consensus around Wayland, or in Miguel de Icaza's tale of his journey away from a fragmented desktop Linux world. But if we step back and look at the bigger picture, fragmentation isn't just a part of the Linux story; it is, in many ways, core to Linux's power to bring free software to the world.
The Linux ecosystem will never, as a whole, be full of success stories. Any ecosystem that supports so many different projects, and that offers developers the ability, at little to no incidental cost, to produce software for even the smallest niche audience, is always going to be full of projects that may have succeeded or failed on their own terms but were never in any danger of becoming what the wider world would call successful. The real conundrum of Linux is that fragments of that ecosystem do become successful, though ironically, they tend to do so by becoming more distinctive fragments.
The most successful fragment of Linux, by far, is Android, which took the Linux kernel, and little else, from the ecosystem and built a phone and tablet operating system on top of it. Over the past few years, the changes made in the Android kernel have been progressively re-integrated, with new changes made to accommodate it. The work done on Android has also offered a leg up to both Ubuntu on phones and Firefox OS during their development.
Another fragment of Linux that has become dominant is in the realm of supercomputers and other high-performance computers. This is another arena where Linux is stripped down: optimised kernels run on generally headless systems, connected by fast networks into clusters of hundreds of similar or identical machines. This fragment has faced its own challenges and created its own solutions for managing large numbers of systems, optimising network connections, and handling tasks in parallel.
If you consider both of these successful fragments, the question is how they have been successful and what the rest of the Linux ecosystem can do to replicate that success. Both have focused on delivering what they needed to deliver to be part of a bigger task. That task could be upending the mobile phone business by bringing an open source operating system to as many phone makers as possible, or it could be working out what tomorrow's weather is going to be as quickly as possible. The important part is that, by focusing on the larger task, Linux has been bent to the needs of that task, and the Linux kernel's licensing has served to ensure that, no matter how far it is bent, the result of that bending is available to all.
Which brings us back to what many people are sitting in front of... Linux on the desktop. It's easy to specify a target for desktop Linux: providing a useful environment for developers to hack code and their own systems. Pretty much all Linux systems aimed at the desktop PC fulfil that requirement, but they can take a hundred routes to get there, from wrapping every component so the user can just drop it on a system, to dropping a bag of digital tools on the user's desktop and handing them instructions on how to build a Linux. There are differences in packaging, selections of desktops, and variations in admin tools, but all in all the overall structure has remained largely shared: Linux kernel, GNU tools and compilers, X.org display server, a touch of Samba, an Apache web server... the common consensus system. Each distribution is a light fragmentation of the others, each one offering a particular developer or group of developers what they feel they need in a distribution.
What has tended to be missing is a distinctive fragment, one that truly stands out and runs by its own rule book. The nearest thing to that has been Fedora, though its distinction lies in pioneering the evolution of Linux as a whole. But the "aggregate consensus" Linux is something a lot of people like: they choose a distribution of it that suits their needs, is the right size, has the right philosophy driving development, and is a good fit for them. And then they tend to stick with it and get involved.
One Linux distribution, which began life as an "aggregate consensus" Linux, wants to be that distinctive fragment: Ubuntu. Given a target of running on four different form factors by its leader, and aggressively setting out to achieve that, it has brought a fair share of criticism upon itself. Much of that criticism concerns how badly Canonical has, despite very visible community management, handled the transition from being an "aggregate consensus" Linux to what Mark Shuttleworth envisages.
But that is a side issue. Canonical can drive Ubuntu anywhere it likes in terms of functionality, usability and vision. Right now, it is setting out on a course that will put clear orange/brown water between it and other desktop Linux systems. To some, that roadmap appears to feature a leap over the Grand Canyon, with the company needing to land four different form factors within six months of each other. But that is Canonical's choice and challenge. If it makes it, and gains commercial adoption as a result of creating its unique selling proposition, then there's a good chance the result will be rich with code under free software licences that can, over time, be co-opted back into the mainstream Linux fold. If not, then anything that's good can be rescued from the canyon river for another day.
But it's not anyone's place to say that Canonical should not take its chosen path. If people have issues with that path, they have two options: one, find another distribution that fits them better; the other, fork Ubuntu from a point where it was still a good match for them and maintain that fork.
What you can't do is tell people not to fragment Linux. Without that power to explore the possible, Linux's DNA is diminished.