Layered Distribution

10 thoughts
last posted Sept. 8, 2012, 7:16 a.m.

There's a repeated pattern in software distribution: a vendor (or other group) creates a platform that supports modular additions. They provide (or someone creates) a distribution mechanism for that platform.

Then a particular component gets so complicated that they decide they need to break it up a bit. There then comes a key decision point: do you use the existing distribution mechanism for the parent platform? Or do you create your own?

To complicate matters: what do you do when your meta-platform runs on multiple parent platforms?

If we start from the operating system level, an OS has two major plugin frameworks: user space applications and hardware drivers.

There are also two major levels of compatibility: source-level (aka API) compatibility, which only benefits users sophisticated enough to build their own binaries, and binary-level (aka ABI) compatibility, which is necessary to support end users without software build capabilities.

Linux mostly focuses on API compatibility. For drivers, this is due to the lack of a stable kernel ABI, while for user space, it's due to cultural rather than technical factors. From the outside, most distros appear to care more about critiquing the internal details of the vendor's software design than they do about delivering working software from vendors to end users.

There are a couple of notable exceptions to this: Android and Ubuntu. Android is a Linux distro that expects applications to be self-contained bundles independent of everything else. Ubuntu (via the Software Centre) is a distro that focuses more on getting working software from vendors to end users, and leaves it up to the vendors and end users to decide whether or not the four freedoms are important for their use case. So long as your stuff can install cleanly and doesn't interfere with the rest of the system, Canonical doesn't care if your application contains a copy of a library that is also used to implement parts of the underlying operating system and platform utilities.

Windows took the point of view of providing ABIs for both drivers and user space. This has resulted in some pretty awful growth in the Windows APIs, and meant MS took the blame for stability problems caused by buggy third party drivers.

However, what they gained from this is that when a piece of hardware doesn't work with Windows, most end users blame the hardware vendor, not MS.

Part of this was the OEM manipulation that the US DoJ eventually slapped them down for, but the after-effect remains in play: hardware vendors take on most of the obligation of making their stuff work with Windows, and MS make it as easy as they can for hardware vendors to provide binary drivers directly to end users.

On the hardware front, Apple go to the other extreme: there is no modular API for direct hardware interfaces. They own the full stack, from the external ports, all the way up to the UI, and, in the case of the dock connector on portable devices, even that is proprietary.

Their software delivery mechanisms are pretty locked down as well (almost completely so on iOS), but they keep hold of developers by still permitting them to "roll their own" when they really need to (hence the popularity amongst developers of systems like Homebrew).

Where it gets interesting is when we start looking at the cross-platform languages like Python, Perl, Ruby and Java.

Java goes to the extreme of "bundle all your dependencies". This is wonderful for portability, but makes rebasing a nightmare.

Python/Perl/Ruby et al (and I believe credit really goes to Perl here for blazing the trail with CPAN) all instead adopt the model of a central repo focused on that particular language. This is great when you only need to deal with a single language, but sucks once you have to start dealing with multiple languages. This is why we now see people resorting to bundling their JavaScript dependencies: the language-specific tools have no mechanism for dealing with a parallel dependency hierarchy of JavaScript libraries.
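
To make that concrete, here's a minimal sketch using setuptools (the project and file names are hypothetical): the Python dependencies can be declared against the language-specific repo, but there's no equivalent field for the JavaScript libraries the application needs, so they end up copied into the source tree and shipped as static data files.

    # setup.py - a minimal sketch; project and file names are hypothetical
    from setuptools import setup, find_packages

    setup(
        name="example-webapp",
        version="1.0",
        packages=find_packages(),
        # Python dependencies can be resolved through the language-specific
        # repo (PyPI)...
        install_requires=["requests"],
        # ...but there's no equivalent way to declare JavaScript libraries,
        # so they're bundled in the source tree as static data files.
        package_data={"example_webapp": ["static/js/*.js"]},
    )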

The reaction from many Linux distro folks to the idea of bundled dependencies is to simply stop listening while muttering "but, but, but, security!".

In a world where every dependency always maintained perfect backwards compatibility, they'd have a point. As it is, they still have a point; it's just not the only point worth considering. Every developer of complex applications knows two things:

  • even projects with strong backwards compatibility policies will occasionally introduce regressions in new releases (because no test suite is comprehensive)
  • many projects that provide essential functionality don't even have a strong backwards compatibility policy in the first place

These two points add up to a solid conclusion: rebasing dependencies is risky, and should be preceded by a period of testing on the application developer's part.

When a distro institutes a "no bundling" policy, they are directly interfering with the developer's architectural decisions. If you happen to choose a dependency that is also used by the distro for their own tools, then the distro wants to take over your rebasing decisions. This is hostile to both application developers (who get criticised for completely sensible architectural decisions) and to end users (who get applications that break unexpectedly, or that suffer indefinite delays in updates because there is a third party packaging process involved in the delivery of updates).
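
One compromise pattern that some projects adopt (sketched below, with all module names hypothetical) is to prefer the bundled copy the application was tested against, while leaving an escape hatch so a distro that insists on unbundling can point the import at the system-wide copy instead.

    # myapp/compat.py - a minimal sketch; all names are hypothetical
    try:
        # Prefer the vendored copy the application was developed and
        # tested against.
        from myapp._vendored import somelib
    except ImportError:
        # A distro that strips the bundled copy can fall back to the
        # system-wide installation instead.
        import somelib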

This all means that there will never be "one packaging system to rule them all". From a social perspective, there needs to be recognition that dependency bundling isn't inherently evil; it merely shifts the responsibility for providing prompt security updates when vulnerabilities are found in those dependencies to the application vendor.

From a technical perspective, packaging systems would ideally be designed such that they can play host to other packaging systems and also be hosted on other packaging systems. Much of the current effort in Python packaging relates to moving to a system that isn't driven by a Python script, but instead by static configuration data. This static data should then be more amenable to automatic translation into other packaging formats.
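
As a rough illustration of why that matters (the file name and field layout below are hypothetical, not any agreed standard), once the metadata is plain data rather than executable code, translating it into another packaging system's format becomes simple text processing.

    # translate_metadata.py - a minimal sketch; the metadata.cfg layout
    # shown here is hypothetical
    import configparser

    config = configparser.ConfigParser()
    config.read("metadata.cfg")

    name = config["metadata"]["name"]
    version = config["metadata"]["version"]
    requires = config["metadata"]["requires"].split()

    # Emit the skeleton of an RPM spec's dependency section as one
    # example of automatic translation into another packaging format.
    print("Name: python-{}".format(name))
    print("Version: {}".format(version))
    for dep in requires:
        print("Requires: python-{}".format(dep))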

There are also multiple efforts to change the way Linux distros themselves are structured, both to help differentiate user-focused applications like Firefox or Chrome from the core OS and to make it easier to have per-user configurations of platforms like Python. Way too many things that could be handled completely at the per-user level (as they are on Android) currently require root access to the whole system.

Some interesting projects in this space (at varying degrees of maturity and popular uptake):

There's another highly salient point in all this. We've now had a natural experiment running for more than two decades on the relative effectiveness of different distribution models for getting software from vendors to end users.

Users have proven they're willing to adopt new operating systems: for example, Mac OS -> Windows -> Mac OS X, as well as the rise of iOS and Android. So "people just can't handle change" doesn't explain the failure of the Linux desktop to become widespread.

Users have also proven they're willing to tolerate limited hardware compatibility, as shown by the rise of Apple. Sure, Apple's superior design skills make it easier to tolerate the fact that there is no choice in hardware vendor, but it's enough to show that broad hardware compatibility is a negotiable item.

There's one feature that all of the mass market operating systems (including the game consoles) have in common, though: an easy mechanism for application vendors to provide prebuilt applications with bundled dependencies (except for the core OS services) to end users.

So what do I see as the core difference between Ubuntu and other Linux distributions? Canonical have made a conscious decision to tolerate binary hardware drivers, and to allow the provision of binary applications with bundled dependencies through the Ubuntu Software Centre.

Many people see this as a betrayal of the principles that underpin free and open source software. In a certain sense, that's true: Canonical have deliberately placed the goal of providing a compelling user experience ahead of the goal of promoting the cause of free and open source software. They're not preaching to their end users or their vendors, they're leaving that discussion to others.

But that's a topic for a different thoughtstream (We're Only Part Time Tinkerers).