On package creation and maintenance

I’ve recently started to wonder how the main Linux distributions handle the packages available in their repositories.

There have been some criticisms regarding the packages available in Ubuntu, for instance. People often complain that the version of their favorite software lags behind the official releases. The Tor project itself advises not to use the package available in the Ubuntu repositories, and WebUpd8 recently reported the same kind of problem for OwnCloud in Ubuntu.

I tried to look for more information on the Ubuntu wiki, as well as for other distros (like Arch Linux), but I don’t really understand how packaging and package maintenance work. In my mind, it’s a very time-consuming process that cannot really be fully automated (you need to be aware of new releases of the software you maintain, so I guess you need to read the dev mailing lists or something like that), and there are thousands of packages to maintain (for instance, the Arch Linux packages page lists 11609 packages for only 61 maintainers… that’s more than 190 packages to maintain per person!).

So… how does all this work? Is there a way to improve the process in the future, so that more up-to-date packages are present by default in popular Linux distros?

And could projects like Linuxbrew help in that regard, or would it just make the whole thing even more complicated in the end?

Leaving aside the explanation of why packages can’t be removed, I really didn’t like the corporate-like tone of Marc’s reply. He sounded really rude, more in the mood of “here is what it is, here is what I want you to do, and we are sorry for the users”.

I can provide some input here, at least from the Ubuntu/Debian side, but I think it maps to other distros too.

The major issue is that .deb and .rpm packages are installed as root (remember the infamous Shuttleworth comment about how “we have root”; this is what he was referring to).

When you create a .deb there is a set of maintainer scripts, and they run as root. These are scripts that run pre- and post-installation and can modify your system. When you create these packages you want to ensure they don’t do anything naughty, and in a collaborative community where anyone can conceivably become a package maintainer, you have to set a very high bar before you give anyone direct access to upload packages. As such, becoming a core-dev or MOTU in Ubuntu, or a DD in Debian, requires a long and comprehensive journey.
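To make that a bit more concrete, here is a minimal sketch of what one of those maintainer scripts can look like; the package name and paths are made up for illustration and are not taken from a real package:

```sh
#!/bin/sh
# debian/postinst for a hypothetical "foopackage" - this runs as root
# right after the package's files have been unpacked onto the system.
set -e

case "$1" in
    configure)
        # Anything in here can touch any file on the machine, which is why
        # uploaders have to be trusted: create a config dir, fix ownership...
        mkdir -p /etc/foopackage
        chown root:root /etc/foopackage
        ;;
esac

#DEBHELPER#

exit 0
```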

So, from the outset, these packages have a fairly small number of people maintaining them and there is always far more work than people, particularly as many packages require very specialist experience - you don’t want someone who knows GNOME going around packaging the kernel for example.

The next issue here is policy, and this points to why you get old packages in the archive. When a new version of Ubuntu or Debian ships, the archive is frozen. This means that no more updates or changes can go in, which keeps the archive as stable as possible.

So, as an example, imagine FooPackage version 1.0 goes into Ubuntu ready for the 14.10 release, at this point we are all good…you can get the latest and greatest FooPackage with a single click.

From that point onwards the archive is subject to the Stable Release Updates (SRU) policy. This basically means that the only updates to that package that can go in are security- and vulnerability-related. If FooPackage 1.0 turns out to have a major security issue, the package can be replaced with a newer version, but in general only the bugfix will go in. Features are not allowed in, so that fancy 1.2 version won’t get in…you have to wait until Ubuntu 15.04 for that.
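As a rough illustration of what that means for version numbers (the package and the numbers below are invented, and the exact convention varies a little), the archive ends up looking something like this:

```sh
# At 14.10 release time, the release pocket contains:
#   foopackage 1.0-1ubuntu1
# After a security fix lands as an SRU in -security/-updates:
#   foopackage 1.0-1ubuntu1.1   (still upstream 1.0, plus the backported fix)
# The 1.2 feature release stays out until the next Ubuntu release:
#   foopackage 1.2-1ubuntu1     (only in 15.04, never as an update to 14.10)

# You can see which pocket the installed version came from with:
apt-cache policy foopackage
```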

Thus, what happens is you get an archive that has generally very stable software but gets old quickly. This is one of the reasons why Ubuntu focused on the six-month release cycle - you only have old shit for six months and then you get a fresh batch. The problem is, since that model was invented, the app store became a thing and now people expect updates more often.

Ubuntu tried to alleviate this with PPAs in which you can run your own mini-archive, but these should be trusted about as much as downloading a random .exe off the Internet for Windows - someone could include a package that has a nasty maintainer script that hoses your system.
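For reference, adding a PPA really is a one-liner (the PPA name here is made up), and nothing vets what the packages in it do once they are installed:

```sh
# Add a hypothetical third-party PPA, refresh the index and install from it.
# The maintainer scripts in those packages still run as root on your machine.
sudo add-apt-repository ppa:some-developer/foopackage-daily
sudo apt-get update
sudo apt-get install foopackage
```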

As such, app authors often ship their own packages with newer versions and don’t recommend the in-archive versions. This is more common with desktop software as older server software is often preferred as it is more stable.

The solution to this in Ubuntu was the creation of .click packages. We spent time building a complete security layer in Ubuntu in which anyone can ship a click package that is insulated by that security layer and can’t do naughty things, but developers can update their packages whenever they want. This significantly increases the speed of getting new software to users. The problem is, at least today, this is only focused on apps written for Ubuntu with the Ubuntu SDK and doesn’t currently cover non-Ubuntu SDK software. I know click will be extended to cover this when Ubuntu fully converges on the desktop, but this is a lot of work.

Hope this helps explain things.

Thanks a lot, Jono, it’s very clear and I’ve learnt quite a few things!

So the maintainers check the new releases of each package they have to maintain and decide whether a new release is a bugfix/security fix (in which case they will import it) or a new-feature-only release (in which case they won’t), is that correct? I suppose it does happen that a piece of software is updated with both new features and bugfixes… in that case, what do the maintainers decide?

Yes, I remember my early days in engineering school when Debian was the shit (still is! it’s just that back in 2002 Ubuntu didn’t exist ;)) and it was revered for its stability… but of course, the first time I logged into one of the computers at school, I was like… errrr… wait, why are they using a FileZilla version from last year?

This sounds like a great idea, I just hope it won’t end like XKCD #927 :wink:
You say .click won’t require root. Does it mean they will install in $HOME?
In the blueprint page I read:

click-package is a simple way to package simple apps and is not designed to replace .debs

What does “simple apps” mean? Is software such as OwnCloud a simple app? What about Gimp or RawTherapee?

Anyway, thanks again for these details, Jono!

Again, the question is: why isn’t software with known vulnerabilities, which won’t be updated, simply removed? I see two options: 1) patch the vulnerabilities, or 2) remove the package. But here Marc has chosen neither, and we will ship a vulnerable package against the wishes of the application’s authors.

Who is Marc? I don’t see him mentioned anywhere in this thread.

Wondering exactly the same here… :slight_smile: Who’s that Marc?!

Ditto. Who is this Marc you speak of?

Marc Deslauriers, Ubuntu Security Engineer. See the mailing list reply [1]. Looks like Jonathan Riddell has come to the rescue [2].

Ubuntu carries outdated and vulnerable packages but refuses to remove them, and instead asks that either the author or a community member do the work. Why not simply remove the packages that you can’t actively maintain, especially when upstream is providing packages for all major distributions?

[1] https://lists.ubuntu.com/archives/ubuntu-devel/2014-October/038516.html
[2] https://bugs.launchpad.net/ubuntu/+source/owncloud/+bug/1384355

I know Marc well, he is a great guy.

As he says pretty clearly in his post, you can’t remove packages from a release pocket. If you do, you break people’s systems. As such, you have to issue a newer package that will trigger an upgrade.

With SRUs, full new versions rarely land; instead, specific patchsets are uploaded to fix issues. I will ask some members of the Ubuntu security team to see if they can share some further insight here.
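For anyone curious what “uploading a patchset” looks like in practice, here is a hedged sketch of the usual workflow for a source package in the 3.0 (quilt) format; the package name, patch name, CVE and bug numbers are all placeholders:

```sh
# Grab the source of the package as it shipped in the release.
apt-get source foopackage
cd foopackage-1.0

# Add the backported fix as a new quilt patch under debian/patches/.
export QUILT_PATCHES=debian/patches
quilt new fix-cve-2014-XXXX.patch
quilt add src/foo.c
# ...apply just the small upstream fix to src/foo.c, then record it...
quilt refresh

# Document the change, bump the version and build the source upload.
dch -i "Backport upstream fix for CVE-2014-XXXX (LP: #XXXXXXX)"
debuild -S
```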

Yeah, Debian approached this by having the stable, testing, unstable set of repositories, which works pretty well - you can run stable on servers, but most people on desktops would run testing or unstable. Ubuntu tried to take a simpler approach.
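For context, that stable/testing/unstable split simply maps to which suite your apt sources point at; a rough sketch (the mirror is the standard Debian one, and in practice you would edit /etc/apt/sources.list and dist-upgrade rather than casually mixing suites):

```sh
# Stable, the usual choice for servers:
#   deb http://ftp.debian.org/debian stable main
# Testing, which many desktop users track instead:
#   deb http://ftp.debian.org/debian testing main

# After switching the suite in /etc/apt/sources.list:
sudo apt-get update
sudo apt-get dist-upgrade
```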

Well this is not intended to be a standard…it is designed to serve Ubuntu’s needs pretty directly. It is of course Open Source though, so other distros could use it (with suitable adjustments for their distro).

With click the key point is that you have a security layer that protects the system, and all apps are required to indicate what policy they want.

It doesn’t run out of $HOME, it runs out of /opt.

When I say simple apps, I mean apps written specifically for the Ubuntu SDK. As such, all those apps run within the parameters of what .click can deliver.

For apps that don’t use the Ubuntu SDK, or other software such as daemons, this is more complex, and .click doesn’t currently support it.

I am not sure what the current plans are for non-Ubuntu apps - either the team will modify .click to support those apps, or those apps may continue to live in main/universe (and thus still have the issues you originally highlighted).

I apologize for sounding rude in that post, it wasn’t my intention. I simply wanted to explain what the possible alternatives were.

As a matter of policy, we can’t simply remove packages from the archive. Once a version of Ubuntu has been released, the release pocket containing the packages it shipped with becomes set in stone. We never regenerate the package list or modify it in any way. If we did that, we would likely break installation scripts and preseeds that people may be using. It also makes tracking changes harder.

For the -updates and -security pockets, I guess we could in theory remove packages, but I don’t think we’ve ever done that before, as it would result in users having packages installed that no longer exist in the repository. If we had simply removed the package, all OwnCloud users would have remained vulnerable, with the insecure version still installed. Removing a package does not remove it from deployed systems.

Simply removing the package would have left our users vulnerable, and that wouldn’t have been an acceptable solution.

If we simply remove the packages, our users would have remained vulnerable, so that’s not really an option. I then proposed three alternatives on how to fix this issue.

One thing that I would like to make clear is that the OwnCloud package is in the Universe repository, which means it gets updates from volunteers in the Ubuntu community. While I do care, in my spare time, for certain packages in Universe that are important to me, I am not an OwnCloud user, so I’m not really in a position to take over maintaining a package that I know nothing about and don’t use.

I asked for volunteers willing to invest time in solving this problem, and someone did step up.

I did get the package removed in time before Ubuntu 14.10 got released. For previous Ubuntu versions, as I’ve mentioned in my other replies, removing the packages would have left our users vulnerable to the security issues.

There are a few reasons why Ubuntu (and a lot of other distros) do this:

1- New versions may require updated libraries

If foo version 1.0 requires a library called libxyz 0.1, but the new foo version 1.2 requires libxyz 0.2, it becomes difficult to simply update the foo package to the new version without pulling in a whole bunch of newer libraries. And then you hit the issue that some other package in the archive may be incompatible with libxyz 0.2, etc.

Even if the libraries are versioned properly, just to get a point update on that foo package, you now have to test and perform quality assurance on all the other packages you’ve updated along with it. (A rough sketch of how such a version constraint is expressed in a package follows after reason 4.)

2- New versions may not be backwards compatible

Newer upstream versions may break backwards compatibility. For example, switching from PHP 5.5 to PHP 5.6 may cause your PHP application to break. For production deployments, this would be unacceptable. What would end up happening is that system administrators would stop updating their packages and would no longer get security updates.

3- New versions may introduce new bugs

There are a lot of cases where a new upstream version will not only fix 10 bugs, but its new features will also introduce 10 different new bugs. At some point, always upgrading to the latest version may become an irritant to users.

Of course, this depends on the software and how much quality assurance the upstream developer invests in it.

4- New versions require a lot more testing

As a security engineer, it’s a lot easier for me to test a two-line security patch to software that already shipped than it is to test a whole new version with a bunch of new features. Testing a new version is a lot more difficult than simply backporting small patches.
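To make reason 1 above concrete, here is a rough sketch of how that kind of dependency constraint shows up in a package; the foo/libxyz names follow the invented example and are not a real package:

```sh
# Excerpt from debian/control for the hypothetical foo 1.0 binary package:
#
#   Package: foo
#   Depends: libxyz0.1 (>= 0.1), ${shlibs:Depends}, ${misc:Depends}
#
# Rebuilding foo 1.2 against libxyz 0.2 turns that dependency into libxyz0.2,
# so shipping the new foo also means shipping (and QA-ing) the new library
# and checking everything else in the archive that links against it.
```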

Considering all the possible trouble mentioned here (and in many other places), I am always fascinated that all of my “unstable” / “alpha-to-nightly” systems keep working all the time… I’m using solely Gentoo and Arch for my private needs. The funny thing is, the systems I did encounter breakages on were Ubuntu, openSUSE and Debian. And the reasons were always that they had outdated packages (especially libs), but the software I need requires newer versions. So eventually I ended up mixing repos and getting stuck at some point. -.- That is so damn weird… and my personal reason to stay far away from “stable” systems.

Gentoo’s Portage, on the other hand, is so smart in terms of resolving package dependencies that I can in fact just sit back, relax, let it run and enjoy my working system. I’ve never had to reinstall it ever since I started in early 2010. I wish apt and zypper were capable of detecting issues instead of just hinting to the user that “something might break, are you sure…?” That always drives me nuts since I never really know, and the tools aren’t helping very much. ;D

On Arch I just have a working system out of the box and all new packages available very quickly. And again no issues, even the packages from the user repo (AUR) always build and work fine.

Basically, all distros have indirectly forced stable packages to be buggy. We simply don’t expect stable to be stable, and I wonder if upstream cares about producing stable releases anymore. If a few things slip by, who cares? Debian will drag it along for the next 5 years anyway.

Package management is seriously broken on Linux. There is so much waste of time of good people that could be spent on development.

How on earth have you reached this conclusion?

It is not broken at all. It works very, very well. The major difficulty is that most upstreams don’t care about how their software is packaged (they expect distros to do this) and thus you find fewer people in the distros who can pick up the slack.

If every upstream had someone who cares about the package on a given platform, this problem would go away.

This issue is not package management on Linux, it is an issue with people.

I expect stable to be stable. Anyone who maintains any sort of server wants things to be stable. Increasingly, I only use LTS releases of Ubuntu as my primary desktop, as the things I need either:

  • Don’t get many updates/releases (e.g. vim, pidgin, etc.)
  • Get updates at an acceptable pace (e.g. browsers such as Firefox and Chrome, which are updated frequently)

Any security updates I get are good, but the less stable my system is, the less I can rely on it. I’m on 14.04 now on my main workstation and 12.04 at home; neither gives me any trouble, and they have never crashed.

The main place where I require the latest versions of software is development on Python projects, where I need/want the latest and greatest of a specific package, but Python package/library management shouldn’t be done through the Linux package manager; it should go through an actual Python package manager like pip. Using technology like virtualenvs allows me to have the latest and greatest of these without needing a “bleeding edge” distro elsewhere.
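For completeness, that workflow looks roughly like this (the project path and package name are placeholders); everything lands inside the project’s own environment, so the system package manager never sees it:

```sh
# Create an isolated Python environment for one project and install into it.
python3 -m venv ~/projects/myapp/env      # or: virtualenv ~/projects/myapp/env
source ~/projects/myapp/env/bin/activate
pip install requests                      # latest release, independent of the distro's packaged version
```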

Sorry, but I cannot fully agree here. The package manager is responsible for keeping your filesystem clean, in a way, and should not be interfered with by other package managers. I would suggest interfaces offered by the “main” package managers (e.g., dpkg, pacman, zypper) to secondary package managers (like pip, npm, whatever) so that they won’t put files where the distro says they shouldn’t belong. But again, as @jonobacon already pointed out, this is a people problem more than a packaging problem.