It’s Secure

Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.

Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user – if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful that way. When the user is dedicated enough, Ubuntu is just as capable of catching malware as any other desktop OS out there; the dancing pigs problem is not solved here.

Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two kinds of reasons: practical and technical.

To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer to that is “yes,” and yet that’s not quite how things work. Or more simply put: when was the last time you saw a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?

Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.

It’s true that they’re soft targets – few machines run anti-virus software and there’s no other malware to fend off – but that does not seem to be driving any significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this breeds, yet other than a few proof-of-concept trojan horses, the only time anyone seems to make a real effort to break into a Mac is to win one.

So I am going to call Ubuntu, with its smaller-still user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and several years of history with the similarly positioned Mac OS X suggest that this is not going to change. There just aren’t any credible threats to be worried about right now.

With that said, there are also plenty of good technical reasons why Ubuntu is secure; beyond being practically secure, it would be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that could be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through such attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow, so not having any outward-facing services in the first place is an excellent design decision.
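
For the curious, this is easy enough to verify from a terminal. A quick sketch of how one might check for listening services, assuming the standard net-tools package is present:

    sudo netstat -tulnp   # list every TCP and UDP socket in the listening state, plus the process that owns it

If the claim above holds, the list should be nearly empty outside of local-only services.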

Less encouraging among Ubuntu’s design choices, however, is that – in part because there are no services to expose – the OS does not ship with an enabled firewall. The Linux kernel has built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from Windows Vista, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a third-party package or configuring it via the CLI.

| Operating System | Inbound                                                          | Outbound                                                      |
| Windows Vista    | All applications blocked; applications can request an open port | All applications allowed; complex GUI to allow blocking them |
| Ubuntu 8.04      | All applications allowed; no GUI to change this                 | All applications allowed; no GUI to change this              |
| Mac OS X 10.5    | All applications allowed; simple GUI to allow blocking them     | All applications allowed; no GUI to change this              |

Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly the way I leave Mac OS X set up – all incoming connections allowed – but I nevertheless find myself scratching my head. Host-based firewalls aren’t the solution to everything that ails computer security, but they are a good idea, and I would rather see Ubuntu ship the way Vista does, with an active firewall blocking incoming connections.
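
For anyone who does want that Vista-like behavior, it only takes a few commands. A minimal sketch using ufw, the “Uncomplicated Firewall” front-end to iptables that 8.04 includes but leaves disabled; the SSH rule is purely an example for anyone running a server:

    sudo ufw default deny   # silently drop unsolicited incoming connections
    sudo ufw allow 22/tcp   # example: explicitly permit SSH
    sudo ufw enable         # turn the firewall on
    sudo ufw status         # confirm the active rules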

Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates the modern Windows security model, Ubuntu does not have any such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software does not expect free rein of the system when running, and hence does not become a modern vulnerability. This is more an artifact of previous design than a feature, but it bears mentioning as a pillar of the OS’s overall security.

Moving on, there is an interesting element of Ubuntu’s design that makes it more secure, though I hesitate to call it intentional. Earlier I mentioned how an OS that doesn’t let a user install software isn’t very useful, and Ubuntu falls under this umbrella somewhat. Because the OS is built heavily around a package manager and signed packages, it’s not well geared towards installing software from outside the package manager. Depending on how it’s packaged, a downloaded application often needs to be manually marked as executable before it can be run, significantly impairing a user’s ability to blindly run anything they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit – it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.
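
To illustrate the hurdle, here is roughly what running a downloaded, non-packaged installer looks like from a terminal; the filename is purely hypothetical:

    chmod +x ~/Desktop/some-installer.run   # mark the downloaded file as executable first
    ~/Desktop/some-installer.run            # only now will it actually run

Compared to double-clicking an .exe on Windows, that extra step is just enough friction to trip up casual social engineering.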

Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute bit support blunts buffer overflow attacks by preventing code from being executed out of data pages, and Address Space Layout Randomization makes targeting specific memory addresses harder. The traditional *nix sudo mechanism keeps software running with user privileges unless the user specifically authenticates to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu comes with the AppArmor and SELinux security policy frameworks for further locking down the OS, although these are generally overkill for home use.
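
None of this requires any setup on the user’s part, but the curious can peek at a couple of these mechanisms from a terminal; a rough sketch:

    grep nx /proc/cpuinfo                    # the flags line will include "nx" if the CPU supports the No-eXecute bit
    cat /proc/sys/kernel/randomize_va_space  # a non-zero value means ASLR is active
    sudo apt-get update                      # sudo grants root rights for this one command only, then drops them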

There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because anyone can check the source code for security vulnerabilities and fix them. Conversely, there is the argument that being able to see the source code means such vulnerabilities cannot be hidden from attackers, either.

It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of potential vulnerabilities in various programs included with Ubuntu – Firefox, for example, has been patched for vulnerabilities seven times now. There are enough of them that I don’t believe simply counting vulnerabilities is a good way to decide whether being open source has a significant impact on Ubuntu’s security. This also comes full circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it’s my belief that being open source is a security benefit for Ubuntu here, even if I can’t completely prove it.

Because of the aforementioned ability to see and modify any and every bit of code in Ubuntu and its applications, Ubuntu gains another security advantage: users can manually patch flaws immediately (assuming they know how), and security updates are pushed out just about as rapidly as humanly possible. This is a significant distinction from Windows and its Patch Tuesday, and while Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than test new patches constantly), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.
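
In practice, keeping up with that rapid-fire patching amounts to two commands, which Update Manager also handles graphically:

    sudo apt-get update    # refresh the package lists, including the -security repositories
    sudo apt-get upgrade   # download and install any pending fixes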

Finally, there are certainly areas where Ubuntu’s security could improve. I’ve already touched on the firewall situation, but sandboxing is the other notable weakness here. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot get hit with drive-by malware downloads, and it has proven to be effective; Internet Explorer and Google’s Chrome implement their sandboxes using different methods, with similar results. Meanwhile Chrome is not yet ready for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.

Comments

  • Eeqmcsq - Wednesday, August 26, 2009 - link

    Thanks for your time spent on writing this article. I made the jump from Windows to Ubuntu (and Xubuntu for my older computers) back around 7.10 and 8.04, and I went through some of the headaches in adjusting to Ubuntu, but I eventually solved all of them and I'm quite settled in now.

    One comment about finding help in the form of command line instructions, rather than GUI instructions. GUI instructions for Ubuntu would not be useful for Kubuntu or Xubuntu, since they use different window managers. The command line solutions usually work for all three.

    Also, boot times were noticeably improved in 9.04. Perhaps you can run a quick retest on it.

    And you CAN install stuff when using the live CD. I installed a couple of temperature monitoring utilities when I was stress testing my motherboard.

    Finally, thanks again for writing such a thorough look into your Ubuntu experiences. It was a great read in seeing how far Ubuntu has come and what it still lacks.
  • fepple - Thursday, August 27, 2009 - link

    Yeah, you can set the APT sources to use a CD. There is an option for it under 'System' > 'Administration' > 'Software Sources', or you can edit the /etc/apt/sources.list file.
  • clarkn0va - Wednesday, August 26, 2009 - link

    [quote]since SMB is the predominant protocol for consumer file server gear, it’s a fair test of such use.[/quote]

    While this comment is not false, it presents a lazy approach to comparison; it's a one-sided contest, and Linux, pitted against Windows on home turf, doesn't stand much of a chance.

    You as much as acknowledge this in the article, so why not provide some counterpoint? For example, consumer file server gear, even if it supports SMB almost ubiquitously, is usually *nix-based. So instead of just showing Windows and Linux clients interacting with Windows servers, show them interacting with *nix servers as well. Do some NFS transfers too; NFS is well supported in consumer NAS these days.

    You also really missed the boat on the video drivers. 8.04 was not the first Ubuntu release to include the Restricted Drivers Manager (known simply as "Hardware Drivers" in later releases). This handy app will identify hardware, such as AMD and NVIDIA GPUs, that can take advantage of proprietary drivers, and will offer to install them via Synaptic (APT) with just a click of the mouse. No CLI, no headaches.

    Still, a thorough review, and generally well-researched. I'm looking forward to the 9.04 follow-up.

    Since you mentioned hardware HD decoding, I recommend taking a look at smplayer from the testing PPA (https://launchpad.net/~rvm/+archive/testing). Unfortunately vdpau doesn't work with the nvidia blobs in the default Ubuntu repos, but I believe there's a PPA providing vdpau-compatible blobs for anybody not wanting to do CLI installs.

    db
  • VaultDweller - Wednesday, August 26, 2009 - link

    [quote]While this comment is not false, it presents a lazy approach to comparison; it's a one-sided contest, and Linux, pitted against Windows on home turf, doesn't stand much of a chance. [/quote]

    This isn't Linux pitted against Windows on home turf, it's Linux pitted against Windows in the real world.
  • clarkn0va - Wednesday, August 26, 2009 - link

    Well, no doubt SMB is the dominant method of sharing files for consumers in general. Obviously comparing Linux to Windows makes sense in a world where Windows is the incumbent, but it's not the whole story.

    I hope Part 2 will address some of the objective benefits of Ubuntu, and not fall into the trap of "worse because it's not the same as Windows".
  • VaultDweller - Wednesday, August 26, 2009 - link

    I agree in principle, but there has to be a distinction between "Worse because it's not compatible with Windows," "Worse because it's not as easy as Windows," and "Worse because it's not the same as Windows." Die-hard *nix advocates tend to dismiss the first two as if they were the latter, and this tends to undermine their argument.

    Also, in some cases "Worse because it's not the same as Windows" can be a valid point, because the public has been trained to the point that the Windows way is the "intuitive" way. Of course, this isn't truly intuitive, as people who learned Linux first would find Linux methodologies more intuitive - but that's largely a moot point, as that's not the reality we live in today. You could say the same thing about the color red - in the western world, when we see red we can intuitively guess that it means Stop, or Warning, or Error, etc. The fact that this is not an understanding we're born with but rather a socially acquired intuition does not mean it would be any easier to suddenly change the color of traffic lights and expect people to adjust without problems.
  • Ryan Smith - Wednesday, August 26, 2009 - link

    All of the NAS gear I can get my hands on is either SMB only, or is a Time Capsule which is SMB + AFP. I don't have anything that does NFS, which isn't so much a comment on testing (I could always set up another box) as it is usefulness. NFS just isn't common on consumer gear; SMB is a more important metric if you're looking at file transfer performance, because that's what most people are going to be working with. This doesn't preclude doing NFS at a later time though.

    And the Restricted Drivers Manager is limited to the drivers in the Hardy repository, which means they're a year+ out of date.
  • amrs - Wednesday, September 30, 2009 - link

    Interestingly, if one checks the SmallNetBuilder NAS charts, it looks like out of 87 NAS devices, 49 have NFS. 56% in other words. And you say NFS isn't common? Really now? Seems a little biased to me.
  • ekul - Wednesday, August 26, 2009 - link

    While a lot of your issues have complicated solutions or lengthy technical backstories, I can solve your complaint about SMB shares mounted in Nautilus not being useful in non-GTK applications with one simple command (or, since you seem to hate commands, the GUI can do it too).

    theory: make a symlink to the directory Nautilus mounts to, so it can be easily accessed. Symlinks to directories or files are transparently (to users and applications) identical to the location they refer to. Windows doesn't have symlinks (only useless shortcuts), so it isn't surprising you weren't aware of this trick.

    howto: gvfs uses the directory /home/$USER/.gvfs as a mount point so link to it:
    ln -s ~/.gvfs ~/linkname

    howto gui: in Nautilus, go to your home folder, then choose View -> Show Hidden Files. Right-click on .gvfs and choose Make Link. Then you can rename the link to whatever you want and hide hidden files again.

    hint: symlinks are your best friend. My home dir is littered with links to places on the filesystem I visit a lot to avoid a lot of clicking/typing
  • Ryan Smith - Wednesday, August 26, 2009 - link

    I suddenly feel very humiliated...

    The symlink is a very elegant solution, I'm embarrassed I didn't think of that myself. It's a bit of a lousy solution in that there even needs to be a solution, but as far as things go that's a very insightful suggestion.
