It’s Secure

Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.

Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user: if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful that way. Ubuntu is just as capable of catching malware as any other desktop OS out there if the user is dedicated enough; the dancing pigs problem is not solved here.

Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two reasons: the first practical, the second technical.

To completely butcher a metaphor: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer is “yes,” and yet that’s not quite how things work in practice. Or more simply put: when was the last time you saw a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?

Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.

It’s true that they’re soft targets – few machines run anti-virus software and there’s no other malware to fend off – but that does not seem to be driving any significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this creates; yet other than a few proof-of-concept trojan horses, the only time anyone seems to make a real effort to break into a Mac is to win one.

So I am going to call Ubuntu, with its even smaller user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and several years’ worth of history with the similarly situated Mac OS X says that’s not going to change. There just aren’t any credible threats to be worried about right now.

With that said, there are plenty of good technical reasons for why Ubuntu is secure, too; beyond being practically secure, it would also be difficult to break into the OS even if someone wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that can be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through such attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow, so not having any outward-facing services in the first place is an excellent design decision.
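For the curious, this claim is easy to verify from a terminal. As a quick sketch, netstat can list every listening TCP socket and the program behind it; on a default desktop install the list is empty or limited to loopback-only services (the CUPS printing daemon, for instance, binds only to 127.0.0.1):

    $ sudo netstat -tlnp    # list listening TCP sockets (numeric addresses) and owning processes

Anything bound to 127.0.0.1 is reachable only from the machine itself, which is consistent with there being nothing outward-facing to attack.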

Less enthusing, however, is that, in part because there are no services to expose, the OS does not ship with the firewall enabled. The Linux kernel has built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from Windows Vista, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a third-party package or configuring it via the CLI.

Operating System | Inbound | Outbound
Windows Vista | All applications blocked; applications can request an open port | All applications allowed; complex GUI to allow blocking them
Ubuntu 8.04 | All applications allowed; no GUI to change this | All applications allowed; no GUI to change this
Mac OS X 10.5 | All applications allowed; simple GUI to allow blocking them | All applications allowed; no GUI to change this

Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly as I leave Mac OS X: all incoming connections allowed. Nevertheless, I find myself scratching my head. Host-based firewalls aren’t the solution to all that ails computer security, but they are a good idea. I would rather see Ubuntu ship as Vista does, with an active firewall blocking incoming connections.
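For those who do want the Vista-like default, it can be set up from the command line. As a sketch using ufw, the CLI firewall frontend that 8.04 ships with (disabled), a default-deny inbound policy is only a couple of commands:

    $ sudo ufw default deny    # refuse unsolicited incoming connections
    $ sudo ufw enable          # turn the firewall on, now and at boot
    $ sudo ufw allow 22/tcp    # example: deliberately re-open a port (SSH)

Of course, none of this helps the average user, which is exactly why a GUI and a sane default policy matter.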

Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates its modern security measures, Ubuntu has no such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software never expected free rein of the system when running, and hence does not present a modern vulnerability. This is more an artifact of previous design than a deliberate feature, but it bears mentioning as a pillar of the overall security picture.
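The model in question is the owner/group/world permission set attached to every file. As an illustrative check (output abbreviated), a system file is readable by everyone but writable only by root:

    $ ls -l /etc/passwd
    -rw-r--r-- 1 root root ... /etc/passwd    # world-readable; only root may modify it

Software written against these rules from the start has no expectation of being allowed to scribble over the system.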

Moving on, there is an interesting element of Ubuntu’s design that makes it more secure, though I hesitate to call it intentional. Earlier I mentioned how an OS that doesn’t let a user install software isn’t very useful, and Ubuntu falls somewhat under this umbrella. Because the OS is built heavily around a package manager and signed packages, it’s not well geared towards installing software from outside the package manager. Depending on how it’s packaged, a downloaded application often needs to be manually marked executable before it can be run, significantly impairing a user’s ability to blindly run whatever they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit: it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.
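To illustrate, running a downloaded program typically means marking it executable first, along these lines (the file name is hypothetical):

    $ chmod +x ./some-downloaded-app.run    # manually flag the file as executable
    $ ./some-downloaded-app.run             # only now will the system run it

A double-click in the file manager won’t do it, and that extra deliberate step is precisely what stands between a curious user and a malicious download.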

Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute (NX) bit support helps prevent buffer overflow attacks, and Address Space Layout Randomization (ASLR) makes targeting specific memory addresses harder. The traditional *nix sudo mechanism keeps software running with user privileges unless the user specifically authenticates to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu comes with the AppArmor and SELinux security policy features, which allow the OS to be locked down further, although these are generally overkill for home use.
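As a quick illustration of how a couple of these mechanisms surface in practice: ASLR can be checked through the kernel’s sysctl interface, and system-level changes simply refuse to proceed without sudo authentication. A sketch:

    $ cat /proc/sys/kernel/randomize_va_space    # non-zero means ASLR is active
    $ apt-get install nmap                       # fails: regular users can't touch the package database
    $ sudo apt-get install nmap                  # succeeds after password authentication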

There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because anyone can check the source code for security vulnerabilities and fix them. By the same token, being able to see the source code means that such vulnerabilities cannot be completely obscured from public view.

It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of patched vulnerabilities in various programs included with Ubuntu; Firefox, for example, has been patched for vulnerabilities seven times now. There are enough of them that I don’t believe simply counting vulnerabilities is a good way to decide whether being open source has a significant impact on improving Ubuntu’s security. This also comes full circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?). Nevertheless, it’s my belief that being open source is a security benefit for Ubuntu, even if I can’t completely prove it.

Because of the aforementioned ability to see and modify every bit of code in Ubuntu and its applications, Ubuntu gains another security advantage: users can manually patch flaws immediately (assuming they know how), and official security updates are pushed out just about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday; while Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than test new patches constantly), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.
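In practice, picking up every published fix is a two-command affair; the security repositories are enabled by default, so this pulls in security patches along with everything else:

    $ sudo apt-get update     # refresh the package lists, security repositories included
    $ sudo apt-get upgrade    # install all pending updates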

Finally, there are certainly areas where Ubuntu’s security could stand to improve. I’ve already touched on the firewall situation, but sandboxing is the other notable weakness. A lot of work has been put into sandboxing Internet Explorer on Windows so that machines cannot get hit with drive-by malware downloads, and it has proven effective; both Internet Explorer and Google’s Chrome implement sandboxes using different methods, with similar results. Meanwhile Chrome is not ready for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.

Comments

  • Kakao - Wednesday, August 26, 2009

    Ryan, nowadays you don't need to dual boot. You can just set up a virtual machine. If you are a gamer, use Windows as the host and set up a Linux distro as a guest. If you have enough memory (4GB is very good), you can have both perfectly usable at the same time. I'm using VirtualBox and it works great.
  • VaultDweller - Wednesday, August 26, 2009

    "Manufacturer: Canon"

    I think you mean Canonical.
  • Ryan Smith - Wednesday, August 26, 2009

    It wasn't in our DB when I wrote the article; it was supposed to be added before it went live. Whoops.

    Thank you.
  • Proteusza - Wednesday, August 26, 2009

    I haven't been able to read the whole thing because I'm currently at work, but so far it seems good. Some people have been saying you should be testing 9.04, and I can see their point, but on the other hand, I agree that since 8.04 is the latest LTS release, it should still be pretty stable.

    Nonetheless, perhaps you could compare a later non-LTS release to a service pack for Windows? I mean, there is some new functionality and some fixes. Granted, new versions of Ubuntu contain a lot more new features than Windows service packs do.

    I agree that the 6-month release cycle is too fast. I don't develop for Ubuntu myself, but I imagine a lot of time is wasted on preparing for release twice a year. I mean, there's a lot of testing, bugfixing and documentation to be done, and I would think that if you only did that once a year, you would have more time for development. Although, I guess the more changes you make in a release, the more you should test, so maybe that's invalid.

    I've also never really liked the Linux filesystem and package manager idea. Granted, package managers especially have improved a lot lately, and personally I think we have Ubuntu to thank for that, with its huge focus on usability, which historically Linux hasn't cared about at all.

    I also don't like the over-reliance on the terminal/CLI. I don't like that there are certain things that can only be done with it. It's easier and faster for me to do things with a GUI, because we are visual creatures, and a GUI is a much better way of displaying information than plain text. I think until a lot of the Linux developers get over the idea that the CLI is "the only way to go", the GUI will stay underdeveloped. As I said, it's only recently that some Linux developers have actually bothered to get the various desktop managers up to scratch.

    The other thing I find interesting about Ubuntu is the nerd rage that some Debian developers exhibit towards it.

    Anyway... when 9.10 comes out, I would love to see your impressions of the difference.
  • R3MF - Wednesday, August 26, 2009

    I thoroughly approve of AT running Linux articles...

    However, I didn't bother to read this one, as anything from Q2 2008 is of zero interest to me now.

    May I suggest a group test, to be published around Xmas, of the following Q4 2009 distro releases:
    Ubuntu 9.10
    openSUSE 11.2
    Fedora 12 (?)
    Mandriva 2010

    That would be awesome AND relevant to your readers.
  • CityZen - Wednesday, August 26, 2009

    I was one of those waiting for this article. I do remember getting excited when it was promised back in ... (can't recall the year, sorry, it's been too long :) ). Anyway, the wait seems to have been worth it. Excellent article.
    A suggestion for part 2: install Linux Mint 7 (alongside Ubuntu 9.04) and see which of the problems you found in part 1 with Ubuntu 8.04 are solved in Linux Mint "out of the box".
  • captainentropy - Tuesday, September 1, 2009

    I totally agree! To hell with Ubuntu, Mint 7 is the best Linux distro by far. Before I settled on Mint I tried Ubuntu, Kubuntu, PCLinuxOS (my previous fave), Mepis, Scientific, openSUSE, Fedora, Slackware, CentOS, Mandriva, and Red Hat. None could come close to the complete awesomeness, beauty, out-of-the-box completeness, and ease of use of Mint 7.

    I'm a scientist, and so far I'm using it for sequence and image analysis.
  • haplo602 - Wednesday, August 26, 2009

    so I got to the page before installation and I have so many comments I cannot read further :-)

    I have been using Linux on and off as my main desktop system since Red Hat 6.0 (that's kernel 2.2, IIRC), so some 10 years, and my job is a Unix admin, so I am obviously biased :-)

    1. Virtual desktops - while this heavily depends on your workflow, they help organise non-conflicting windows so they don't occupy the same space. I used to have one for IM/email, one with just a web browser, one with my IDE and work stuff, and one for GIMP and Blender. While this is my preference, it helps to kill the notification hell that is Windows. I hate how Windows steals focus from whatever I am working on just because some unimportant IM event occurred.

    2. Package manager and filesystem - given my background, the Linux FHS is second nature to me. However, you failed to grasp the importance of the package manager here: it effectively hides the FHS from you, so you do not need to clean up manually after an uninstall. The only directories you should ever go into manually are /etc, your home dir, the system mount directory and whatever the log directory is. If you need to access other directories manually, then you are either a system developer, a programmer or too curious :-)

    Also, you can usually one-click install .deb packages and they appear in the package manager as usual; you just have to manage dependencies manually in that case. Repositories are nice because you set them up ONCE and then all your updates/future versions are taken care of.

    3. Missing executable icons - this has a lot more background to it, but it is a mistake to use Nautilus in the default icon mode. You basically cannot live without ownership/permissions displayed on a Unix system, and trying to hide this in any way in a GUI is a capital mistake; that's why a Windows Explorer-like file manager is not usable under Linux. Good old MC :-) Anyway, an executable file can be anything from a shell script to a binary file. You just have to have the correct launcher registered in the system and you can open anything. Basically the same as Windows, just not as GUI-friendly.

    4. NVIDIA/ATI drivers - this is a story in itself. Use NVIDIA if you want ease of use; use ATI if you want to learn about the kernel and X :-) Dig through phoronix.com for more info.

    OK, I will post more comments as I read further :-)
  • haplo602 - Wednesday, August 26, 2009

    so I read the whole article. I have some more comments :-)

    1. Installation - for me this was never a problem on any Linux distro I have used. My partition scheme does not change much, and partitioning is usually the trickiest part of the whole installation process. Try out the full Gentoo 3-stage installation if you want some fun (OK, it is not available via normal means anymore).

    2. Fonts - as you mentioned with codecs, there are software restrictions and licensing policies governing Linux distributions. MS fonts are licensed under different terms than GPL software (yes, even FONTS have licenses), so they are generally not included in Linux distributions by default.

    What I missed from the article is the amount of customisation you can do with a typical Linux distro. Ubuntu alone has 3 main variants, and you can mix and match them at will. You can even have all 3 installed and switch between the window managers by user preference.

    Since you did not like the package manager anyway, you missed out on the main Linux strength: application variability.

    From a common user's perspective, however, the article is quite correct. I would expect more from a seasoned Windows user and AT editor.
  • n0nsense - Wednesday, August 26, 2009

    Ubuntu 8.04 is a 14-month-old creature.
    Two versions have been released after it, and a third should arrive in October.
    In terms of Windows that's a short time, but for Linux it's a lot of time.
    I suggest your next review be done on Ubuntu 9.10 instead of 9.04 (which IMHO is better than 8.04 but still lacks some polish).

    As mentioned before, the advantage of CLI instructions is that they will work on any Desktop Environment (GNOME, KDE, XFCE etc.) as long as the task isn't related to the DE itself. Moreover, they will work across different versions (older/newer).
    For example, in Vista/7 I couldn't find Network Connections in the GUI. But who can stop me from typing "Network Connections" in Explorer's address bar?

    Sometimes the GUI changes, and even if only a little, most people will fail to follow screenshots. Not to mention that most desktops (on real geeks' computers) are so customized that they look too different. I'm not talking about icons or desktop backgrounds; I'm talking about panels (if any at all), docks, menus, context menus etc. In Linux almost everything can be changed, and old-school geeks who have had their Linux installations for years do these things, so each DE is probably unique. (I have had GNOME and application settings/tweaks for over 7 years; some of them have probably never changed.) The trick is that even when you reinstall the system, your personal settings may stay with you. (I jumped from Debian to Ubuntu to Gentoo, back to Ubuntu, then to Ubuntu x86_64 and finally to Gentoo x86_64.) After all this, I have not lost any user customization/settings. On the system level it's harder, since Debian and Gentoo are very different. All this gives you motivation to change and tweak things to make them better. Windows users can't really customize, and when they do, it's only valid until they have to reinstall/upgrade their OS. Since most of the Windows users I know reinstall at least once a year, after a few cycles they stay with the defaults for both the OS and their applications.

    Switching to Linux is not the easiest thing; it's usually not a "love at first sight" story. But if somehow you stay around and get to know it, you can't be separated afterwards :)
    Even on Windows 7 I feel handicapped in terms of usability and effectiveness/productivity. (And I spend more time in front of Windows than Linux computers.)
