Revisiting Linux Part 1: A Look at Ubuntu 8.04
by Ryan Smith on August 26, 2009 12:00 AM EST - Posted in Linux
It’s Secure
Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.
Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user – if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful that way. Ubuntu is just as capable of catching malware as any other desktop OS out there if the user is determined enough. The dancing pigs problem is not solved here.
Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two reasons: one practical, one technical.
To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer to that is “yes” and yet that’s not quite how things work. Or more simply put: when’s the last time you’ve seen a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?
Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.
It’s true that they’re soft targets – few machines run anti-virus software and there’s no other malware to fend off – but that does not seem to be driving any significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this creates; yet other than a few proof-of-concept trojan horses, the only time anyone seems to make a real effort to break into a Mac is to win one.
So I am going to call Ubuntu, with its even smaller user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and several years of history with the similar Mac OS X suggest that this is not going to change. There just aren’t any credible threats to be worried about right now.
With that said, there are also plenty of good technical reasons why Ubuntu is secure; beyond being practically secure, it would be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that can be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through such attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow, so not having any outward-facing services in the first place is an excellent design decision.
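This is easy enough to verify on a given install. As a quick sanity check (a minimal sketch, assuming the stock net-tools package is present, as it is on a default desktop install):

```bash
# List listening TCP/UDP sockets and the processes that own them.
# On a default Ubuntu 8.04 desktop, nothing should be bound to an
# externally reachable interface.
sudo netstat -tulnp
```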
Less encouraging about Ubuntu’s design choices, however, is that (in part because there are no services to expose) the OS does not ship with an enabled firewall. The Linux kernel has built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from how Windows Vista ships, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a third-party package or configuring it via the CLI.
| Operating System | Inbound | Outbound |
|---|---|---|
| Windows Vista | All applications blocked; applications can request an open port | All applications allowed; complex GUI to allow blocking them |
| Ubuntu 8.04 | All applications allowed; no GUI to change this | All applications allowed; no GUI to change this |
| Mac OS X 10.5 | All applications allowed; simple GUI to allow blocking them | All applications allowed; no GUI to change this |
Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly as I leave Mac OS X: all incoming connections allowed. Nevertheless, I find myself scratching my head. Host-based firewalls aren’t the solution to all that ails computer security, but they are a good idea. I would rather see Ubuntu ship as Vista does, with an active firewall blocking incoming connections.
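For anyone who wants Vista-like behavior today, a default-deny inbound policy takes only a few commands from the CLI. A minimal sketch using iptables directly (note that these rules do not survive a reboot without an init script to restore them):

```bash
# Permit loopback traffic and replies to connections we initiated...
sudo iptables -A INPUT -i lo -j ACCEPT
sudo iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# ...then drop all other inbound traffic, roughly Vista's default posture
sudo iptables -P INPUT DROP
```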
Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates modern security in Windows, Ubuntu does not have any such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software should not expect free rein of the system when running, and hence should not present a modern vulnerability. This is more an artifact of previous design than a deliberate feature, but it bears mentioning as a pillar of overall security.
Moving on, there is an interesting element of Ubuntu’s design being more secure, but I hesitate to call it intentional. Earlier I mentioned how an OS that doesn’t let a user install software isn’t very useful, but Ubuntu falls under this umbrella somewhat. Because the OS is based heavily around a package manager and signed packages, it’s not well-geared towards installing software outside of the package manager. Depending on how it’s packaged, many downloaded applications need to be manually assigned an executable flag before they can be run, significantly impairing a user’s ability to blindly run whatever they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit – it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.
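To illustrate with a hypothetical download (the file name here is made up), the sequence a user has to go through looks like this:

```bash
# A freshly downloaded program arrives without the execute bit set,
# so neither a double-click nor the shell will run it:
./downloaded-app.bin      # -> bash: ./downloaded-app.bin: Permission denied
# The user must deliberately mark it executable first:
chmod +x downloaded-app.bin
./downloaded-app.bin      # now it runs
```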
Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute bit support helps to prevent buffer overflow attacks, and Address Space Layout Randomization makes targeting specific memory addresses harder. The traditional *nix sudo security mechanism keeps software running with user privileges unless specifically authenticated to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu supports the AppArmor and SELinux security policy frameworks, which allow further locking down of the OS, although these are generally overkill for home use.
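Most of these protections can be confirmed from a terminal. A quick sketch (the paths are standard Linux kernel interfaces):

```bash
# ASLR: a non-zero value means address-space randomization is active
cat /proc/sys/kernel/randomize_va_space
# NX: the 'nx' flag should appear in the CPU feature list
grep -m1 flags /proc/cpuinfo | grep -ow nx
# sudo: everything runs with user privileges until explicitly elevated
whoami          # prints your user name
sudo whoami     # asks for your password, then prints "root"
```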
There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because it allows for anyone to check the source code for security vulnerabilities and to fix them. Conversely, being able to see the source code means that such vulnerabilities cannot be completely obscured from public view.
It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of potential vulnerabilities in various programs included with Ubuntu – Firefox for example has been patched for vulnerabilities seven times now. There are enough vulnerabilities that I don’t believe just counting them is a good way to decide if Ubuntu being open source has a significant impact on improving its security. Plus this comes full-circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it’s my belief that being open source is a security benefit for Ubuntu here, even if I can’t completely prove it.
Because anyone can see and modify every bit of code in Ubuntu and its applications, Ubuntu gains another security advantage: users who know how can patch flaws themselves immediately, and official security updates are pushed out about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday. While Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than test new patches constantly), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner than Windows does.
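In practice, picking up those patches is a two-command affair (or a click on the update notifier). A sketch from the CLI:

```bash
# Refresh the package index, then install every available update --
# security fixes arrive as soon as the maintainers publish them,
# not on a monthly schedule.
sudo apt-get update
sudo apt-get upgrade
```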
Finally, there are certainly areas where Ubuntu’s security could improve. I’ve already touched on the firewall; sandboxing is the other notable weakness. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot get hit with drive-by malware downloads, and it has proven to be effective. Both Internet Explorer and Google’s Chrome implement sandboxes, using different methods with similar results. Meanwhile Chrome is not yet ready for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.
195 Comments
ParadigmComplex - Wednesday, August 26, 2009 - link
I concur - while most of the article is quite good, Ryan really seemed to have missed quite a bit here. His analysis of it seemed rather limited if not misleading.
Not everything *has* to be a package - I have various scripts strewn around, along with Firefox 3.6a1 and a bunch of other things, without having them organized properly as .deb's with APT. The packaging system is convenient if you want to use it, but it is not required.
Additionally, Ryan made it seem as though everything must be installed through Synaptic or Add/Remove and that there were no stand-alone installers along the lines of Windows' .msi files. It's quite easy on Ubuntu to download a .deb file and double-click on it. In fact, it's much simpler than Windows' .msi files - there are no questions or hitting next. You just give it your password and it takes care of everything else.
The one area I agree with Ryan on is that there needs to be a standardized, easy, GUI fashion to add a repository (both the address and key) to APT. I have no problems with doing things like >>/etc/apt/sources.list, but I could see where others may. I suspect this could be done through a .deb, but I've never seen it done that way.
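For what it's worth, the CLI version of what's being described is only a few lines. A sketch with a made-up repository URL and key file:

```bash
# Append a (hypothetical) third-party repository to APT's source list
echo "deb http://example.com/ubuntu hardy main" | sudo tee -a /etc/apt/sources.list
# Import the repository's signing key so its packages verify
wget -qO- http://example.com/repo-key.asc | sudo apt-key add -
# Refresh the index so the new packages show up in Synaptic/apt
sudo apt-get update
```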
Ryan Smith - Wednesday, August 26, 2009 - link
Something I've been fishing for here and have not yet seen much of are requests for benchmarks. Part 2 is going to be 9.04 (no multi-Linux comparisons at this point, maybe later) and I'd like to know what you guys would like to see with respect to performance.
We'll have a new i7 rig for 9.04, so I'll be taking a look at a few system-level things (e.g. startup time) alongside a look at what's new between 8.04 and 9.04. I'll also be taking a quick look at some compiler stuff and GPU-related items.
Beyond that the board is open. Are there specific performance areas or applications that you guys would like to see (no laptops, please)? We're open to suggestions, so here's your chance to help us build a testing suite for future Linux articles.
cyriene - Monday, August 31, 2009 - link
I'd like to see differences between PPD in World Community Grid between various Windows and Linux distros.
I never really see AT talk about WCG or other distributed computing, but I figure if I'm gonna OC the crap out of my cpu, I might as well put it to good use.
Eeqmcsq - Thursday, August 27, 2009 - link
Cross platform testing is pretty difficult, considering there are a multitude of different apps to accomplish the same task, some faster, some slower. And then there are the compiler optimizations for the same cross platform app, as you mentioned in the article. However, I understand that from an end user's perspective, it's all about doing a "task". So just to throw a few ideas out there involving cross platform apps so that it's a bit more comparable (a rough timing harness is sketched after the list):
- Image or video conversion using GIMP or vlc.
- Spreadsheet calculations using the Open Office Calc app.
- Performance tests through VMware.
- How about something java related? Java compiling, a java pi calculator app, or some other java single/multi threaded test app.
- Perl or python script tests.
- FTP transfer tests.
- 802.11 b/g/whatever wireless transfer tests.
- Hard drive tests, AHCI. (I read bad things about AMD's AHCI drivers, and that Windows AHCI drivers were OK. What about in Ubuntu?)
- Linux software RAID vs "motherboard RAID", which is usually only available to Windows.
- Linux fat32/NTFS format time/read/write tests vs Windows
- Weren't there some thread scheduling issues with AMD Cool'n'Quiet and Windows that dropped AMD's performance? What about in Linux?
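A harness for this kind of task-based comparison could be as simple as timing each run. A sketch (the script name is a placeholder, and GNU time's -f flag is assumed to be available at /usr/bin/time):

```bash
# Run a given task three times and report wall-clock seconds for each;
# the same loop works for oggenc, a GIMP batch job, a Python script, etc.
for i in 1 2 3; do
    /usr/bin/time -f "run $i: %e seconds" python benchmark.py
done
```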
While I'm brainstorming, here are a few tests that are more about functionality and features than performance:
- bluetooth connectivity, ip over bluetooth, etc
- printing, detecting local/network printers
- connected accessories, such as ipods, flash drives, or cameras through usb or firewire
- detecting computers on the local network (Places -> Network)
- multi channel audio, multi monitor video
Just for fun:
- Find a Windows virus/trojan/whatever that deletes files, unleash it in Ubuntu through Wine, see how much damage it does.
Veerappan - Thursday, August 27, 2009 - link
I know you've said in previous comments that using Phoronix Test Suite for benchmarking different OSes (e.g. Ubuntu vs Vista) won't work because PTS doesn't install in Windows, but you could probably use a list of the available tests/suites in PTS as a place to get ideas for commonly available programs in Windows/OSX/Linux.
I'm pretty sure that Unigine's Tropics/Sanctuary demos/benchmarks are available in Windows, so those could bench OpenGL/graphics.
Maybe either UT2004 or some version of Quake or Doom 3 would work as gaming benchmarks. It's all going to be OpenGL stuff, but it's better than nothing. You could also do WoW in Wine, or Eve under Wine to test some game compatibility/performance.
Once you get VDPAU working, I'd love to see CPU usage comparisons between windows/linux for media playback of H.264 videos. And also, I guess, a test without VDPAU/VAAPI working. Too bad for ATI that XvBA isn't supported yet... might be worth mentioning that in the article.
You also might want to search around for any available OpenCL demos which exist. Nvidia's newest Linux driver supports OpenCL, so that might give you a common platform/API for programs to test.
I've often felt that DVD Shrink runs faster in Wine than in Windows, so the time to run a DVD rip would be nice, but might have legal implications.
Some sort of multitasking benchmark would be nice, but I'm not sure how you'd do it. Yeah, I can see a way of writing a shell script to automatically launch multiple benchmarks simultaneously (and time them all), but the Windows side is a little tougher for me (some sort of batch script might work). Web Browsing + File Copy + Transcoding a video (or similar).
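The Linux half of that shell-script idea might look something like this sketch (all the workload commands and file names are placeholders):

```bash
#!/bin/bash
# Launch three workloads in parallel and time the whole batch
time {
    cp big-test-file.iso /tmp/copy-test.iso &               # disk copy
    ffmpeg -i input.avi -y /tmp/transcode-test.mpg &        # video transcode
    wget -q -O /dev/null http://example.com/testpage.html & # network fetch
    wait    # block until all three background jobs finish
}
```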
Ooh... Encryption performance benchmarks might be nice. Either a test of how many PGP/GPG signs per second, or copying data between a normal disk partition, and a TrueCrypt partition. The TrueCrypt file copy test would be interesting to me, and would cover both encryption performance and some disk I/O.
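The signs-per-second idea is straightforward to rig up. A rough sketch (it assumes a default GPG keyring with a passphrase-less signing key, so the loop isn't stalled by prompts):

```bash
# Time 100 detached signatures of a small message
time for i in $(seq 100); do
    echo "benchmark payload" | gpg --batch --detach-sign > /dev/null
done
```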
One last suggestion: Folding@Home benchmarks. F@H is available at least in CPU-driven form in Windows/Linux, and there's F@H benchmark methodologies already developed by other sites (e.g. techreport.com's CPU articles).
Well, that's enough for now. Take or leave the suggestions as you see fit.
haplo602 - Thursday, August 27, 2009 - link
you are out of luck here ... linux does not compare to windows because they are both different architectures. you already did what you could in the article.
especially in a binary distribution like Ubuntu, compilation speed tests are meaningless (but Gentoo folks would kiss your feet for that).
boot up times are also not useful. the init scripts and even init mechanisms are different from distro to distro.
compression/filesystem benchmarks are halfway usable. on windows you only have NTFS these days; on linux there are like 20 different filesystems that you can use (ext3/4, reiser, jfs and xfs are the most used). also quite a few distros offer lvm/evms backends or software raid.
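Even so, a crude same-command comparison across filesystems is possible. A sketch using dd for sequential write throughput (fdatasync keeps the page cache from hiding the real disk speed):

```bash
# Write 1GB of zeroes and force it to disk before reporting the rate;
# run the same command on each filesystem under test
dd if=/dev/zero of=./ddtest.bin bs=1M count=1024 conv=fdatasync
rm ./ddtest.bin
```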
I do not think there is much benchmarking you can do that will help in linux vs windows, or even ubuntu vs windows, because the same benchmarks will differ between ubuntu versions.
the only usable types are wine+game vs windows+game, native linux game vs the same windows game (mostly limited to the unreal and quake engines), some povray/blender tests and application comparisons (like you did with the firefox javascript speed).
GeorgeH - Wednesday, August 26, 2009 - link
Not really a benchmark per se, but I'd be curious to know how the stereotypes of Windows being bloated and Ubuntu being slim and efficient translate to power consumption. Load and idle would be nice, but if at all possible I'd be much more curious to see a comparison of task energy draw, i.e. not so much how long it takes them to finish various tasks, but how much energy they need to finish them.
I know that'd be a very difficult test to perform for what will probably be a boring and indeterminate result, but you asked. :)
ioannis - Wednesday, August 26, 2009 - link
is there some kind of cross platform test that can be done to test memory usage? Maybe Firefox on both platforms? not sure.
By "no laptops", I presume you mean no battery tests (and therefore power, and as a consequence, heat). That would have been nice though. Maybe for those looking for a 'quiet' setup.
but yes, definitely GPU (including video acceleration) and the GCC vs Visual Studio vs Intel compiler arena (along with some technical explanation of why there are such huge differences)
ParadigmComplex - Wednesday, August 26, 2009 - link
If you can, find games that are reported to work well under WINE and benchmark those against the same games running natively in Windows. It'd be interesting to see how the various differences between the two systems, and WINE itself, could affect benchmarks.
Fox5 - Wednesday, August 26, 2009 - link
Number 1 use of Ubuntu is probably going to be for netbooks/low-end desktops for people who just wanna browse the net.
In that case, the browsing experience (including flash) should be investigated.
Boot up time is important.
Performance with differing memory amounts would be nice to see (say 256MB, 512MB, 1GB, and 2GB or higher). Scaling across cpus would be nice.
Ubuntu as a programming environment versus windows would be good to see, including available IDEs and compiler and runtime performance.
Ubuntu as a media server/HTPC would be good to see. Personally, I have my windows box using DAAP shares since Ubuntu interfaces with it much better than Samba. And as an HTPC, XBMC and Boxee are nice, cross-platform apps.
Finally, how Ubuntu compares for more specific applications. For instance, scientific computing, audio editing, video editing, and image manipulation. Can it (with the addition of apps found through its Add/Remove Programs app) function well enough in a variety of areas to be an all-around replacement for OSX or Windows?
Speedwise, how do GIMP and Photoshop compare doing similar tasks? Is there anything even on par with Windows Movie Maker?
What's Linux like game-wise? Do flash games take a noticeable performance hit? Are the smattering of id Software games and quake/super mario bros/tetris/etc clones any good? How does it handle some of the more popular WINE games?