2006 Operating System Vulnerability Summary

Overview

     Computer security is a precarious business, from both a product development and an administrative standpoint. Operating system vendors must constantly patch their software to keep consumers protected from the latest digital threats. But which operating systems are the most secure? A recent report by Symantec hints that Windows currently presents fewer security holes than its commercial competitors. To that, a typical consultant would respond "well, that depends," as security auditors generally take such statements with a grain of salt. It depends on the configuration of the hosts, the breadth of the included binaries and the scope of what "commercial competitors" entails. Differing interpretations lead to different conclusions: SecurityFocus, for instance, shows that overall vulnerabilities surged in 2006, while ISS (Internet Security Systems) reports that operating-system-specific exploits declined. The summarized coverage of 2006 vulnerabilities by SANS showed that the most prevalent attack vectors were not aimed directly at the operating systems themselves.

     This article, however, approaches the operating system as an entity in and of itself and analyzes only the vulnerabilities in its core features. To that end, vulnerability scans were conducted against 2006's flagship operating systems in various configurations to determine their weakness from the moment of installation throughout the patching procedure. From Microsoft, testing included Windows XP, Server 2003 and Vista Ultimate. Examinations of Apple's offerings included Mac OS9, OSX Tiger and OSX Tiger Server. Augmenting Apple's UNIX representation, security tests were also performed on FreeBSD 6.2 and Solaris 10. Rounding out the market share, Linux security testing included Fedora Core 6, Slackware 11, SuSE Enterprise 10 and Ubuntu 6.10. Before delving into the specifics of the vulnerabilities, it is helpful to understand the security scene of 2006.
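
     To make the scanning methodology concrete, below is a minimal sketch of the kind of baseline probing with which any vulnerability scan begins: attempting TCP connections to a handful of well-known service ports on a target host. It is purely illustrative; the audits described here would have relied on dedicated scanners (Nmap, Nessus and the like), and the target address and port list below are placeholder assumptions.

    import socket

    # Placeholder target; substitute a host you are authorized to scan.
    TARGET = "192.168.1.10"
    # A sample of well-known service ports (FTP, SSH, Telnet, SMTP,
    # HTTP, POP3, NetBIOS, IMAP, HTTPS, SMB, RDP).
    PORTS = [21, 22, 23, 25, 80, 110, 139, 143, 443, 445, 3389]

    for port in PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(1.0)
            try:
                s.connect((TARGET, port))
                print(f"port {port}: open")
            except OSError:  # refused, timed out or unreachable
                print(f"port {port}: closed or filtered")

     An open port by itself indicates exposure rather than weakness; a real scanner goes on to fingerprint the listening services and match their versions against a database of known vulnerabilities.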

Hacking and Zero Day

     The nature of malicious computer hacking has changed over the years, but one variable remains fairly fixed in availability and predictability: attack and penetration. Passive attacks have almost completely changed their genre, while active attacking rests on a simple premise - the host computer is vulnerable no matter how security conscious the end user is. As demand for underground, professional hacking rises, building and maintaining a network of zombie hosts takes more than waiting for users to infect their own systems. Active attacks are a necessity for this Internet subculture, and they require close attention to the latest, and most longstanding, remotely accessible vectors.

     This leads to Zero Day bug disclosure, a hot topic in security circles. Many security researchers argue that disclosing Zero Day vulnerabilities forces vendors to hasten patching, thereby shortening exposure time. Vendors counter that immediate disclosure, without time to develop a patch, creates an exposure window through which consumers are needlessly put at risk. The Zero Day exploit is certainly real: almost as soon as a vulnerability is disclosed, exploit code is released into the wild, and tools like Metasploit make automating such attacks even easier. It is little wonder that 2006 became the year these subversive techniques grew so widespread.

Evolution of 2006

     Hacking into computers is a practice that predates the Internet; the past decade of mass interconnectivity has merely hastened the pace. During the Internet's infancy, connected hosts were limited in number and tended to belong to academics, scientists or computer professionals. Even with modem-only access through such relics as Prodigy and AOL, Internet growth boomed in the late 90s. Today's ubiquitous broadband and common network-ready appliances have accelerated that growth further. All of these new hosts create a remarkably large sandbox for hackers to toy with and hide in.

     Script kiddies were (and still are) an embarrassment to the hacking community, but the "anybody can hack" tools they used pushed the development of more powerful, all-encompassing and fully automated hacking tools. Countless hackers and miscreants used these automated tools to constantly probe and penetrate the millions of computers left online 24/7. By 2004, the average unprotected computer was compromised in less than a minute, sometimes in as little as twenty seconds. Often a freshly installed host could be compromised by an automated tool before the real user even had a chance to log in for the first time. Rootkits were the buzzword of 2005 despite their long history; Sony BMG's ill-famed rootkit fiasco, shipped on its DRM-protected audio CDs, brought the term into the general public's eye. The rootkit's very nature, evading detection and remaining dormant until needed, is essentially what drove the software behind the zombie networks of 2006.
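
     The pace of this background scanning is easy to observe firsthand. The sketch below, a minimal honeypot-style listener and purely an illustration, accepts connections on a commonly probed port (445/SMB here, an arbitrary choice) and logs each unsolicited attempt; left running on an unfiltered, Internet-facing host, it will typically record automated probes within minutes.

    import socket
    from datetime import datetime

    # Listen on a commonly probed port and log each unsolicited
    # connection attempt. Binding below port 1024 needs administrative
    # privileges on most systems; any unused port demonstrates the idea.
    PORT = 445

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", PORT))
        srv.listen(5)
        print(f"listening on port {PORT}")
        while True:
            conn, (addr, sport) = srv.accept()
            print(f"{datetime.now().isoformat()} probe from {addr}:{sport}")
            conn.close()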

     In 2006, the usual menagerie of vulnerabilities and attacks was ever present. E-mail viruses, websites hosting malicious scripts, credential phishing and network worms continued to make their rounds. An enterprising niche, however, adapted concepts from every realm of malicious software to add a new twist to an old practice. Where the automated tools of the past left off, professional robot networks (aka "bot-nets") picked up the slack. The route to infection followed one of the typical, aforementioned mainstream attack vectors. Once a host was captured, the code used rootkit techniques to stay hidden from the more sophisticated, layered security packages that had finally become commonplace on home networks. Herein lies the twist. In the past, compromised computers were typically used as zombie gateways through which focused attacks were launched anonymously against targets for private information, corporate secrets and the like. They were also used as relays for e-mail spamming, keeping the delivery source dynamic in an effort to thwart spam detection. In 2006, the compromised hosts became a network of distributed computers with downloadable controls that let them act as a single, massive attack system. These bot-net services were offered for sale, typically for launching monstrous denial-of-service attacks or for advanced penetrations in which each attack comes from a different network segment to facilitate NIDS evasion.