Conquering IT Complexity

Any software developer will attest that building new and more sophisticated features into a program is addictive. The dilemma is: will software technology be held back by consumers who cannot, or will not, comprehend and use its capabilities to the fullest?

Many views, one conclusion

Information Technology is inherently complex, born of some of the most advanced technologies devised by the human mind, from electricity to the Pentium IV chip. Views on IT complexity vary, from the luminaries of the software industry to its everyday users, yet the general conclusion is the same: simple interfaces are necessary. The success of the iPod notwithstanding, Bill Gates thinks that people want multi-purpose devices rather than dedicated ones. He admits that the complexity built into such devices and software must be offset by a very simple user interface, but building that simplicity means adding layers of software complexity that are hidden from the user. Paul Saffo, a technology forecaster at the Institute for the Future in California, calls the resulting complexity phobia “featuritis,” and consumers seem to bear him out: according to a Microsoft survey, they use only 10% of the features in Microsoft Word. All too often, a seemingly simple computer task turns out to be overwhelmingly complicated and frustrating to carry out. “Everything I touch doesn’t work,” confesses John Maeda, a professor of computer design at MIT with a PhD in interface design, referring to the plug-and-play devices he tries to get his computer to recognize. What hope, then, is left for the mass of ordinary consumers?

Analogue vs. Digital

Who hasn’t come across a scene like this: your purchase comes to $8.69, you have a $20 bill, and you want to make the cashier’s life easier by tendering $20.69. The cashier, most likely a high-school or university student, reaches for a calculator to work out that the change is $12.00. This is the very generation some analysts, such as Pip Coburn of the Union Bank of Switzerland, label “digital natives” - people born into today’s technologies, as opposed to the rest of us “analogues,” whose best hope is to become “digital immigrants.” These digital natives, along with everyone who, willingly or not, must adapt to an increasingly self-serve technological world, are driving the demand for interface simplicity.

Strong drivers for simplicity can also be found in the corporate world. In the years following the IT boom, investment in new systems stalled, because earlier investments in “hot” technologies (e.g. ERP, CRM) had not delivered the anticipated benefits. A major reason for the shortfall is complexity: those systems were never fully understood by their users, nor fully implemented by the IT departments. The Standish Group, a research firm tracking corporate IT purchases, concluded that 66% of all IT projects either fail outright or come in over budget and behind schedule, largely because of complexity. Findings from Gartner and IDC, which studied the impact of complexity in terms of network downtime and of budgets spent fixing IT failures respectively, point to the same conclusion: IT must conquer complexity. “It’s the next huge thing,” says Andreas Kluth, a journalist at The Economist. The point is not lost on the IT industry, and its ambitions are high. In 2002 IBM began speaking about the “on-demand” business model and autonomic computing.
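To give a flavour of what the autonomic ambition means in practice, here is a minimal, purely illustrative Python sketch of one of its simplest ingredients, self-healing: a supervisor loop that notices when a worker process has died and restarts it without human intervention. The command it launches (worker.py) and the five-second check interval are invented for the example; the autonomic systems IBM and others have in mind go far beyond restarting a process, but the basic reflex is the same.

    import subprocess
    import time

    # Illustrative only: worker.py stands in for any service we want kept alive.
    SERVICE_CMD = ["python", "worker.py"]
    CHECK_INTERVAL = 5  # seconds between health checks

    def supervise():
        process = subprocess.Popen(SERVICE_CMD)
        while True:
            time.sleep(CHECK_INTERVAL)
            # Self-diagnose: poll() returns an exit code once the process has died.
            if process.poll() is not None:
                print("worker exited; restarting")
                # Self-heal: bring the service back up automatically.
                process = subprocess.Popen(SERVICE_CMD)

    if __name__ == "__main__":
        supervise()

Everything interesting in that loop is hidden from whoever relies on the worker; they simply see a service that never seems to fail.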
Other big firms followed with initiatives along the same lines: IT systems must become self-configuring, self-diagnosing, self-healing and self-optimizing. In other words, the aim is for the digital world to emulate the analogue one. Living organisms perform all these tasks in such a natural, apparently simple manner that we never stop to think about how complicated they really are - it just comes naturally.

Although many are skeptical of these initiatives, IT will probably follow this course. Other technologies have travelled the same road: the car was a far more complicated “animal” when it was born more than a century ago. Today we drive in blissful ignorance of that complexity, as long as we can manage the wheel and two pedals. Only when the car reached a certain level of interface simplicity did it conquer the market and become ubiquitous. The same happened with other staples of modern civilization and their distribution systems: electricity, water and food. The question now is how long it will take for IT to become “simpler.” For an industry in which a few months count as history, achieving in our lifetime what took nature billions of years would be nothing short of a miracle.