The high-profile outages that struck the NYSE, United Airlines, and the Wall Street Journal almost concurrently this past summer serve as a stark reminder of our universal dependence on software. Even the best-run companies in the world are vulnerable, because software is growing in complexity to the point where it is virtually unmanageable. Systems are so large, so byzantine, and in some cases so old that they are beyond the scope of any individual to understand. For companies that acquire or merge with other companies, the situation is even worse. Integrating two of these behemoth systems can take years, and the result is rarely an improvement. Typically, the complexity is multiplied.
I can’t overstate how harmful complexity can be to an organization. A complex system makes it very difficult to predict the circumstances that may result in an outage; we’ve seen that repeatedly. It undermines the security of the system, since security professionals can’t grasp the full attack surface or the interactions among all of its components. Perhaps most importantly, complexity stifles innovation, because every little change requires navigating a quagmire of code that is poorly understood and wasn’t so much designed as accreted.
The biggest source of complexity in today’s enterprise software is a mismatch between the problem to be solved and the technology used to solve it. We think of IT as a fast-paced arena where technology and expectations change daily. That’s true, but some of the core technologies in widespread use today were conceived over 30 years ago. We’re using rotary phones in the age of iPhones. We’re sticking with software designed in the 20th century for requirements that didn’t even come into existence until the 21st. It’s simply no longer a good fit.