The Year Ahead: Software(-Based Infrastructure) Trumps Hardware

One of 2013's big stories is certain to be how software-based infrastructure continues to transform the datacenter. The topic is timely: mandates like the Federal Data Center Consolidation Initiative (FDCCI) and Cloud First are forcing federal agencies to reevaluate how they host, provision and manage their enterprise systems. The renewed cost pressures facing agencies today are an even bigger impetus; sustainment costs must come down for agencies to continue investing in innovation.

Probably the biggest change in the datacenter over the past five years has been the widespread adoption of virtualization to dramatically improve server utilization. While consolidation and rationalization remain important strategies, virtualization is the force multiplier technology that has allowed IT to keep pace with escalating demand.

In 2013, agencies need to extend this paradigm to their entire infrastructure, including storage, networking and security systems. By replacing proprietary platforms with software-based systems that run on commodity hardware, they can create a more automated, manageable and intelligent infrastructure. Beyond cost savings, the advantages include greater agility, faster provisioning, better fault tolerance, quicker recovery, and the ability to optimize performance for specific workloads.
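To make that concrete: in a software-defined model, capacity is requested through an API call rather than a hardware change order. The sketch below is a minimal illustration only, assuming a hypothetical provisioning endpoint (provision.example.gov) and request schema; it is not any particular vendor's API.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ProvisionVolume {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint -- stands in for whatever API your
        // software-defined storage layer exposes.
        URL url = new URL("https://provision.example.gov/v1/volumes");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // Declare the capacity you need; the software layer maps the
        // request onto pooled commodity hardware.
        String body = "{\"sizeGb\": 500, \"tier\": \"standard\", \"replicas\": 2}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Provisioning request returned HTTP " + conn.getResponseCode());
    }
}
```

The point is the shape of the interaction: infrastructure becomes something a script can request, audit, and tear down, which is what makes the automation and faster provisioning described above possible.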

Note that I haven't used the term "cloud" yet. It may be a subtle distinction, but cloud is really the business strategy enabled by this infrastructure. Features like shared data, consumption-based pricing and dynamic, highly elastic provisioning change how we meet our clients' requirements (yes, even internal clients), allowing for greater flexibility and personalization. The trade-off is that service-level agreements (SLAs) become more of a two-way street, with users assuming shared responsibility for defining appropriate expectations. For cloud to take off in 2013, educating users about the need for this kind of cost/benefit analysis must be a focus.

 

Managing More Dynamic Technology

What's clear is that the datacenter is becoming more dynamic. Beyond the infrastructure itself, new service-oriented and event-driven architectures and more widespread use of mobile and composite applications (aka point solutions) mean that systems are changing more frequently. With this in mind, here are a few issues to consider in 2013:

  • Don't Pass on PaaS – While often positioned as a starting point for cloud computing, Infrastructure-as-a-Service (IaaS) isn't significantly different from virtualized infrastructure, as it presents many of the same management challenges. Platform-as-a-Service (PaaS) is the more likely game-changer. With PaaS, you can create on-demand general support systems to quickly provision a variety of applications. PaaS lets you stand up new instances more readily, reduce complexity through standardization, and avoid vendor lock-in by staying agnostic about proprietary requirements.
  • Prepare for End-Point Diversity – With mobile accounting for up to half of new application development investment, there is a real need for mobile device management solutions that can deliver an optimal user experience (and the necessary governance) across any device. Compounding the challenge are the thousands of legacy applications that don't support touch interfaces. In most cases it isn't cost-effective to fully rearchitect these solutions; instead, consider client virtualization to give mobile users enhanced access to them.
  • Drive to the Right Metrics – One of the benefits of a software-based infrastructure is the enhanced ability to measure performance. At the most strategic level, this can allow agencies to replace their existing CapEx model with an OpEx one. Beyond cost, what other metrics matter to your users? That's one of the most important questions you can ask in 2013, because cost pressures will force tradeoffs. To be successful, you need to prioritize the objectives that are truly important to your users and reduce investment in those that are merely table stakes.
  • Data Soup Becomes Murky – One of the strategic benefits of cloud computing is the potential to consolidate and share data across applications for cost savings and improved data integrity. Every opportunity brings new challenges, though; here it is maintaining the data's contextual history – its pedigree and lineage – across many iterations. In some cases you will want to make this information readily available, but in others (e.g., national security, healthcare) you may not. NSA's open-source Accumulo platform is one example of how agencies may choose to work through this challenge (see the sketch after this list).
  • The War for Talent Takes Off – The number of STEM graduates hasn't kept pace with technology's growing prevalence in our society. A real shortage of qualified technologists exists, and it is likely to get worse before it gets better as technology becomes more complex and divergent. Competition for top-tier talent is fierce, and with a recovering economy it is no longer just between government and contractors but extends to many other industries. While this will affect some agencies' desire to insource, for others it will be a real challenge simply to keep the lights on.
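On the data-context point above: Accumulo's distinguishing feature is that every cell carries its own visibility expression, which is evaluated against a reader's authorizations at scan time, so access rules travel with the data rather than with the application. The following is a minimal sketch using the Accumulo Java client API; the instance name, credentials, table, labels, and values are illustrative assumptions, not a production configuration.

```java
import org.apache.accumulo.core.client.BatchWriter;
import org.apache.accumulo.core.client.BatchWriterConfig;
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.ZooKeeperInstance;
import org.apache.accumulo.core.client.security.tokens.PasswordToken;
import org.apache.accumulo.core.data.Mutation;
import org.apache.accumulo.core.data.Value;
import org.apache.accumulo.core.security.ColumnVisibility;

public class CellVisibilityExample {
    public static void main(String[] args) throws Exception {
        // Instance, credentials, and table name are illustrative.
        Connector conn = new ZooKeeperInstance("myInstance", "zkhost:2181")
                .getConnector("writer", new PasswordToken("secret"));

        // Each cell carries its own visibility expression; a scan returns only
        // the cells whose expression is satisfied by the reader's authorizations.
        Mutation m = new Mutation("patient-1234");
        m.put("record", "diagnosis",
              new ColumnVisibility("clinician&(hospitalA|hospitalB)"),
              new Value("...".getBytes()));
        m.put("record", "billing-code",
              new ColumnVisibility("billing"),
              new Value("...".getBytes()));

        BatchWriter writer = conn.createBatchWriter("records", new BatchWriterConfig());
        writer.addMutation(m);
        writer.close();
    }
}
```

The design choice worth noting is that the policy is attached to the cell itself, so when data from multiple systems is consolidated into one store, each source's handling rules survive the merge.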

Agencies must ultimately address two important questions: Can I turn my infrastructure into a competitive advantage? And if not, can I eliminate it? What we're seeing is that while hardware is becoming increasingly commoditized, the management layer is becoming more sophisticated. Investment and commitment here will pay ongoing dividends in the years to come.

 
John Conley
Director, Market Development

 





