
Data Center Predictions for 2017


As the end of the year approaches, I’ve been thinking about the trends we are seeing today and how they will impact data center purchase decisions and the storage industry in 2017. I’ve been following three industry shifts this year, and I believe they will have a major impact in 2017.

Cost of Capacity over Performance

For decades, data center managers have focused on the need for speed. As a result, storage vendors focused R&D investment on delivering the highest performance possible. Data centers needed more capacity, but application delivery requirements often drove the purchase of excess capacity to compensate for the performance limitations of spinning disks.

Widespread availability of flash storage quickly changed storage performance and eliminated the practice of over-provisioning HDDs for performance. Today, we are able to meet or exceed the storage performance requirements of most data centers. This was noted in a recent research report published by Coughlin Associates and Objective Analysis, “How Many IOPS Is Enough?”, which shows flat IOPS/GB requirements over the last year for the first time since the study started in 2012. This isn’t to say that performance requirements aren’t increasing, but rather that they are now scaling linearly with capacity requirements.

The ubiquitous availability of inexpensive flash storage that can address the majority of performance requirements enables the focus to shift from speed to capacity while radically improving storage density. In 2017, the key metric for IT purchases will shift from IOPS to $/GB, and data reduction will be widely used to drive down costs across primary and secondary storage, both on premises and in clouds.
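The shift from IOPS to $/GB is easy to see in arithmetic: data reduction divides the raw media cost by the reduction ratio achieved. Here is a minimal sketch of that calculation; the prices and the 4:1 ratio below are hypothetical placeholders for illustration, not figures from this article.

```python
# Illustrative only: effective cost per usable GB after data reduction.
# The raw cost and reduction ratio are hypothetical placeholders.

def effective_cost_per_gb(raw_cost_per_gb: float, reduction_ratio: float) -> float:
    """Raw media $/GB divided by the data reduction ratio (e.g. 4:1 -> 4.0)."""
    return raw_cost_per_gb / reduction_ratio

flash_raw = 0.50   # hypothetical raw flash media cost, $/GB
ratio = 4.0        # hypothetical combined dedupe + compression ratio

print(f"Effective cost: ${effective_cost_per_gb(flash_raw, ratio):.3f}/GB")
# -> Effective cost: $0.125/GB
```

The same arithmetic explains why data reduction matters for both primary and secondary storage: a higher-cost medium such as flash can undercut cheaper raw media once the reduction ratio is factored in.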

Public and Private Cloud Coexistence

In 2010 “Hybrid Cloud” emerged as a marketing term, intended to advertise existing legacy infrastructure’s ability to evolve to support cloud-style applications alongside new workloads that were moving to the public cloud.  As cloud sophistication grew, awareness of relative efficiency and cost from public cloud services such as AWS changed the discussion.  Public cloud provider success quickly applied cost and efficiency pressure to traditional private data centers. 

Turn the clock forward to 2016 and, despite AWS’ evident success, private clouds and hosted clouds still account for 75% of cloud infrastructure, according to 451 Research’s Market Monitor. More and more companies have adopted a “cloud first” policy for applications. While public clouds will remain a portion of the total data center environment, businesses are requiring IT to complete a cost comparison between their internal clouds (private and hosted) and public cloud services.

Successfully coexisting with public clouds will require private data centers to build or buy their own true cloud infrastructures that employ virtualization, commodity hardware, open-source software, and data reduction technologies much like those utilized by the public cloud providers.  With a true cloud strategy in place, private data centers will be able to squeeze more capital and operational costs out of their internal service offerings and coexistence will persist alongside public clouds.
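The internal-versus-public cost comparison described above usually comes down to amortized capital cost plus operating cost per GB-month versus a public provider’s list price. The sketch below shows the shape of that comparison; every figure in it is an invented placeholder, and a real TCO model would also include power, staff, networking, and egress fees.

```python
# Hypothetical sketch of an internal-cloud vs. public-cloud cost comparison.
# All dollar figures are invented placeholders, not real quotes or prices.

def monthly_private_cost(capex_per_gb: float, lifespan_months: int,
                         opex_per_gb_month: float) -> float:
    """Amortized private-cloud cost per GB per month:
    hardware cost spread over its service life, plus running costs."""
    return capex_per_gb / lifespan_months + opex_per_gb_month

private = monthly_private_cost(capex_per_gb=0.60,   # hypothetical $/GB hardware
                               lifespan_months=48,  # assumed 4-year amortization
                               opex_per_gb_month=0.005)
public = 0.023  # hypothetical public object-storage price, $/GB-month

cheaper = "private" if private < public else "public"
print(f"private: ${private:.4f}/GB-mo, public: ${public:.4f}/GB-mo -> {cheaper}")
```

The point of the exercise is not the specific numbers but the discipline: once IT can state its internal $/GB-month, the coexistence decision becomes a per-workload comparison rather than a matter of policy.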

The Open Software Defined Data Center

A decade ago, implementing a Software Defined Data Center (SDDC) required development from scratch – a feat that could only be performed by organizations with huge software engineering teams, like Amazon. Traditional IT organizations focused on supporting data center servers, switches and storage equipment as they had in the previous decade. Today this is changing with the wide availability of mature, open source software from Red Hat, Canonical and other vendors offering services that support the software defined data center. One example of an SDDC environment seeing wide adoption is OpenStack, which, according to the 451 Group, is expected to generate $1.8 billion in revenue in 2016 and grow to $5.7 billion by 2020.

Data Center managers reap huge economic advantages from vendor neutrality, hardware independence, and increased utilization, while still customizing for their own unique business requirements.   Mature open source technologies enable organizations to leverage hyper-scale economics in a more bespoke fashion, without giving up control to the more homogeneous public cloud environments. 

In 2016, the promise of increased data center efficiency led major private data centers across the globe into trials with open SDDC solutions.  By the end of 2017, you can bet that many of those same organizations will have moved these solutions into production and will be well on their way to reaping significant long-term benefits.

As 2017 approaches, capacity costs will dominate IT economics, public and private clouds will coexist, and Software Defined Data Centers will deliver efficient cloud economics. It’s going to be an exciting and eye-opening year!


Tom Cook
