
Reduce Cloud’s Highly Redundant Data


Storage is the foundation of cloud services. All cloud services – defined as scalable, elastic, on-demand, and self-service – begin with storage. Almost universally, cloud storage services are virtualized, and hybrid cloud architectures that combine on-premises resources with colocation, private, and public clouds result in highly redundant data environments. IDC’s FutureScape report finds “Over 80% of enterprise IT organizations will commit to hybrid cloud architectures, encompassing multiple public cloud services,…

Read more


Controlling Infrastructure and Software Costs in the Private Cloud


Organizations choose private cloud deployments over public when they want to combine the increased flexibility of the cloud model with the privacy and security of their own control. Private cloud is also more scalable, flexible, agile, and efficient – and easier to manage – than the traditional IT infrastructures of the past. As a result, IDC predicts $17.2B in private cloud infrastructure spending in 2017, including money spent for on-premises and hosted…

Read more


Data Efficiency in Public Clouds


Public cloud deployments deliver agility, flexibility, and elasticity, which is why new workloads are increasingly deployed in public clouds. Worldwide public IT cloud service revenue in 2018 is predicted to be $127B. It’s powerful to spin up a data instance instantaneously; however, managing workloads and storage still requires analysis, planning, and monthly provisioning. It would be extremely advantageous if public cloud storage capacity could automatically grow and condense to optimize…

Read more


Data Center Predictions for 2017


As the end of the year approaches, I’ve been thinking about trends we are seeing today and how they will impact data center purchase decisions and the storage industry in 2017. I’ve been following three industry shifts this year, and I believe they will have a major impact in 2017. Cost of capacity over performance: for decades, data center managers have focused on the need for speed. As a result, storage…

Read more

Why Data Reduction Belongs in the Linux OS


Data reduction technologies (compression and deduplication) have been around for many years. Initially applied to backup storage environments, data reduction has more recently been applied to primary storage, where it’s heavily leveraged in flash arrays. The latest and most innovative application for data reduction, however, can now be found in the OS, where it can be universally applied across IT environments to achieve greater data density and to meet cost…

Read more


Oracle Cloud Changes Everything – Just ask Larry Ellison


At Oracle OpenWorld last week, the company proudly proclaimed: “We will lead cloud SaaS, PaaS and now IaaS”. According to Larry Ellison, the #1 objective for Oracle is to be the lowest cost IaaS provider with the highest margin. As you may have guessed, their new target for leadership is IaaS and Amazon. You’ve got to admire Larry Ellison; he goes all in. Whether he is buying an island, taking…

Read more


The Age of IT Efficiency Has Arrived

| datacenterpost.com

Change is hard to explain, especially when you are in the middle of it. Take the massive shift to cloud computing occurring today. For quarters on end, major infrastructure vendors were missing their numbers and claiming that they were merely experiencing “temporary slowdowns” or “regionally soft demand.” Rather than challenge these claims, the financial and industry analysts fell in line, suggesting turn-arounds were “right around the corner.” But the turn-arounds didn’t happen. While the infrastructure giants were sailing directly into the wind, Amazon quietly expanded Amazon Web Services (AWS), Microsoft reinvented itself, VMware repositioned, and open source heavyweights such as Red Hat realigned products and service offerings around hybrid cloud business models. The Hybrid Cloud exploded right before our eyes – and with it has come the Age of IT Efficiency.

IDC reports 82% of enterprises have a hybrid cloud strategy and are actively moving workloads to the cloud.  How did the shift to cloud happen so fast?  Simple.  Just follow the money.

Jeff Bezos of Amazon has said, “Your margin is my opportunity.” The shift to cloud occurred, and an entire ecosystem (not just Amazon) aligned to support it. Open, software-defined infrastructure vendors like Red Hat, SuSE, and Canonical; white box hardware vendors such as Foxconn and Quanta; public cloud providers like Amazon, Microsoft, and Google; and services companies like Mirantis emerged to provide low-cost, highly efficient compute, network, and storage solutions. As a result, hybrid cloud demand surged. In fact, IDC estimates the hybrid cloud will account for 80% of IT spend by 2020. And owing to higher utilization rates, lower pricing, and greater density, hybrid cloud solutions cost a fraction of proprietary hardware products.

Hybrid cloud, or more specifically open hybrid cloud, is on its way to becoming the leading enterprise IT architecture, ushering in the Age of IT Efficiency.

Hybrid Cloud Changes Everything

The flight to IT efficiency started just 10 years ago with Amazon Web Services. Today, large public cloud services deliver extreme IT agility at significantly lower cost when compared to yesterday’s data centers.  Because of this flexibility and efficiency, many smaller organizations have moved entirely to the cloud to meet their day-to-day IT business needs.

While larger organizations use the public cloud for some projects, they face challenges around data proximity, security, and long-term project/investment lifecycles that make on-premises (and/or privately hosted) data center infrastructure the correct fit for other applications.

As Fortune 1000 companies explore hybrid cloud, they discover that they can substantially lower their costs if they simply “do it like Amazon.” Like Amazon, they can build hybrid cloud data centers that derive efficiency from four key technologies: Virtualization, Open Source Software, White Box Hardware and Data Reduction.

Virtualization drives up utilization rates by supporting more workloads per server. The net effect is that cloud IT organizations are able to increase the density of their data centers, saving substantial costs in real estate while gaining a tremendous amount of operational elasticity. The same hardware used at 2 p.m. for one workload can be repurposed at 3 p.m. for another.
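
To make the utilization math concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (per-workload utilization, the safe ceiling per host) is an illustrative assumption, not a number from this article:

```python
# Rough server consolidation math, with assumed figures.
workloads = 1_000
avg_utilization = 0.15      # assumed: each workload uses ~15% of one server
target_utilization = 0.75   # assumed: safe utilization ceiling per host

# Without virtualization: one server per workload.
bare_metal_servers = workloads

# With virtualization: pack workloads onto each host up to the ceiling.
vms_per_host = round(target_utilization / avg_utilization)   # 5 per host
virtualized_hosts = -(-workloads // vms_per_host)            # ceiling: 200

print(f"Bare metal servers:  {bare_metal_servers}")
print(f"Virtualized hosts:   {virtualized_hosts}")
print(f"Consolidation ratio: {bare_metal_servers / virtualized_hosts:.0f}:1")
```

Under these assumptions, 1,000 lightly loaded servers collapse onto 200 well-utilized hosts – exactly the kind of density gain described above.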

Open source software, and open collaboration via open source frameworks, have established a huge ecosystem of developers (spanning industries and academia) driving innovation in the massive scale-out infrastructures of the cloud data center. These projects are focused on scalability, reliability, security, and manageability. OpenStack and Linux itself are two great examples of open source projects that contribute tremendously to cloud implementations.

The availability of commoditized “white box hardware” facilitates the cloud revolution. In the past, traditional IT environments required “branded hardware” to ensure IT had the reliability and performance it needed to operate. Today, as industry analyst Howard Marks of DeepStorage.net notes, “If you care about whose equipment is in your cloud… you’re doin’ it wrong!” Advancements, both in commodity hardware components and software, have enabled cloud IT organizations to use lower cost white box hardware in the largest data centers on the planet. And every year the cost of those components drops as competitive market forces and technical efficiency gains drive better economics. This phenomenon has enabled cloud data centers to build extremely cost-effective and efficient IT infrastructures.

The final frontier of the hyper-efficient data center is data reduction. These data centers combine fast direct-attached storage solutions with modern object-based cloud storage facilities (low-cost, bulk data storage). As a result, software-defined hybrid cloud deployments benefit substantially from data reduction that combines inline deduplication, compression, and fine-grained thin provisioning to increase data center density and dramatically decrease compute and storage costs in the hybrid cloud. The net result of increased density is that more data is stored in less space and consumes fewer resources, reducing total costs.
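
As a rough illustration of how deduplication and compression combine, here is a toy Python sketch over fixed-size blocks. The 4 KiB block size and the fully redundant input are illustrative assumptions, and real inline data reduction engines are far more sophisticated:

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # assumed block size for this toy example

def reduce_blocks(data: bytes):
    """Deduplicate identical blocks by content hash, then compress them."""
    store = {}    # content hash -> compressed unique block
    layout = []   # logical view: one hash per logical block
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:              # dedup: store first copy only
            store[digest] = zlib.compress(block)
        layout.append(digest)
    physical = sum(len(b) for b in store.values())
    return layout, physical

# Highly redundant input: the same 4 KiB block repeated 100 times.
data = (b"A" * BLOCK_SIZE) * 100
layout, physical = reduce_blocks(data)
print(f"Logical size:  {len(data):,} bytes")
print(f"Physical size: {physical:,} bytes")
```

Even this naive version stores 100 logical blocks as a single compressed physical block; thin provisioning extends the same idea by never allocating space for blocks that were never written.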

From the storage perspective, hybrid cloud data centers are moving to multi-petabyte scale. At that level, the savings aren’t just about spending less on HDD or flash. Instead, the big savings from data reduction come from increased data density. With data reduction, optimizing the density of an existing data center is simple, fast, and far more compelling than bearing the cost of new data center space. This density increase also dramatically cuts the cost of power, cooling, and administration. Once the infrastructure is optimized for hybrid clouds with virtualization, open source operating software, and white box servers, the next step in efficiency is modular data reduction!

The Data Center Density Challenge

High density data centers are part and parcel of the new hybrid cloud infrastructure landscape. Data Center Journal’s What Does High Density Mean Today? points out the challenges of high density data centers, including questions about power and cost. Gartner predicted that by 2015, 50% of data centers would have high density zones enabled by high density storage arrays and servers. Are we already there?

Data Center Journal’s Is Cloud Computing Changing Data Centers? describes the economic drivers behind the data density issue, discussing IT infrastructure budget limitations, the variable cost of today’s capital expenditures for data storage, and business agility needs. It also highlights the power and cooling challenges as data centers continually expand to meet data storage needs that are beginning to reach critical mass.

Recent research from Peerless Business Intelligence highlights the importance of data reduction to high density data centers.  In Cloud Data Center Economics: Efficiency Matters, Lynn Renshaw discusses the need to “rise above” hardware and the physics of space and power/cooling and to look at the bigger picture of data center costs on a square foot basis.  While data centers are reaching power and cooling density limits, the cost per square foot of building data centers continues to increase and is becoming prohibitive for most businesses.  Today, there are over a billion square feet of data centers. As we continue to store more information, consume more storage and processor cycles and utilize more power, there are limited physical options available to increase data center density.

Renshaw takes us to the obvious next step: leveraging software, specifically data reduction software, to store more data in less space, thereby reducing the square footage demand. As her “back of the napkin” sample calculation demonstrates, cloud data centers can realize substantial space savings by leveraging data reduction software. Her example shows how a 100,000 square foot facility can save over $74 million in costs. Data reduction software not only reduces the amount of data stored, it also lowers the number of storage arrays and, as a result, the power/cooling costs and square footage they consume in a data center.

Taking her thesis a step further: data reduction increases data center density and, as a result, reduces the need for new data center construction! At today’s cost of $3,000 a square foot, that’s a compelling argument!
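
Renshaw’s exact model isn’t reproduced here, but a minimal Python sketch with assumed inputs shows how the arithmetic can land in the same ballpark as her figure. Only the 100,000 square foot facility and the $3,000 per square foot cost come from the text above; the storage floor fraction and the 6:1 reduction ratio are assumptions:

```python
# Back-of-the-napkin data center savings, with assumed inputs.
facility_sqft = 100_000        # from the article
cost_per_sqft = 3_000          # construction cost, from the article
storage_floor_fraction = 0.30  # assumed share of floor space for storage
reduction_ratio = 6            # assumed data reduction ratio (6:1)

storage_sqft = facility_sqft * storage_floor_fraction   # 30,000 sq ft
reduced_sqft = storage_sqft / reduction_ratio           #  5,000 sq ft
avoided_sqft = storage_sqft - reduced_sqft              # 25,000 sq ft
avoided_cost = avoided_sqft * cost_per_sqft             # $75,000,000

print(f"Avoided floor space: {avoided_sqft:,.0f} sq ft")
print(f"Avoided build cost:  ${avoided_cost:,.0f}")
```

With these assumed inputs the avoided construction cost comes out around $75 million, in line with the “over $74 million” savings cited above – before even counting the reductions in power, cooling, and administration.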

Renshaw states the obvious: “Cloud growth is inevitable, but let’s do it with a smaller footprint.”

Conclusion

IT infrastructure is at an inflection point and change is all around! We saw infrastructure giants under extreme business pressure from hyperscale cloud providers that grabbed market share because they delivered lower price points, simplicity, and business agility. The Age of IT Efficiency had arrived.

Led by Amazon, “the cloud” evolved rapidly as a business option for data storage and compute. As a result, open source software players such as Red Hat, Canonical, and Mirantis (to mention a few) rose in prominence and are seeing rapid growth because they deliver efficiency in cost and operation along with higher data density.

The hybrid cloud is now the implementation of choice for IT infrastructure because the combination of data in the public cloud and on-premises creates a solution that delivers increased agility at the lowest cost. This has been enabled by white box hardware, virtualization software, open source operating software, and data reduction software. IT infrastructure will be open, flexible, and highly efficient, as the Age of IT Efficiency is now upon us!

Read more


The only comprehensive data reduction solution for Linux at Red Hat Summit ’16


Yesterday we announced VDO for Hybrid Cloud, the only comprehensive data reduction solution for Linux, at Red Hat Summit ’16. VDO is now available to RHEL users and it’s so simple that you can install it and be up, running, and saving space in minutes. VDO lowers TCO and makes OpenStack, Atomic containers, and RHEV run better with typical 6x or more data reduction. It is the…

Read more


$1.5 Trillion Reasons Cloud with Data Reduction Wins


Experts say the hybrid cloud will house 80% of the world’s data by 2020 and that Linux will be the leading OS – making open source THE platform of choice for the cloud era. From our interactions with large enterprise customers as well as with telco/service providers, we can confirm that both trends are happening in the market. One metric we have for this is demand for data reduction in the…

Read more


The Age of IT Efficiency is here!


Change is hard to explain when you are in the middle of it. Take the massive shift to cloud computing that is occurring today. For quarters on end, major infrastructure vendors missed their numbers and spoke of “temporary buying slowdowns” or “regionally soft demand,” and financial and industry analysts fell in line and pointed to turn-arounds that were right around the corner. But the turnaround didn’t happen. While the…

Read more