DR Journal

Cloud Economics drive the IT Infrastructure of Tomorrow

| Welcome to Disaster Recovery Journal

The cloud continues to dominate IT as businesses make their infrastructure decisions based on cost and agility. Public cloud, where shared infrastructure is paid for and utilized only when needed, is the most popular model today. However, more and more organizations are addressing security concerns by creating their own private clouds. As businesses deploy private cloud infrastructure, they are adopting techniques used in the public cloud to control costs. Gone are the traditional arrays and network switches of the past, replaced with software-defined data centers running on industry standard servers.

Efficiency features make the cloud model more effective by reducing costs and increasing data transfer speeds. One such feature, which is particularly effective in cloud environments, is inline data reduction: a technology that lowers the costs of data both in flight and at rest. In fact, data reduction delivers unique benefits to each of the cloud deployment models.

Public Clouds

The public cloud’s raison d’être is its ability to deliver IT business agility, deployment flexibility and elasticity. As a result, new workloads are increasingly deployed in public clouds. Worldwide public IT cloud service revenue in 2018 is predicted to be $127B.

Data reduction technology minimizes public cloud costs. For example, deduplication and compression typically cut capacity requirements of block storage in enterprise public cloud deployments by up to 6:1.  These savings are realized in reduced storage consumption and operating costs in public cloud deployments.   

Consider AWS costs when employing data reduction:

If you provision 300 TB of EBS General Purpose SSD (gp2) storage for 12 hours per day over a 30-day month in a region that charges $0.10 per GB-month, you would be charged $15,000 for the storage.

With data reduction, that monthly cost of $15,000 would be reduced to $2,500. Over a 12-month period, you would save $150,000. Capacity planning is also a simpler problem when it is one-sixth its former size. Bottom line: data reduction increases agility and reduces the costs of public clouds.
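That arithmetic can be sketched as follows (a simplified estimate using the article's figures; real AWS billing prorates per second and varies by region and volume type):

```python
# Illustrative estimate of the EBS gp2 example above, with and without 6:1
# data reduction. Constants are the article's figures, not live AWS pricing.

PRICE_PER_GB_MONTH = 0.10    # example region price
PROVISIONED_TB = 300
HOURS_PER_DAY = 12           # volumes provisioned half of each day
REDUCTION_RATIO = 6          # 6:1 deduplication + compression

def monthly_cost(tb, reduction_ratio=1):
    """Prorated GB-month charge for the provisioned capacity."""
    gb = (tb / reduction_ratio) * 1000       # 1 TB counted as 1,000 GB
    fraction_of_month = HOURS_PER_DAY / 24.0  # 12 of 24 hours per day
    return gb * PRICE_PER_GB_MONTH * fraction_of_month

baseline = monthly_cost(PROVISIONED_TB)                    # about $15,000/month
reduced = monthly_cost(PROVISIONED_TB, REDUCTION_RATIO)    # about $2,500/month
annual_savings = (baseline - reduced) * 12                 # about $150,000/year
```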

One data reduction application that can readily be applied in the public cloud is Permabit’s Virtual Data Optimizer (VDO), a pre-packaged software solution that installs and deploys in minutes on Red Hat Enterprise Linux and Ubuntu LTS Linux distributions. To deploy VDO in Amazon AWS, the administrator provisions Elastic Block Store (EBS) volumes, installs the VDO package into their VMs and applies VDO to the block devices backing their EBS volumes. Since VDO is implemented in the Linux device mapper, it is transparent to the applications installed above it.

As data is written out to block storage volumes, VDO applies three reduction techniques:

  1. Zero-block elimination uses pattern matching techniques to eliminate 4 KB zero blocks

  2. Inline Deduplication eliminates 4 KB duplicate blocks

  3. HIOPS Compression™ compresses remaining blocks 


This approach delivers data reduction rates of up to 6:1 across a wide range of data sets.
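The three steps above can be modeled with a toy sketch (hypothetical helper names; the real VDO runs inside the kernel device mapper and uses content hashing rather than whole-block comparison):

```python
import zlib

BLOCK_SIZE = 4096                     # VDO operates on 4 KB blocks
ZERO_BLOCK = b"\x00" * BLOCK_SIZE

def reduce_blocks(blocks):
    """Toy model: bytes actually stored after the three reduction stages."""
    seen = set()                      # block contents already stored
    stored_bytes = 0
    for block in blocks:
        if block == ZERO_BLOCK:       # 1. zero-block elimination
            continue
        if block in seen:             # 2. inline deduplication
            continue
        seen.add(block)
        stored_bytes += len(zlib.compress(block))  # 3. compress what remains
    return stored_bytes

# Example: one zero block, one duplicate, two distinct compressible blocks
blocks = [
    ZERO_BLOCK,
    b"log line\n" * 455 + b"x",
    b"log line\n" * 455 + b"x",       # duplicate of the previous block
    b"metrics\n" * 512,
]
```

Repetitive data like logs compresses heavily, which is why such workloads see the largest reduction ratios.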

Private Cloud

Organizations see similar benefits when they deploy data reduction in their private cloud environments. Private clouds are selected over public clouds because they offer the flexibility of the public cloud model while keeping privacy and security under the organization’s own control. IDC predicts $17.2B in private cloud infrastructure spending in 2017, including on-premises and hosted private clouds.

One problem that data reduction addresses for the private cloud is that organizations implementing it can get hit with the double whammy of hardware infrastructure costs plus annual software licensing costs. For example, Software Defined Storage (SDS) solutions are typically licensed by capacity, so their costs are directly proportional to hardware infrastructure storage expenses. Data reduction decreases storage costs because it reduces storage capacity consumption: deduplication and compression typically cut capacity requirements of block storage in enterprise deployments by up to 6:1, or approximately 83%.

Consider a private cloud configuration with a 1 PB deployment of storage infrastructure and SDS. Assuming a current hardware cost of $500 per TB for commodity server-based storage infrastructure with datacenter-class SSDs and a cost of $56,000 per 512 TB for the SDS component, users would pay $612,000 in the first year. Because software subscriptions recur annually, over three years you would spend $836,000 for 1 PB of storage, and over five years, $1,060,000.

With 6:1 data reduction, the same configuration costs $176,667 for hardware and software over five years, a savings of $883,333. And that’s not including the substantial additional savings in power, cooling and space. As businesses develop private cloud deployments, they should make sure data reduction capabilities are included, because the cost savings are compelling.
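A minimal sketch of that TCO arithmetic, assuming 1 PB is counted as 1,000 TB and the SDS license covers it as two 512 TB units (these assumptions reproduce the article's $612,000 first-year figure):

```python
# Five-year private cloud storage TCO from the article's example:
# commodity hardware (one-time) plus capacity-licensed SDS (annual).

HW_COST_PER_TB = 500                 # commodity server storage with SSDs
SDS_COST_PER_512TB_YEAR = 56_000     # capacity-based annual license
CAPACITY_TB = 1_000                  # 1 PB counted as 1,000 TB (assumption)
LICENSE_UNITS = 2                    # two 512 TB units cover 1 PB (assumption)

def tco(years):
    hardware = CAPACITY_TB * HW_COST_PER_TB                 # paid once
    software = LICENSE_UNITS * SDS_COST_PER_512TB_YEAR * years
    return hardware + software

five_year = tco(5)                   # $1,060,000
with_reduction = five_year / 6       # the article scales the whole bill by 6:1
savings = five_year - with_reduction # about $883,333
```

Note that the article applies the 6:1 ratio to the entire bill; in practice the software line item would shrink in license-unit steps rather than linearly.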

When implementing private cloud on Linux, the easiest way to include data reduction is with Permabit Virtual Data Optimizer (VDO). VDO operates in the Linux kernel as one of many core data management services. It is a device-mapper target driver, transparent to persistent and ephemeral storage services, whether the storage layers above provide object, block, compute, or file-based access.

VDO – Seamless and Transparent Data Reduction


The same transparency applies to the applications running above the storage service level. Customers using VDO today realize savings of up to 6:1 across a wide range of use cases.

Some workflows that benefit heavily from data reduction include:

  • Logging: messaging, events, system and application logs

  • Monitoring: alerting, and tracing systems

  • Database: databases with textual content, NoSQL approaches such as MongoDB and Hadoop

  • User Data: home directories, development build environments

  • Virtualization and containers: virtual server, VDI, and container system image storage

  • Live system backups: used for rapid disaster recovery

Data reduction delivers cumulative cost savings across this wide range of use cases, which is what makes it so attractive for private cloud deployments.

Reducing Hybrid Cloud’s Highly Redundant Data

Storage is at the foundation of cloud services, and almost universally, data in the cloud must be replicated for data safety. Hybrid cloud architectures that combine on-premises resources (private cloud) with colocation, private and multiple public clouds result in highly redundant data environments. IDC’s FutureScape report finds “Over 80% of enterprise IT organizations will commit to hybrid cloud architectures, encompassing multiple public cloud services, as well as private clouds by the end of 2017.” (IDC 259840)

Depending on a single cloud storage provider for storage services can put SLA targets at risk. Consider the widespread AWS S3 storage errors of February 28, 2017, when data was unavailable to clients for several hours. That loss of data access may have cost businesses millions of dollars in revenue. As a result, more enterprises are pursuing a “Cloud of Clouds” approach, where data is redundantly distributed across multiple clouds for data safety and accessibility. Unfortunately, because of the data redundancy, this approach increases storage capacity consumption and cost.

That’s where data reduction comes in. In hybrid cloud deployments where data is replicated to the participating clouds, data reduction multiplies capacity and cost savings. If 3 copies of the data are kept in 3 different clouds, 3 times as much is saved. Take the private cloud example above, where data reduction drove down the costs of a 1 PB deployment to $176,667, a savings of $883,333 over five years. If that PB is replicated in 3 different clouds, the savings triple, to approximately $2,650,000.
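The replication math can be sketched under the same figures as the private cloud example (the article simply multiplies the per-cloud savings by the number of replicas):

```python
# Hybrid cloud savings sketch: per-cloud savings times the replica count.
# Figures come from the article's 1 PB private cloud example.

FIVE_YEAR_COST = 1_060_000    # unreduced 1 PB over five years
REDUCTION_RATIO = 6           # 6:1 deduplication + compression
REPLICAS = 3                  # the same PB kept in three different clouds

per_cloud_savings = FIVE_YEAR_COST - FIVE_YEAR_COST / REDUCTION_RATIO
total_savings = per_cloud_savings * REPLICAS   # about $2,650,000
```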

Permabit’s Virtual Data Optimizer (VDO) addresses the multi-site storage capacity and bandwidth challenges faced in hybrid cloud environments. Its data reduction capabilities have the same impact on bandwidth consumption as they do on storage, translating to up to a 6X reduction in network bandwidth consumption and associated cost. Because VDO operates at the device level, it can sit above block-level replication products to optimize data before it is written out and replicated.

Summary

IT professionals are finding that the future of IT infrastructure lies in the cloud. Data reduction technologies enable clouds – public, private and hybrid – to deliver on their promise of safety, agility and elasticity at the lowest possible cost, making cloud the deployment model of choice for IT infrastructure going forward.

Read more


Cloud Economics drive the IT Infrastructure of Tomorrow

| ITBusinessNet.com

Cloud Economics drive the IT Infrastructure of Tomorrow


Read more


Using Data Reduction at the OS layer in Enterprise Linux Environments

| Stock Market

Enterprises and cloud service providers that have built their infrastructure around Linux should deploy data reduction in the operating system to drive costs down, say experts at Permabit Technology Corporation, the company behind Permabit Virtual Data Optimizer (VDO). Permabit VDO is the only complete data reduction software for Linux, the world’s most popular server operating system (OS). Permabit’s VDO software fills a gap in the Linux feature set by providing a cost-effective alternative to the data reduction services delivered as part of the two other major OS platforms – Microsoft Windows and VMware. IT architects are driven to cut costs as they build out next-generation infrastructure with one or more of these OS platforms in public and/or private cloud deployments, and one obvious way to do so is with data reduction.

When employed as a component of the OS, data reduction can be applied universally, without the lock-in of proprietary solutions. By adding compression, deduplication, and thin provisioning to the core OS, any application or infrastructure service running on that OS can leverage data reduction. This ensures that savings accrue across the entire IT infrastructure, delivering TCO advantages no matter where the data resides. This is the future of data reduction – as a ubiquitous service of the OS.

“We’re seeing movement away from proprietary storage solutions, where data reduction was a key differentiated feature, toward OS-based capabilities that are applied across an entire infrastructure,” said Tom Cook, Permabit CEO.  “Early adopters are reaping financial rewards through reduced cost of equipment, space, power and cooling. Today we are also seeing adoption of data reduction in the OS by more conservative IT organizations who are driven to take on more initiatives with tightly constrained IT budgets.”

VDO, with inline data deduplication, HIOPS Compression®, and fine-grained thin provisioning, is deployed as a device-mapper driver for Linux. This approach ensures compatibility with a full complement of direct-attached/ephemeral, block, file and object interfaces. VDO data reduction is available for Red Hat Enterprise Linux and Canonical Ubuntu Linux LTS distributions.

Advantages of in-OS data reduction technology include:

  • Improved density for public/private/hybrid cloud storage, resulting in lower storage and service costs
  • Vendor independent to function across hardware running the target OS
  • Seamless data mobility between on-premises and cloud resources
  • Up to six times lower IT infrastructure OpEx
  • Transparent to end users accessing data
  • Requires no modifications to existing applications, file systems, virtualization features, or data protection capabilities

With VDO, these advantages are being realized on Linux today. VDO deployments have been completed (or are currently in progress) with large telecommunications companies, government agencies, financial services firms and IaaS providers who have standardized on Linux for their data centers. With data reduction in Linux, enterprises achieve vendor independence across all Linux-based storage, increased mobility of reduced data and hyperscale economics. What an unbeatable combination!

Read more


Addressing Bandwidth Challenges in the Hybrid Cloud


Any application infrastructure that relies on a single data center is only as safe as that data center’s physical resources and the competence of its staff.  Witness the recent S3 outage at Amazon. When you choose to deploy in a single public cloud, you are delegating infrastructure management to your provider. When you’re exclusively running in-house, private cloud infrastructure, you’re entrusting that management to your own organization.  Either way mistakes…

Read more


Reduce Cloud’s Highly Redundant Data


Storage is the foundation of cloud services. All cloud services – delineated as scalable, elastic, on-demand, and self-service – begin with storage. Almost universally, cloud storage services are virtualized and hybrid cloud architectures that combine on-premise resources with colocation, private and public clouds result in highly redundant data environments.  IDC’s FutureScape report finds “Over 80% of enterprise IT organizations will commit to hybrid cloud architectures, encompassing multiple public cloud services,…

Read more


Hybrid Cloud Gains in Popularity, Survey Finds

| Light Reading

The hybrid model of cloud computing is gaining more popularity in the enterprise, as businesses move more workloads and applications to public cloud infrastructures and away from private deployments.

Those are some of the findings from RightScale’s annual “State of the Cloud” report, which the company released Wednesday. It’s based on interviews with 1,000 IT professionals, with 48% of them working in companies with more than 1,000 employees.

The biggest takeaway from the report is that enterprises and their IT departments are splitting their cloud dollars between public and private deployments, and creating demands for a hybrid approach.

“The 2017 State of the Cloud Survey shows that while hybrid cloud remains the preferred enterprise strategy, public cloud adoption is growing while private cloud adoption flattened and fewer companies are prioritizing building a private cloud,” according to a blog post accompanying the report. “This was a change from last year’s survey, where we saw strong gains in private cloud use.”

Specifically, 85% of respondents reported having a multi-cloud, hybrid strategy, and that’s up from the 82% who reported a similar approach in 2016. At the same time, private cloud adoption dropped from 77% in 2016 to 72% in 2017.

In the survey, 41% of respondents reported running workloads in public clouds, while 38% said they run workloads in private clouds. In large enterprises, those numbers reverse, with 32% of respondents running workloads in public clouds, and 43% running workloads within private infrastructures.

“It’s important to note that the workloads running in private cloud may include workloads running in existing virtualized environments or bare-metal environments that have been ‘cloudified,’ ” according to the report.

When it comes to adopting cloud technologies and services, there are fewer barriers and concerns this year compared to 2016. The lack of resources and expertise to implement a cloud strategy was still the top concern.

In addition, the report notes that at every cloud expertise level, the “Top 5 Challenges” indicate substantial concern with managing costs. One vehicle that can help manage costs is applying data reduction technologies to your cloud deployment. Permabit VDO can be applied to public and/or private clouds quickly and easily, enabling cost reductions of 50% or more in on-premises, in-transit and public cloud deployments.

Read more


Future Software-Defined Datacenters Defined by Abstraction and Hardware Commoditization

| informationweek.com

The emergence of agile digital business has changed the way we interact with technology and services, and defined new ways of building datacenters and converged infrastructures. The “as-a-service” concept has also been implemented in virtualized infrastructures to boost automation and flexibility without hampering performance or adding to costs.

Software-defined datacenters (SDDC) are the newest model for building, managing and operating large pools of physical resources without worrying about interoperability between hardware vendors or even hypervisors. Abstraction is key to hyperconverged infrastructures as it allows software to simplify operations and manage complex infrastructures.

Converged vs. Hyperconverged Infrastructures

Converged infrastructures (CI) allowed for computing, storage, networking and virtualization to be built into a single chassis, and hyperconverged infrastructures (HCI) builds on top of that by tightening the interaction between all these components with an extra software-defined layer. However, converged infrastructures don’t usually allow much flexibility in configuration, as the purchased hardware is usually vendor-dependent and additional components are normally managed separately.

Hyperconverged infrastructures (HCI) are built to be hardware-agnostic and focused more on building on top of converged infrastructures by adding more components, such as data deduplication, WAN optimization and inline compression. The ability to manage the entire infrastructure through a single system and common toolset enables infrastructure expansion through simple point-and-click actions and checkboxes.

Separating physical hardware from infrastructure operations means that workloads and applications can work together more tightly than in legacy or converged infrastructures. At the same time, having a storage controller that acts as a service running on a node means that directly attaching data storage to physical machines is no longer necessary — any new storage will be part of a cluster and configured as part of a single storage pool.

Software-Defined Datacenters

While most of today’s organizations are probably not ready to adopt software-defined datacenters – and those that do probably fit into the visionary category – IT decision makers need to understand the business cases, use cases and risks associated with SDDCs. Because hyperconvergence is at the heart of the software-defined datacenter, IT decision makers should proceed with caution when implementing it, making sure it delivers the best results for their business.

Gartner predicted that SDDCs will be the future of digital business, with 75 percent of top enterprises considering it mandatory by 2020. We’ve already seen hybrid cloud adoption increase through the integration of software and commodity datacenter hardware offered by public cloud vendors. The rise of SDDCs will probably also be fueled by the need for businesses to become more agile in terms of IT solutions that satisfy business growth and continuity.

Read more


Why 2017 will belong to open source

| CIO News

A few years ago, open source was the less-glamorous and low-cost alternative in the enterprise world, and no one would have taken the trouble to predict what its future could look like. Fast-forward to 2016, many of us will be amazed by how open source has become the de facto standard for nearly everything inside an enterprise. Open source today is the primary engine for innovation and business transformation. Cost is probably the last reason for an organisation to go in for open source.

An exclusive market study conducted by North Bridge and Black Duck brought some fascinating statistics a few months ago. In the study titled “Future of Open Source”, about 90% of surveyed organisations said that open source improves efficiency, interoperability and innovation. What is even more significant is the finding that the adoption of open source for production environments outpaced the proprietary software for the first time – more than 55% leverage OSS for production infrastructure.

OpenStack will rule the cloud world
OpenStack has already made its presence felt as an enterprise-class framework for the cloud. An independent study, commissioned by SUSE, reveals that 81 percent of senior IT professionals are planning to move or are already moving to OpenStack Private Cloud. What is more, the most innovative businesses and many Fortune 100 businesses have already adopted OpenStack for their production environment.

As cloud becomes the foundation on which your future business will run, OpenStack gains the upper hand with its flexibility, agility, performance and efficiency. Significant cost reduction is another major consideration for organisations, especially large enterprises: a proprietary cloud platform is excessively expensive to build and maintain, while OpenStack operations deliver baseline cost reductions. In addition, data reduction in an OpenStack deployment can further reduce operating costs.

Open source to be at the core of digital transformation
Digital transformation is, in fact, one of the biggest headaches for CIOs because of its sheer heterogeneous and all-pervading nature. With the data at the center of digital transformation, it is often impossible for CIOs to ensure that the information that percolates down is insightful and secure at the same time. They need a platform which is scalable, flexible, allows innovations and is quick enough to turn around. This is exactly what Open Source promises. Not just that, with the current heterogeneous environments that exist in enterprises, interoperability is going to be the most critical factor.

Technologies like Internet of Things (IoT) and SMAC (social, mobile, analytics and cloud) will make data more valuable and voluminous. The diversity of devices and standards that will emerge will make open source a great fit for enterprises to truly leverage these trends. It is surprising to know that almost all ‘digital enterprises’ in the world are already using open source platforms and tools to a great extent. The pace of innovation that open source communities can bring to the table is unprecedented.

Open source-defined data centers
A recent research paper from IDC states that 85 percent of the surveyed enterprises globally consider open source to be the realistic or preferred solution for migrating to software-defined infrastructure (SDI). IDC also recommends avoiding vendor lock-in by deploying open source solutions. Interestingly, many organisations seem to have already understood the benefits of open source clearly, with Linux adoption in the data centers growing steadily at a pace of 15-20%.

The key drivers of SDI – efficiency, scalability and reliability at minimal investment – can be achieved only with the adoption of open source platforms. Open source helps enterprises be more agile in building, deploying and maintaining applications. In the coming days, open source adoption is going to be essential for achieving true ‘zero-downtime’ in software-defined infrastructure.

Open source will have an especially large role to play in the software-defined storage (SDS) space. It will help organisations overcome the current challenges associated with SDS. Open SDS solutions can scale infinitely without a need to refresh the entire platform or disrupt the existing functioning environment.

Data reduction can easily be added to SDS or OS environments with Permabit VDO. This simple plug-and-play approach, enabling 2X or more storage reduction, adds to the already efficient operations of open source deployments.

Open source to be decisive in enterprise DevOps journey
Today, software and applications have a direct impact on business success and performance. As a result, development, testing, delivery and maintenance of applications have become crucial. In the customer-driven economy, it is imperative for organisations to adopt DevOps and containerisation technologies to accelerate release cycles and improve the quality of applications.

Often, enterprises struggle to get the most out of the DevOps model. The investment associated with replicating production environments for testing apps is not negligible. They also struggle to ensure that existing systems are not disturbed while running a testing environment within containers.

Industry analysts believe that microservices running in Docker-like containers on an open, scalable cloud infrastructure are the future of applications. OpenStack-based cloud infrastructures are going to be an absolute necessity for a successful enterprise DevOps journey. Flexibility and interoperability apart, the open cloud allows the DevOps team to reuse the same infrastructure as and when containers are created.

In 2017, it is expected to see open source becoming the first preference for organisations that are at the forefront of innovation.

Read more


Cloud IT Spending to Edge Out Traditional Data Centers by 2020

| Datamation

The IT solutions market for cloud providers has nowhere to go but up.

A new forecast from IDC predicts that cloud IT infrastructure spending on servers, storage and network switches will jump 18.2 percent this year to reach $44.2 billion. Public clouds will generate 61 percent of that amount and off-premises private clouds will account for nearly 15 percent.

IDC research director Natalya Yezhkova said in a statement that over the next few quarters, “growth in spending on cloud IT infrastructure will be driven by investments done by new hyperscale data centers opening across the globe and increasing activity of tier-two and regional service providers.”

Additionally, businesses are also growing more adept at floating their own private clouds, she said. “Another significant boost to overall spending on cloud IT infrastructure will be coming from on-premises private cloud deployments as end users continue gaining knowledge and experience in setting up and managing cloud IT within their own data centers.”

Despite a 3 percent decline in spending on non-cloud IT infrastructure during 2017, the segment will still command the majority (57 percent) of all revenues. By 2020, however, the tables will turn.

Combined, the public and private data center infrastructure segments will reach a major tipping point in 2020, accounting for nearly 53 percent of the market, compared to just over 47 percent for traditional data center gear. Public cloud operators and private cloud environments will drive $48.1 billion in IT infrastructure sales by that year.

Indeed, the cloud computing market is growing by leaps and bounds.

The shifting sands are both predictable and evolutionary. Dominant data center spending has been platform-specific and somewhat captive. As public cloud providers demonstrated, efficient data center operations can be built with white box platforms and high-performance open-source software stacks that minimize costs and eliminate software bloat. Corporate IT professionals didn’t miss this evolution and have begun developing similar IT infrastructures. They are sourcing white box platforms, which are much less costly than branded platforms, and combining them with open-source software, including operating systems and software-defined storage with data reduction that drives down storage consumption. The result is a more efficient data center, with less costly hardware and open-source software that drives down acquisition and operating costs.

The shift is occurring, and the equilibrium between public and private clouds will change – not just because of hardware, but increasingly because of open-source software and the economic impact it has on building high-density data centers that run more efficiently than branded platforms.

Read more