Deploying VDO Data Reduction on Red Hat Atomic Host

All of the buzz about containers is a bit surprising to many people who’ve watched operating system technology evolve over the years. After all, many of the core concepts behind running isolated applications on a shared OS have been around on UNIX for over 20 years. So what’s so exciting? Well, to understand the container revolution you first have to look at Virtual Machines (VMs) and their impact on the…

Read more

Data Center Optimization: How to Do More Without More Money

| Data Center Knowledge

Data centers are pushing the boundaries of the possible, using new paradigms to operate efficiently in an environment that continually demands more power, more storage, more compute capacity… more everything. Operating efficiently and effectively in the land of “more” without more money requires increased data center optimization at all levels, including hardware and software, and even policies and procedures.

Although cloud computing, virtualization and hosted data centers are popular, most organizations still have at least part of their compute capacity in-house. According to a 451 Research survey of 1,200 IT professionals, 83 percent of North American enterprises maintain their own data centers. Only 17 percent have moved all IT operations to the cloud, and 49 percent use a hybrid model that integrates cloud or colocation hosts into their data center operations.

The same study says most data center budgets have remained stable, although the heavily regulated healthcare and finance sectors are increasing funding throughout data center operations. Among enterprises with growing budgets, most are investing in upgrades or retrofits to enable data center optimization and to support increased density.

As server density increases and the data center footprint shrinks, any gains may be taken up by the additional air handling and power equipment, including uninterruptible power supplies and power generators. In fact, data center energy usage is expected to increase by 81 percent by 2020, according to CIO magazine.

Identifying and decommissioning unused servers during a data center optimization project is often a challenge, along with right-sizing provisioning.

Virtualization makes it easy to spin up resources as needed, but it also makes tracking those resources harder. The result is that unused servers may be running because no one is certain they’re not being used. A study by the Natural Resources Defense Council and Anthesis reports that up to 30 percent of servers are unused, but still running.

A similar principle extends to storage. While data deduplication (removing duplicate files) is widely used, over-crowded storage remains an issue for small to medium-sized enterprises (SMEs). Deduplication can free much-needed storage space. For example, data deduplication combined with compression can shrink data storage consumption by up to 85%. This not only addresses the budget issues mentioned above but also improves data density, much like the server density mentioned earlier. Imagine saving money with less storage while increasing your data density at the same time. Looks like a win-win!
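
As an aside, reduction ratios and percent savings are two views of the same number. The tiny Python sketch below (illustrative only, not from the article) shows why an 85% reduction corresponds to roughly a 6.7:1 ratio, and 6:1 to about 83%:

```python
# Illustrative only: convert an N:1 data reduction ratio into the
# percent of storage capacity saved (6:1 ~ 83%, 6.7:1 ~ 85%).

def percent_savings(reduction_ratio):
    """Percent capacity saved at an N:1 reduction ratio."""
    return (1.0 - 1.0 / reduction_ratio) * 100.0

for ratio in (2.0, 4.0, 6.0, 6.7):
    print(f"{ratio:.1f}:1 reduction -> {percent_savings(ratio):.0f}% less storage")
```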

If data center optimization is concerned with saving money, managers also should examine their purchasing programs. NaviSite, for example, found cost efficiencies in volume purchases of large commodity items like cabinets, racks, cabling and plug strips, eliminating middlemen whenever possible. For big purchases, it goes directly to manufacturers in China and seeks out innovative young technology vendors, working with them to design specifications that significantly lower the price.

Data center optimization, clearly, extends beyond hardware to become a system-wide activity. It is the key to providing more power, more capacity and more storage without requiring more money.

* This article is quite long; to read the full source article, click on the link below:

Read more

Global Cloud Storage Market to Grow at a 25% CAGR Through 2023

| biotech.einnews.com

In this rapidly changing world of technology, the cloud storage market is gaining immense popularity owing to its ability to integrate easily with an enterprise’s existing infrastructure. Cloud storage gateway solutions provide features like encryption, while data reduction technology with compression and data deduplication adds cost reduction and security to the data. Data reduction also allows rapid transfer of data to the cloud, since the data is reduced and network traffic minimized.

Compared to other regions, the cloud storage market in North America is expected to see significantly healthy growth and to account for the highest market share throughout the forecast period. The U.S. and Canada are anticipated to drive that growth owing to the large number of established cloud storage solution providers in the region. In addition, the region has well-established infrastructure and higher internet penetration. Moreover, increasing adoption of cloud storage by small and medium enterprises is expected to be a major factor in the growth of the cloud storage market.

The cloud storage market is growing rapidly at a CAGR of over 25% and is expected to reach approximately USD 104 billion by the end of the forecast period.

Read more

Application consistency for enterprise multi-cloud and data reduction

| Networking information, news and tips

The cloud era has arrived in a big way as businesses of all sizes are looking to increase their level of IT agility. But when it comes to cloud, one size certainly does not fit all. Businesses have a wide range of options, including the use of a private cloud and a wide range of public cloud providers. My research finds that 82% of businesses will operate a hybrid cloud environment in the next five years.

This is consistent with the findings of F5’s State of Application Delivery in 2017 report, which the company released the results of last week at its annual EMEA Agility Conference in Barcelona. The study polled approximately 2,220 customers across the globe about their plans for the cloud and the challenges they face. Some interesting statistics from the survey that will affect a multi-cloud strategy are as follows:

  • 80% are committed to multi-cloud architectures
  • 20% will have more than half of their applications running in a public and/or private cloud this year
  • 34% of organizations lack the skills necessary to secure the cloud
  • 23% lack other skills specific to cloud
  • Organizations will deploy an average of 14 application services necessary to optimize and secure cloud services, with the top five being network firewall, antivirus, SSL VPN, load balancing and spam mitigation

In addition to the above challenges, the cost of multi-cloud deployment when using it for data replication can become excessive as can the network costs. The addition of data reduction technology to the IT stack can mitigate these costs by as much as 85%. See Data Reduction Reduces the Cost of Cloud Deployment for more specifics.

Read more

Data Reduction Technologies Reduce the Costs of Cloud Deployment

| Stock Market

Enterprise IT organizations use cloud architectures to rapidly deploy resources and lower costs. By incorporating data reduction technologies in their architectures, organizations will accelerate deployment and reduce IT expenditures, say experts at Permabit Technology Corporation.

Data reduction is ideal for today’s enterprises choosing cloud-based deployments. With data reduction, organizations increase their agility and reduce costs, since the technology shrinks the footprint of data in transit and at rest. Because data reduction is deployed at the operating system level, it can be used with public cloud services or deployed in a company’s own private cloud.

“Organizations are under pressure to deliver digital transformation while reducing IT costs and are looking more and more to cloud as an answer,” said Tom Cook, Permabit CEO. “Our Virtual Data Optimizer (VDO) is the best and easiest way to deploy data reduction in every cloud deployment model.”

Permabit VDO provides the three key data reduction technologies needed to maximize storage savings: thin provisioning, data deduplication and compression. Implemented as a driver for the Linux device mapper, VDO operates at the same level in the Linux kernel as core data management services such as virtualization, data protection and encryption. VDO data reduction “just works” regardless of whether the storage layers above are providing object-, block-, compute- or file-based access.

Read more

Cloud Economics Drive the IT Infrastructure of Tomorrow

| CloudPost

The cloud continues to dominate IT as businesses make their infrastructure decisions based on cost and agility. Public cloud, where shared infrastructure is paid for and utilized only when needed, is the most popular model today. However, more and more organizations are addressing security concerns by creating their own private clouds. As businesses deploy private cloud infrastructure, they are adopting techniques used in the public cloud to control costs. Gone are the traditional arrays and network switches of the past, replaced with software-defined data centers running on industry standard servers.

Features which improve efficiency make the cloud model more effective by reducing costs and increasing data transfer speeds. One such feature, which is particularly effective in cloud environments, is inline data reduction. This is a technology that can be used to lower the costs of data in transit and at rest. In fact, data reduction delivers unique benefits to each model of cloud deployment.

For the entire article, please click on the link below:

Read more

Permabit VDO on a Linux Laptop – Great Performance and 5:1 Space Savings

I get asked about VDO performance all the time, and I’ve written several posts about big systems where we’ve seen spectacular performance numbers, including 8 GB/s throughput and 650,000 mixed random 4 KB IOPS. But what about performance on smaller systems for developers? How about a laptop? A couple of weeks ago I installed VDO version 6 on my Lenovo X230 laptop running Red Hat Enterprise Linux 7.3 and here’s a…

Read more

Is the storage array on the endangered species list?

A high-stakes game is playing out today as Amazon, Google, and Microsoft compete for leadership in cloud services markets that are projected to total in the hundreds of billions of dollars by 2020. In the last quarter alone, they spent a combined $9B to build out data centers to support the exploding cloud market (WSJ, 4/7/17). There is little question whether this triumvirate will be successful in their cloud efforts…

Read more

Cloud Economics Drive the IT Infrastructure of Tomorrow

| Welcome to Disaster Recovery Journal

The cloud continues to dominate IT as businesses make their infrastructure decisions based on cost and agility. Public cloud, where shared infrastructure is paid for and utilized only when needed, is the most popular model today. However, more and more organizations are addressing security concerns by creating their own private clouds. As businesses deploy private cloud infrastructure, they are adopting techniques used in the public cloud to control costs. Gone are the traditional arrays and network switches of the past, replaced with software-defined data centers running on industry standard servers.

Efficiency features make the cloud model more effective by reducing costs and increasing data transfer speeds. One such feature, which is particularly effective in cloud environments, is inline data reduction. This is a technology that can be used to lower the costs of data in flight and at rest. In fact, data reduction delivers unique benefits to each of the cloud deployment models.

Public Clouds

The public cloud’s raison d’être is its ability to deliver IT business agility, deployment flexibility and elasticity. As a result, new workloads are increasingly deployed in public clouds. Worldwide public IT cloud service revenue in 2018 is predicted to be $127B.

Data reduction technology minimizes public cloud costs. For example, deduplication and compression typically cut capacity requirements of block storage in enterprise public cloud deployments by up to 6:1.  These savings are realized in reduced storage consumption and operating costs in public cloud deployments.   

Consider AWS costs when employing data reduction:

If you provision 300 TB of EBS General Purpose SSD (gp2) storage for 12 hours per day over a 30-day month in a region that charges $0.10 per GB-month, you would be charged $15,000 for the storage.

With data reduction, that monthly cost of $15,000 would be reduced to $2,500. Over a 12-month period, you would save $150,000. Capacity planning is also a simpler problem when the data is 1/6th its former size. Bottom line: data reduction increases agility and reduces the costs of public clouds.
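
The arithmetic is easy to check. Below is a minimal Python sketch of the EBS example; the capacity, rate and 6:1 reduction ratio are the figures above, while the helper function itself is purely illustrative, not an official AWS pricing calculator:

```python
# Back-of-the-envelope check of the EBS example above. The rate,
# capacity, and 6:1 reduction ratio come from the article; this is
# an illustrative sketch, not an official AWS pricing tool.

GB_PER_TB = 1000  # EBS is billed per GB-month

def ebs_monthly_cost(tb, rate_per_gb_month, hours_per_day=24.0):
    """Prorated GB-month cost for capacity provisioned part of each day."""
    return tb * GB_PER_TB * rate_per_gb_month * (hours_per_day / 24.0)

raw = ebs_monthly_cost(300, 0.10, hours_per_day=12)   # $15,000/month
reduced = raw / 6                                     # $2,500/month at 6:1
print(f"monthly: ${raw:,.0f} -> ${reduced:,.0f}; "
      f"annual savings: ${(raw - reduced) * 12:,.0f}")  # $150,000
```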

One data reduction application that can readily be applied in the public cloud is Permabit’s Virtual Data Optimizer (VDO), a pre-packaged software solution that installs and deploys in minutes on Red Hat Enterprise Linux and Ubuntu LTS Linux distributions. To deploy VDO in Amazon AWS, the administrator provisions Elastic Block Store (EBS) volumes, installs the VDO package into their VMs and applies VDO to the block devices representing their EBS volumes. Since VDO is implemented in the Linux device mapper, it is transparent to the applications installed above it.
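
For readers who want to see what those steps look like in practice, here is a hedged sketch that drives the vdo manager utility shipped with VDO on RHEL from Python. The device path, volume name, logical size and mount point are placeholders for illustration, not values from the article:

```python
# A sketch of the VDO-on-EBS steps described above, assuming the
# `vdo` manager utility that ships with VDO on RHEL. The attached
# EBS device path, volume name, sizes, and mount point are all
# placeholders. Run as root on the instance.

import subprocess

def run(cmd):
    """Echo and execute a command, raising on failure."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

EBS_DEVICE = "/dev/xvdf"     # attached EBS volume (placeholder)
VDO_NAME = "vdo_ebs0"        # VDO volume name (placeholder)

# Create the VDO volume on the raw EBS block device. The logical size
# can exceed physical capacity to take advantage of data reduction.
run(["vdo", "create", f"--name={VDO_NAME}",
     f"--device={EBS_DEVICE}", "--vdoLogicalSize=1800G"])

# Applications then see an ordinary block device at /dev/mapper/<name>,
# which is what makes VDO transparent to the layers above it.
run(["mkfs.xfs", "-K", f"/dev/mapper/{VDO_NAME}"])
run(["mkdir", "-p", "/mnt/vdo"])
run(["mount", f"/dev/mapper/{VDO_NAME}", "/mnt/vdo"])
```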

As data is written out to block storage volumes, VDO applies three reduction techniques:

  1. Zero-block elimination uses pattern matching techniques to eliminate 4 KB zero blocks

  2. Inline Deduplication eliminates 4 KB duplicate blocks

  3. HIOPS Compression™ compresses remaining blocks 
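
Conceptually, the pipeline can be modeled in a few lines of Python. The sketch below is an illustration of the three stages only, not VDO’s actual implementation, which performs these steps inline in the kernel:

```python
# Conceptual model of the three-stage reduction pipeline above,
# applied to a stream of 4 KB blocks: drop zero blocks, deduplicate
# by content hash, then compress what remains. Illustrative only.

import hashlib
import zlib

BLOCK_SIZE = 4096
ZERO_BLOCK = b"\x00" * BLOCK_SIZE

def reduce_blocks(blocks):
    """Yield (kind, payload) records: 'zero', 'dup', or 'compressed'."""
    seen = {}  # content hash -> index of first occurrence
    for i, block in enumerate(blocks):
        if block == ZERO_BLOCK:                         # 1. zero-block elimination
            yield ("zero", None)
        elif (digest := hashlib.sha256(block).digest()) in seen:
            yield ("dup", seen[digest])                 # 2. inline deduplication
        else:
            seen[digest] = i
            yield ("compressed", zlib.compress(block))  # 3. compression

# Three input blocks -> one zero record, one compressed, one duplicate.
blocks = [ZERO_BLOCK, b"a" * BLOCK_SIZE, b"a" * BLOCK_SIZE]
for kind, payload in reduce_blocks(blocks):
    print(kind, len(payload) if isinstance(payload, bytes) else payload)
```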

This approach results in remarkable 6:1 data reduction rates across a wide range of data sets. 

Private Cloud

Organizations see similar benefits when they deploy data reduction in their private cloud environments. Private cloud deployments are selected over public because they offer the flexibility of the public cloud model while keeping privacy and security under the organization’s own control. IDC predicts $17.2B in infrastructure spending for private cloud in 2017, including on-premises and hosted private clouds.

One problem that data reduction addresses for the private cloud is the double whammy of hardware infrastructure costs plus annual software licensing costs. For example, Software Defined Storage (SDS) solutions are typically licensed by capacity, so their costs are directly proportional to hardware infrastructure storage expenses. Data reduction decreases storage costs because it reduces storage capacity consumption. For example, deduplication and compression typically cut capacity requirements of block storage in enterprise deployments by up to 6:1, or approximately 85%.

Consider a private cloud configuration with a 1 PB deployment of storage infrastructure and SDS. Assuming a current hardware cost of $500 per TB for commodity server-based storage infrastructure with datacenter-class SSDs and a cost of $56,000 per 512 TB for the SDS component, users would pay $612,000 in the first year. Because software subscriptions are annual, over three years you would spend $836,000 for 1 PB of storage, and over five years, $1,060,000.

By comparison, the same configuration with 6:1 data reduction would cost $176,667 for hardware and software over five years, resulting in $883,333 in savings. And that’s not including the additional substantial savings in power, cooling and space. As businesses develop private cloud deployments, they should be sure to include data reduction capabilities, because the cost savings are compelling.
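
The cost model can be reproduced in a few lines. The sketch below follows the article’s own simplifications (SDS licensed in 512 TB units for the raw configuration; total cost scaled linearly at 6:1) and also anticipates the hybrid cloud multiplication discussed later:

```python
# A sketch reproducing the article's private cloud TCO arithmetic.
# All prices come from the article; the modeling shortcuts (whole
# 512 TB SDS licenses for the raw case, reduced cost as total / 6)
# mirror the article's own math and are illustrative only.

HW_PER_TB = 500                         # $/TB, commodity servers + SSDs
SDS_PER_512TB_YEAR = 56_000             # $/year per 512 TB SDS license

raw_hw = 1000 * HW_PER_TB               # 1 PB of hardware: $500,000
raw_sds_year = 2 * SDS_PER_512TB_YEAR   # two licenses: $112,000/year
raw_5yr = raw_hw + 5 * raw_sds_year     # $1,060,000 over five years

reduced_5yr = raw_5yr / 6               # ~$176,667 at 6:1 reduction
savings = raw_5yr - reduced_5yr         # ~$883,333

# The hybrid cloud section below multiplies this by the number of
# replicated copies; three clouds triple the savings (~$2.65M).
print(f"5-year savings: ${savings:,.0f}; with 3 replicas: ${savings * 3:,.0f}")
```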

When implementing private cloud on Linux, the easiest way to include data reduction is with Permabit Virtual Data Optimizer (VDO). VDO operates in the Linux kernel as one of many core data management services. It is a device mapper target driver, transparent to persistent and ephemeral storage services, regardless of whether the storage layers above are providing object-, block-, compute- or file-based access.

VDO – Seamless and Transparent Data Reduction

The same transparency applies to the applications running above the storage service level. Customers using VDO today realize savings up to 6:1 across a wide range of use cases.

Some workflows that benefit heavily from data reduction include:

  • Logging: messaging, events, system and application logs

  • Monitoring: alerting, and tracing systems

  • Database: databases with textual content, NoSQL approaches such as MongoDB and Hadoop

  • User Data: home directories, development build environments

  • Virtualization and containers: virtual server, VDI, and container system image storage

  • Live system backups: used for rapid disaster recovery

With data reduction, cumulative cost savings can be achieved across a wide range of use cases, which is what makes data reduction so attractive for private cloud deployments.

Reducing Hybrid Cloud’s Highly Redundant Data

Storage is at the foundation of cloud services, and almost universally, data in the cloud must be replicated for data safety. Hybrid cloud architectures that combine on-premises resources (private cloud) with colocation, private and multiple public clouds result in highly redundant data environments. IDC’s FutureScape report finds “Over 80% of enterprise IT organizations will commit to hybrid cloud architectures, encompassing multiple public cloud services, as well as private clouds by the end of 2017.” (IDC 259840)

Depending on a single cloud storage provider for storage services can put SLA targets at risk. Consider the widespread AWS S3 storage errors that occurred on February 28th, 2017, when data was not available to clients for several hours. Because of that loss of data access, businesses may have lost millions of dollars in revenue. As a result, more enterprises today are pursuing a “Cloud of Clouds” approach where data is redundantly distributed across multiple clouds for data safety and accessibility. Unfortunately, because of the data redundancy, this approach increases storage capacity consumption and cost.

That’s where data reduction comes in. In hybrid cloud deployments where data is replicated to the participating clouds, data reduction multiplies capacity and cost savings. If 3 copies of the data are kept in 3 different clouds, 3 times as much is saved. Take the private cloud example above where data reduction drove down the costs of a 1 PB deployment to $176,667, resulting in $883,333 in savings over five years. If that PB is replicated in 3 different clouds, the savings would be multiplied by 3 for a total savings of $2,649,999.

Permabit’s Virtual Data Optimizer (VDO) provides the perfect solution to address the multi-site storage capacity and bandwidth challenges faced in hybrid cloud environments. Its advanced data reduction capabilities have the same impact on bandwidth consumption as they do on storage, translating to a 6X reduction in network bandwidth consumption and associated cost. Because VDO operates at the device level, it can sit above block-level replication products to optimize data before it is written out and replicated.

Summary

IT professionals are finding that the future of IT infrastructure lies in the cloud. Data reduction technologies enable clouds – public, private and hybrid – to deliver on their promise of safety, agility and elasticity at the lowest possible cost, making cloud the deployment model of choice for IT infrastructure going forward.

Read more

Cloud Economics Drive the IT Infrastructure of Tomorrow

| ITBusinessNet.com

The cloud continues to dominate IT as businesses make their infrastructure decisions based on cost and agility. Public cloud, where shared infrastructure is paid for and utilized only when needed, is the most popular model today. However, more and more organizations are addressing security concerns by creating their own private clouds. As businesses deploy private cloud infrastructure, they are adopting techniques used in the public cloud to control costs. Gone are the traditional arrays and network switches of the past, replaced with software-defined data centers running on industry standard servers.

Efficiency features make the cloud model more effective by reducing costs and increasing data transfer speeds. One such feature, which is particularly effective in cloud environments, is inline data reduction. This is a technology that can be used to lower the costs of data in flight and at rest. In fact, data reduction delivers unique benefits to each of the cloud deployment models.

Public Clouds

The public cloud’s raison d’être is its ability to deliver IT business agility, deployment flexibility and elasticity. As a result, new workloads are increasingly deployed in public clouds. Worldwide public IT cloud service revenue in 2018 is predicted to be $127B.

Data reduction technology minimizes public cloud costs. For example, deduplication and compression typically cut capacity requirements of block storage in enterprise public cloud deployments by up to 6:1.  These savings are realized in reduced storage consumption and operating costs in public cloud deployments.

Consider AWS costs when employing data reduction:

If you provision 300 TB of EBS General Purpose SSD (gp2) storage for 12 hours per day over a 30-day month in a region that charges $0.10 per GB-month, you would be charged $15,000 for the storage.

With data reduction, that monthly cost of $15,000 would be reduced to $2,500. Over a 12-month period, you would save $150,000. Capacity planning is also a simpler problem when the data is 1/6th its former size. Bottom line: data reduction increases agility and reduces the costs of public clouds.

One data reduction application that can readily be applied in the public cloud is Permabit’s Virtual Data Optimizer (VDO), a pre-packaged software solution that installs and deploys in minutes on Red Hat Enterprise Linux and Ubuntu LTS Linux distributions. To deploy VDO in Amazon AWS, the administrator provisions Elastic Block Store (EBS) volumes, installs the VDO package into their VMs and applies VDO to the block devices representing their EBS volumes. Since VDO is implemented in the Linux device mapper, it is transparent to the applications installed above it.

To read the complete article, click on the link below:

Read more