Data Efficiency in the News


SUSE acquires OpenStack IaaS and Cloud Foundry PaaS assets from HPE

| Geekzone

SUSE has entered into an agreement with Hewlett Packard Enterprise (HPE) to acquire technology and talent that will expand SUSE’s OpenStack Infrastructure-as-a-Service (IaaS) solution and accelerate SUSE’s entry into the growing Cloud Foundry Platform-as-a-Service (PaaS) market.

The acquired OpenStack assets will be integrated into SUSE OpenStack Cloud, and the acquired Cloud Foundry and PaaS assets will enable SUSE to bring to market a certified, enterprise-ready SUSE Cloud Foundry PaaS solution for all customers and partners in the SUSE ecosystem. The agreement includes HPE naming SUSE as its preferred open source partner for Linux, OpenStack and Cloud Foundry solutions. In addition, SUSE has increased engagement with the Cloud Foundry Foundation, becoming a platinum member and taking a seat on the Cloud Foundry Foundation board. 

“The driving force behind this acquisition is SUSE’s commitment to providing open source software-defined infrastructure technologies that deliver enterprise value for our customers and partners,” said Nils Brauckmann, CEO of SUSE. “This also demonstrates how we’re building our business through a combination of organic growth and technology acquisition. Once again, this strategy sends a strong message to the market and the worldwide open source community that SUSE is a company on the move.”

Ashish Nadkarni, program director, Computing Platforms, for IDC, said, “This expanded partnership and transfer of technology assets has the potential to be a real win-win for SUSE and HPE, as well as customers of both companies. SUSE has proven time and again it can successfully work with its technology partners to help organizations glean maximum benefit from their investments in open source. SUSE is positioning itself very well as a high-growth company with the resources it needs to compete in key market segments.”

As part of the transaction, HPE has named SUSE as its preferred open source partner for Linux, OpenStack IaaS and Cloud Foundry PaaS. HPE’s choice of SUSE as its preferred open source partner further cements SUSE’s reputation for delivering high-quality, enterprise-grade open source solutions and services.

Abby Kearns, executive director of the Cloud Foundry Foundation, said, “SUSE has been a powerful player in the enterprise open source world for more than two decades, and I’m excited to see the impact that a SUSE Cloud Foundry distribution will have for enterprises and developers around the world. SUSE’s strategic vision for the convergence of Platform-as-a-Service and Container-as-a-Service technologies will also be a welcome addition to the strategic dialogue we have within the Cloud Foundry Foundation community.”

Read more


HPE core servers and storage under pressure

| The Register

HPE’s latest results show a company emerging slimmer and fitter through diet (cost-cutting) and exercise (spin-merger deals) but facing tougher markets in servers and storage – the new normal, as CEO Meg Whitman says.

A look at the numbers and the earnings call from the servers and storage points of view shows a company with work to do.

The server business saw revenue of $3.5bn in the quarter, down 7 per cent year-on-year and up 5 per cent quarter-on-quarter. High-performance compute (Apollo) and SGI servers did well. Hyper-converged is growing and has more margin than the core ISS (Industry Standard Servers). Synergy and mission critical systems also did well.

But the servers business was affected by strong pressure on the core ISS ProLiant racks, a little in the blade server business, and also low or no profitability selling Cloudline servers, the ones for cloud service providers and hyperscale customers.

In the earnings call, Meg Whitman discussed the ISS business, saying: “Other parts of the server business are doing really well. And I think that core ISS rack deterioration has a number of different things. One is in part our execution in the channel and pricing and things like that. And the second is the move to the public cloud.”

She also mentioned that there was increased competition from Huawei in servers.

Her answer is: “We need to shore up core ISS racks with improvements in the channel, improvements in quote to cash, and focus – more focus on the distributors and VARs for the volume-related ISS rack business.”

She thinks the ISS business can grow 1-2 per cent if this is done and because profitable gear like storage gets attached to these servers, HPE is “gaining share profits in this business”.


Although HPE’s CEO said hyper-converged was doing well, there is some way to go. Gartner ranks HPE as the leader in the hyper-converged and integrated systems magic quadrant, with EMC second and Nutanix third.

The analysis company’s researchers said: “Hewlett Packard Enterprise offers multiple converged, hyper-converged, reference architectures and point systems of various design points. But as the volume market leader in many segments (including blade and rack servers), it is only logical that HPE should be a leading vendor in this market.”

Since the Gartner quadrant spans integrated systems broadly while Nutanix is a pure hyper-converged player, HPE is not a leader in hyper-converged systems alone with its HC 380. The Gartnerites point out that “HPE is a relative late starter in HCIS and is frequently absent from competitive hyper-convergence evaluations versus more established vendors.”

An August Forrester Wave report on hyper-converged systems put HPE in eighth position. Forrester’s researchers said: “HPE’s product is in its early stages, and… its position in the HCI segment should improve quickly over time.”

Nothing was said in the call about any merger or acquisition in this area. There have been rumours about HPE and SimpliVity getting together.


In the all-flash array (AFA) business, HPE grew 3PAR AFA revenues 100 per cent year-on-year to a $750m annual run rate, which compares with NetApp at $1bn and Pure at $631m. Our sense is that Dell-EMC leads this market, followed by NetApp, then HPE, with Pure in fourth place.

Whitman said: “All-flash now makes up 50 per cent of our 3PAR portfolio and interestingly still only comprises 10 per cent of the data centre. So we see more running room in our all-flash business. And… we’re introducing new deduplication technology that should provide some further uplift in all-flash array, because there has been a gap in our portfolio.”

Comparing HPE to other AFA suppliers we see Dell EMC with five AFA products: XtremIO, DSSD, all-flash VMAX and Unity, and an all-flash Isilon product. NetApp has three: EF series, SolidFire, and all-flash FAS. Pure has its FlashArray and is developing FlashBlade. HPE has the single all-flash 3PAR product. This looks to be insufficient to cover the developing AFA use cases such as high-speed analytics, scale-out cloud service provision and file access.

There is no sense from HPE that it recognises this as a problem area. We see here a reflection of a view that HPE has a proliferation of server products and a relative scarcity of successful storage products. Historically in HPE, server and storage business units have followed separate paths. With them both inside Antonio Neri’s Enterprise Systems organisation, any such separateness should diminish.


Read more


Hewlett-Packard revenues shrink, just like the company

| MarketWatch

The separation that split Hewlett-Packard Co. into two smaller companies a year ago has done nothing to turn either company into the nimbler, faster growing entities Meg Whitman hoped for.

That was clear on Tuesday, when both Hewlett Packard Enterprise Co. (HPE) and HP Inc. (HPQ) reported their fiscal fourth-quarter results. HPE, focused on the corporate computing market selling servers, networking equipment and services, reported a 7% drop in revenue in the fourth quarter, and annual revenue of $50.1 billion was down 4% from fiscal 2015, adjusted for the split. HPE claimed that revenue was up 2% year-over-year, when adjusted for divestitures and currency.

The PC and printing focused company fared slightly better in the fourth quarter, with 2% growth in revenue, based on strong sales of some of the company’s new PC products, but overall for fiscal 2016, revenue fell 6% to $48.2 billion.

So both companies are suffering from shrinking revenue while undergoing massive layoffs and stock buybacks that have placated investors but done little to actually strengthen the companies, even as they force restructuring charges. Still, with the strong stock performance of the past year on their side, HP leaders spoke optimistically of their companies.

For HP Enterprise, that course is to get even smaller, with plans announced earlier this year to spin-merge both its services business and its software business, with CSC and Micro Focus respectively. Again, Whitman is promising that these deals will enable HP Enterprise to be “more nimble, provide cutting-edge solutions, play in higher-growth markets, and have an enhanced financial profile” sometime in 2017, when the deals are both complete.

But both HPs are still in too many legacy, slower-growing businesses to believe growth is around the corner. HP Inc.’s year-over-year 4% gain in overall PC sales, due to more competitive PCs, was overshadowed by the printing business, where sales of supplies (including printer ink) dropped 12%. HP executives pointed out that this decline had abated from a steeper drop of 18% in supplies in the previous third quarter, but slower shrinkage didn’t satisfy investors, and shares were down 2.3% in after-hours trading.

HP Enterprise is still burdened with a legacy server business that declined 6%, even as its high-performance computing business is growing with help on the way from the acquisition of SGI Corp. Whitman has bet on companies eventually moving to hybrid cloud structures, a mixture of on-premises and remote servers, but she admitted Tuesday that HPE is “definitely seeing impact” from potential clients choosing public cloud-computing environments.

If HP Inc. can’t get both its legacy consumer businesses to grow at the same time, and HP Enterprise can’t convince companies to at least partly eschew public cloud services like Amazon Web Services, growth will be nearly impossible for either company. That only leaves Whitman and Weisler with more spinouts, divestitures, layoffs and stock buybacks to distract from their shrinking revenue.

Read more


Microsoft embraces open source in the cloud and on-premises


Microsoft has offered multiple flavors of Linux on its Azure public cloud platform and infrastructure for several years now.

“Microsoft loves Linux,” Microsoft CEO Satya Nadella said during the 2014 announcement of new Azure services. “Twenty percent of Azure is already Linux. We will always have first-class support for Linux [distributions].”

Microsoft took that love another step last week. In a move that would have been stunning more than a decade ago, it joined The Linux Foundation — which sponsors the work of Linux creator Linus Torvalds and plays a central role in the promotion of open source software — as a platinum sponsor.

“As a cloud platform company, we aim to help developers achieve more using the platforms and languages they know,” Scott Guthrie, executive vice president, Microsoft Cloud and Enterprise Group, said in a statement last week. “The Linux Foundation is home not only to Linux, but many of the community’s most innovative open source projects. We are excited to join The Linux Foundation and partner with the community to help developers capitalize on the shift to intelligent cloud and mobile experiences.”

The reason, Microsoft’s Kumar says, is simple: In the messy, real world of enterprise IT, hybrid shops are the norm and customers don’t need or want vendors to force their hands when it comes to operating systems. Serving these customers means giving them flexibility.

That philosophy has spread from Microsoft’s cloud business to its on-premises infrastructure business as the company seeks to make support for hybrid environments a key differentiator of its cloud and on-premises offerings (an idea Nadella pushed as Microsoft’s executive vice president of Cloud and Enterprise before his ascension to CEO). Last week, Joseph Sirosh, corporate vice president of the Data Group at Microsoft, announced that the next release of SQL Server would, for the first time, support Linux.

“Now you can also develop applications with SQL Server on Linux, Docker or macOS (via Docker) and then deploy to Linux, Windows, Docker, on-premises or in the cloud,” Sirosh wrote in a blog post. “This represents a major step in our journey to making SQL Server the platform of choice across operating systems, development languages, data types, on-premises and in the cloud.”

Kumar adds that customers tell Microsoft, “I want to use SQL and don’t care about what’s underneath it. I don’t want to worry about it, I just want to know that whenever I want to install SQL, I have the choice to do that.”

All major features of the SQL Server relational database engine are coming to Linux, Sirosh said, including advanced features such as in-memory online transactional processing (OLTP), in-memory columnstores, Transparent Data Encryption, Always Encrypted and Row-Level Security. There will be native Linux installations with familiar RPM and APT packages for Red Hat Enterprise Linux, Ubuntu Linux and SUSE Linux Enterprise Server. He noted that the public preview of the next release of SQL Server, in both Windows and Linux flavors, will be available on Azure Virtual Machines and as images on Docker Hub.

“I’m excited about Microsoft as a company truly embracing choice,” Kumar says. “We’re clearly seeing the base getting energized in a big way. People are giving us a chance again.”

Read more


Hybrid Cloud Storage Use to Double in Next 12 Months


The use of hybrid cloud storage will accelerate rapidly over the next 12 months, according to research published today by Cloudian, Inc., a provider of cloud-compatible object storage systems.

Across 400 organisations surveyed in the UK and USA, 28% already use hybrid cloud storage, with a further 40% planning to implement within the next year. Only 19% have no plans to adopt.

Organisations are looking to hybrid cloud storage to support a variety of workloads. Backup is the most popular use case, with 64% of respondents reporting deployment or plans to deploy. Web infrastructure (52%), application dev/test (48%) and technical applications (43%) are also driving the adoption of hybrid cloud storage products and services.

The research reveals that larger organisations (2,500 employees or more) are adopting the approach most rapidly, with 82% planning to deploy in the next 12 months.

Decisions about whether to adopt hybrid cloud storage are being driven by multiple factors such as external and internal data governance rules. 59% of respondents report that not all of their data can go to the public cloud, and that more than half of their data must remain on site. Most commonly cited among the data types that must remain on premises are financial data and customer records. Reasons named most commonly are security, governance and compliance rules, driven by both internal policy and external regulation.

When considering a hybrid cloud storage strategy, concerns about interoperability between on-premises and public cloud storage (40%) are only exceeded by those around security (62%) and cost (55%). 76% of respondents moving to hybrid cloud storage have yet to decide which interface to adopt.


Read more


Storage Survey: Cloud Storage Use Up, But Integration Still Sought


Storage is one of the primary drivers of hybrid cloud adoption. According to a survey by object storage system provider Cloudian, 68% of the IT decision makers surveyed in early November have already adopted, or plan to implement within a year, some form of public cloud storage to complement their existing on-premises storage.

The share that have already done so was 28%, with another 40% planning to do so within the next 12 months. The most frequent driver is use of cloud storage for backup purposes — in use by 64% or planned to be pressed into use in the next 12 months. By moving backup data into the cloud, the IT manager gets not just another copy of the data, but one held in a different data center, and thus insulated from a disaster or failure that might strike the manager’s own site.

On the other hand, 13% said they planned to adopt cloud storage but in a timeframe more than 12 months out, while 19% said they had no plans to adopt a cloud storage system to function in a hybrid fashion with an on-premises system.
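The four adoption buckets quoted across the survey write-ups fit together as an exhaustive breakdown, which a quick check confirms (percentages as reported by Cloudian):

```python
# The four adoption buckets reported in the Cloudian survey should
# account for all respondents.
shares = {
    "already use hybrid cloud storage": 28,
    "plan to adopt within 12 months": 40,
    "plan to adopt beyond 12 months": 13,
    "no plans to adopt": 19,
}

assert sum(shares.values()) == 100  # the buckets are exhaustive

# 28% + 40% gives the 68% "adopted or planning within a year" figure
# quoted in the companion article.
print(shares["already use hybrid cloud storage"]
      + shares["plan to adopt within 12 months"])  # -> 68
```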

The survey concluded that organizations are using cloud storage for specific applications and purposes, such as file sharing, email, and data backup. “But they continue to struggle when it comes to more general purpose storage needs as it relates to cloud,” and only 18% are using an on-premises application that leverages cloud storage in some way, the report said.

In addition to data backup, respondents named other uses for a hybrid storage system: Web infrastructure was cited by 52%; application development and test, 48%; technical applications, 43%; media and entertainment, 21%; and medical data, 21%.

Security and cost concerns remain the primary brakes on adopting hybrid cloud storage systems at a faster pace. Sixty-two percent of respondents said security in the cloud was a top concern, with cost coming in second at 55%.

Fifty-nine percent said they had data that could not be migrated to the cloud now or in the future, and that data represented 47% of the total. It included corporate financial data, customer records, research data, email, file shares and other collaboration data.

Other concerns listed included the management complexity of a hybrid storage system, interoperability with other systems, and the skill sets needed to use cloud storage.

When it came to thinking globally about on-premises plus cloud storage, Cloudian found most respondents hadn’t moved beyond the application-by-application approach to the cloud. There was no emerging consensus on what a hybrid system designed to work together for multiple purposes looks like, it acknowledged.

When it asked the respondents who used no cloud storage what interface they would want for a hybrid system, “the respondents simply did not know what they would choose.” That sample was the smallest of the respondent groupings, 25.

Read more


Rackspace Opens First Data Centre In Continental Europe

| Stock Market

Rackspace® has expanded its investment in continental Europe by announcing that it will open a new data centre in Germany. Rackspace’s decision to expand its operations in the region provides customers with a new option for managed IT infrastructure in the face of strict personal data protection laws across Germany, Austria and Switzerland (DACH) territory.

The facility is designed to serve customers who seek managed private clouds and hosting environments, with a focus on fully managed VMware environments.

Rackspace will be working with one of its long-standing partners to build out the infrastructure of the new operation, which is expected to be fully operational in mid-2017.

To address increasing customer demand for managed cloud services, the plan to open the new facility in Frankfurt follows the recent appointment of Alex Fuerst as the leader of Rackspace operations in the DACH region. Alex has hired an in-region team focused on delivering managed private clouds and hosting environments, as well as managed cloud services for customers seeking help with the complexity and cost of managing AWS and Azure. Globally, Rackspace engineers have more than 500 AWS certifications and proven experience with architecting, deploying, operating and optimising both AWS and Azure customer environments. With this team of managed cloud specialists, Rackspace is poised to serve customers in the region with its expertise and broad portfolio of services for multi-cloud environments.

“I am excited to be able to bring this new Rackspace data centre online to serve our fast-expanding German customer base,” said Fuerst, who joined Rackspace in September 2016 after working in IT leadership roles in the DACH region. “We’re experiencing strong demand from DACH-centric customers, as well as U.S. and EMEA-based multinationals who are looking for managed private clouds and hosting environments, along with managed cloud services and expertise for AWS and Azure in continental Europe. This data centre will strengthen our multi-cloud capabilities on the European continent and pave the way for us to achieve our goal of becoming the leading managed cloud provider in Germany, Switzerland and Austria, which is already our third largest international market.”

“With the opening of our data centre in Germany, we can provide the highest level of availability, security, performance and management, and also help our customers address data protection requirements by providing them with multi-cloud deployment options. As the demand for managed services increases in the German-speaking region, companies of all sizes in all verticals are embracing multi-cloud approaches to IT, so that each of their workloads runs on the platform where it can achieve the best performance and cost efficiency,” Fuerst continued. “More and more of those companies are turning to Rackspace expertise and support for their critical IT services and data.”

With the addition of the data centre in Frankfurt, Rackspace will operate 12 data centres worldwide, including in London, Hong Kong, Sydney, Dallas, Chicago and Ashburn (near Washington, D.C.).

Read more


Software-Defined Storage Market Projected to Reach 22.56 Billion USD by 2021

| Stock Market

North America is expected to lead the software-defined storage (SDS) market, as governments in the region have initiated many digitalization projects, making it the largest adopter of SDS solutions. The 167-page report categorizes the global SDS market by solution (software-defined server, data security & compliance, controller, data management, and hypervisor), by services, by usage, by organization size, by application area and by geography.

According to the report “Software-Defined Storage Market by Component [Platforms/Solutions (Software-Defined Server, Data Security & Compliance, Controller, Data Management, and Hypervisor), Services], Usage, Organization Size, Application Area – Global Forecast to 2021”, the global market is expected to grow from USD 4.72 Billion in 2016 to USD 22.56 Billion by 2021, at a Compound Annual Growth Rate (CAGR) of 36.7%.
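The quoted growth rate can be sanity-checked against the two endpoint figures using the standard CAGR formula (the 2016 and 2021 values are from the report above):

```python
# Sanity check: the implied CAGR from the report's 2016 and 2021 figures.
# CAGR = (end / start) ** (1 / years) - 1
start_usd_bn = 4.72   # 2016 market size, USD billions (from the report)
end_usd_bn = 22.56    # 2021 forecast, USD billions (from the report)
years = 5

cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> Implied CAGR: 36.7%
```

The result matches the 36.7% CAGR stated in the report, so the endpoints and growth rate are mutually consistent.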

Exponentially growing data volumes across enterprises, the rise of the “software-defined” concept, and the need for cost optimization in data management are some of the major factors driving the SDS market. Furthermore, the need to avoid storage infrastructure downtime, and a competitive market environment around this innovative technology, are expected to provide opportunities for the growth of the SDS market.

Data security and compliance software is expected to be the largest contributor in the global SDS market during the forecast period

Organizations are mandated to follow compliance policies and guidelines for storing and sharing data while securing business-critical information. The requirement for security and compliance functions in existing SDS solutions has increased the demand for this software, which is expected to contribute the most to overall revenue generation for the SDS market during the forecast period.

The support and maintenance segment is expected to show significant growth rate during the forecast period

The demand for services is increasing significantly along with the growth of the SDS market. Support and maintenance services help organizations get the maximum benefit from their SDS software investment. Customers can get better assistance and maintenance for their SDS solution through various levels of support programs. The market for support and maintenance will keep growing owing to the consistent support required for deploying and utilizing SDS solutions.

Additionally, we are seeing data reduction solutions added to SDS that enable them to become extremely efficient in data storage use while improving data density and optimizing data center footprint.

Read more


Global Data Center Storage Market Is Expected to Reach US$ 29 Billion by 2021

| SBWire

The growth of the data center storage market worldwide is notable because the market is not only growing but also witnessing a paradigm shift across segments and sub-segments. Increasing internet penetration and the use of multiple devices for professional and personal purposes have led to exponential data growth. Regulatory requirements for enterprises to retain data for longer periods have further enlarged data center storage estates. Vendors are innovating to address storage challenges such as security, device power consumption, processing, scalability, and compatibility among different vendors’ data center equipment.

The analysts at Publisher have analysed growth in both demand and supply in the data center storage market. They expect the data center storage market to reach almost US$48.6 billion in 2021, at a CAGR of 14.4%. Most of the growth is expected to come from the SAN segment, followed by NAS. The report also covers a detailed analysis of the market size and growth forecast of the SAN and NAS segments.

Data Center Storage Market – Trends, Drivers and Restraints
The most prominent emerging trend in the data center is the use of flash storage, also known as SSD storage. The market will see adoption of both flash and traditional storage systems, with flash likely to add more revenue in the coming years. Each type of storage system has its own advantages and disadvantages when it comes to implementation. Vendors in the storage market are offering hybrid SAN-NAS solutions, through which businesses can merge block- and file-based data onto a common array. Increased use of software-defined data center (SDDC) storage and the adoption of combined SAN-NAS systems are among the emerging trends detailed in the report. In addition, the emergence of ubiquitous data reduction technologies will make the data center denser and more efficient.

Furthermore, the boom in big data analytics has triggered widespread technological progress in the data center storage arena, generating demand for storage devices to hold data and for servers to run real-time applications. Vendors’ aggressive approaches to SMEs and SMBs, convincing them to move from on-premises to cloud-based storage, have also created huge demand for data center storage. Downtime is one of the critical challenges every data center faces, whether caused by power shortage, seismic activity, fluid leak, or security threats.

Data Center Storage Market – Geographic Analysis
The report includes market analysis of different regions: North America, Latin America, APAC, Europe and MEA, outlining the major market share holders and the market size of each region. The North American market has shown significant growth in shipments of SAN products compared with other geographies, because almost 30% of large data centers are upgrading their storage for the next 10 years, adding almost 3-4 times more capacity than they currently have available. APAC accounts for more than 28% of market share, and is expected to be the market leader by 2021 with a share of almost 34%. This growth is primarily due to the construction of new data centers in Southeast Asia and China, such as Google’s data centers in Singapore and Microsoft’s data center in India. SANs are the major revenue contributors in these regions.

Newly constructed data centers in the Nordic region are contributing most of the growth in IT equipment in Europe. The Nordics have emerged as the preferred location for new data center construction thanks to free cooling and low OPEX. The Latin America and MEA markets will take another 3-4 years to show significant growth in terms of both shipment and deployment.

Data Center Storage Market – Key Vendors and Market Share
This market research profiles the major companies in the global data center storage market and provides the competitive landscape and market share of the key players. The report covers the entire market outlook, including the value chain operating within the market.

The Major Vendors include:
Dell EMC, NetApp, HP, IBM, Dell, Hitachi Data Systems.

The Emerging Vendors are:
Huawei, Fujitsu, Data Direct Network, Nimble Storage, NEC.

Other Prominent Vendors included in the report are:
American Megatrends, Lenovo, Nfina, Nimbus Data, Overland Storage, Oracle, Pure Storage, Promise Technology, Quanta Computer, Netgear, Tegile, Tintri, Toshiba, Violin Memory, X-IO Technologies, Supermicro.

Read more


Permabit pulls on Red Hat, opens arms for a Linux cuddle

| The Register

Crimson headcover kernel gets dedupe and compression

The Mad Hatter of Linux is getting Alice in Wonderland-style physical space virtualisation with thin provisioning, compression and deduplication courtesy of Permabit.

Building on its June 2016 Linux availability announcement, Permabit and Red Hat have a go-to-market partnership based on the former’s Albireo deduplication and HIOPS compression technology being added as a kernel module to Red Hat Linux. Until now, dedupe and compression have largely been storage array features, only later appearing in software-only storage running on servers and direct-attached disks, SSDs or JBODs.

Against that background Permabit has had somewhat limited success as a supplier to OEMs of its Virtual Data Optimizer (VDO) dedupe and compression technology, with potential customers largely preferring to build their own dedupe tech. Its most prominent OEM is probably HDS for file storage, via its BlueArc acquisition. Now that RHEL, via Permabit’s VDO, has its own kernel-level dedupe and compression, any attached storage can get the benefit of it.

Permabit CEO Tom Cook is especially keen on the colocation angle here. Take a cloud service provider or general colocation operator fitting out their facility with racks of Linux-running servers and storage trays. If they can somehow reduce their storage capacity needs by, say, 25 per cent one year, then 25 per cent the next year and so on, that removes a significant slug of cost from their annual budgets; that’s the way Cook sees it, and he has spreadsheet models and charts to back up his case.

Here’s a chart for a Linux Ceph storage setup, assuming a 2.5:1 data reduction rate and suggesting savings of $370,000 over 5 years with Permabit data reduction installed:
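The arithmetic behind such a model is straightforward. A minimal sketch follows: only the 2.5:1 reduction ratio comes from the article; the starting capacity, growth rate and cost per TB are illustrative assumptions, so the dollar output will not match the chart's $370,000 figure:

```python
# Hedged sketch of the kind of data-reduction savings model described
# above. Assumptions (NOT from the article): 500 TB of logical data
# growing 25% per year, at a hypothetical $200 per usable TB.
reduction_ratio = 2.5   # typical for unstructured data, per Permabit
cost_per_tb = 200.0     # assumed fully loaded cost per usable TB, USD
annual_growth = 1.25    # assumed 25% data growth per year
logical_tb = 500.0      # assumed starting logical capacity, TB

bought_raw = 0.0        # TB purchased with no data reduction
bought_reduced = 0.0    # TB purchased with 2.5:1 reduction applied
owned_logical = 0.0
for year in range(5):
    delta = logical_tb - owned_logical   # new logical capacity needed
    bought_raw += delta
    bought_reduced += delta / reduction_ratio
    owned_logical = logical_tb
    logical_tb *= annual_growth

savings = (bought_raw - bought_reduced) * cost_per_tb
print(f"Capacity cost avoided over 5 years: ${savings:,.0f}")
```

With a 2.5:1 ratio, 60% of raw capacity purchases are avoided each year, which is where the compounding savings come from; plugging in an operator's real capacity and cost numbers is what the spreadsheet models would do.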


Permabit’s VDO runs anywhere RHEL runs – in physical servers, in virtual ones and in the public cloud – and enables Red Hat to compete against suppliers of deduping server operating systems, virtual server/storage systems, OpenStack and deduping storage arrays, according to Permabit. It typically provides 2.5:1 data reduction for unstructured data and up to 10:1 reduction for VM images.

VDO works with Ceph and Gluster and it’s payable via a subscription license starting at $199/year for 16TB. It’s available through Permabit resellers and system integrators. ®

Read more