Data Efficiency in the News


Debunking the multi-cloud myths

| Information Age

As cloud adoption continues to grow, so does organisations’ use of multi-cloud. But as the technology has emerged, so have myths surrounding it.

A recent report from RightScale has revealed that businesses are now using an average of six separate clouds.

Salesforce offered cloud as a service back in 1999. The technology has been around for a while, despite many believing it is still a fairly recent innovation.

The majority of businesses are using the cloud to improve agility and flexibility across operations.

Such are the benefits the cloud can bring that businesses are increasingly turning to more than one: a multi-cloud.

Multi-cloud is an environment where applications are deployed across two or more cloud platforms.

It is evident that businesses do benefit from its use (and the cloud in general) with higher performance and cost efficiency capabilities by choosing a configuration of cloud platforms and technologies tailored to suit their needs.
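The idea of matching each workload to the best-suited platform can be sketched as a simple placement rule. The platform names, workload attributes and routing criteria below are illustrative assumptions for the sake of the sketch, not taken from any vendor or from the article.

```python
# Illustrative sketch of multi-cloud workload placement. All platform
# names, rules and workload attributes here are hypothetical examples.

WORKLOADS = [
    {"name": "customer-db", "must_stay_on_premises": True},
    {"name": "web-frontend", "must_stay_on_premises": False, "needs_gpu": False},
    {"name": "ml-training", "must_stay_on_premises": False, "needs_gpu": True},
]

def place(workload):
    """Pick a platform for one workload under simple illustrative rules."""
    if workload.get("must_stay_on_premises"):
        return "private-cloud"      # compliance keeps this data on site
    if workload.get("needs_gpu"):
        return "public-cloud-a"     # assumed to offer GPU instances
    return "public-cloud-b"         # assumed cheapest for general compute

placement = {w["name"]: place(w) for w in WORKLOADS}
print(placement)
```

Real placement decisions weigh many more factors (latency, egress cost, existing contracts), but the shape is the same: per-workload requirements drive the choice of platform, which is how an organisation ends up with several clouds at once.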

As this ‘newish’ trend of multi-cloud gains popularity, some organisations still choose not to use it because of the challenges they believe adoption will bring.

Myths have started to emerge around what businesses will find challenging about multi-cloud services, which prevents CIOs from understanding what multi-cloud is or how it can benefit the wider business.

The myths

“Multi-cloud has for too long been the sleeping giant of the cloud computing world, with many IT leaders misinterpreting its meaning and therefore believing it doesn’t exist – even within their own organisations.”

“With businesses often ending up employing multi-cloud by accident due to other departments employing cloud services without their knowledge, it is crucial that multi-cloud is understood and managed. If it is left for too long, it can cause headaches further down the line in terms of security and compliance.”

You need to be a big business to really benefit from multi-cloud

The myth that multi-cloud is only for big businesses is perpetuated, understandably, by the assumption that bigger organisations contain a multitude of differing opinions on which tools employees want to use.

It is therefore a natural evolution for them to have more than one cloud service. With this comes the benefits of a multi-vendor strategy, such as cost savings, more innovation and risk management.

Such benefits are, however, not just for the big players.

Using multi-cloud is less secure

More clouds, more problems? This myth centres on the fact that, with the increasing complexity of multiple clouds, comes a greater risk of security issues.

But this is not necessarily true if well managed. Instead, it’s worth looking at it from the opposing view: how can a multi-cloud strategy help you be more secure and compliant?

Isn’t it the same as hybrid cloud?

Not at all. Multi-cloud describes an increasingly common architecture and typically implies several key distinctions from the other commonly used term, ‘hybrid cloud’.

Although some commentators and analysts still use the terms interchangeably, hybrid cloud is actually a specific type of multi-cloud architecture.

The technical expertise required for multi-cloud is the biggest barrier to having a comprehensive strategy

It’s true that learning the ins and outs of the infrastructure and lingo of more than one cloud can be challenging, especially for a smaller company.

Meanwhile, bigger companies are faced with tremendous competition to retain the specialised engineers and architects who are versed in multi-cloud, meaning that even they often struggle to keep the required skills.

However, this isn’t a burden that IT departments need to shoulder alone and it doesn’t need to get in the way of benefiting from multiple clouds.

A first step should be auditing the specific cloud services employees are already using.

Based on the outcome, you can assess the level of expertise that already exists in the business and find out where the gaps are.

From there you can also look externally and determine whether you need cloud brokers or managed cloud providers to execute and manage your multi-cloud strategy successfully.

This can remove the burden from teams, freeing up time to focus on activities that help drive the business forward.

Read more


Docker and Canonical Partner on CS Docker Engine for Millions of Ubuntu Users

| Yahoo UK & Ireland Finance

Docker and Canonical today announced an integrated Commercially Supported (CS) Docker Engine offering on Ubuntu, providing Canonical customers with a single path for support of the Ubuntu operating system and CS Docker Engine in enterprise Docker operations.

This commercial agreement provides for a streamlined operations and support experience for joint customers. Stable, maintained releases of Docker will be published and updated by Docker, Inc., as snap packages on Ubuntu, enabling direct access to the Docker, Inc. build of Docker for all Ubuntu users. Canonical will provide Level 1 and Level 2 technical support for CS Docker Engine, backed by Docker, Inc. providing Level 3 support. Canonical will ensure global availability of secure Ubuntu images on Docker Hub.

Ubuntu is widely used as a devops platform in container-centric environments. “The combination of Ubuntu and Docker is popular for scale-out container operations, and this agreement ensures that our joint user base has the fastest and easiest path to production for CS Docker Engine devops,” said John Zannos, Vice President of Cloud Alliances and Business Development, Canonical.

CS Docker Engine is a software subscription to Docker’s flagship product backed by business day and business critical support. CS Docker Engine includes orchestration capabilities that enable an operator to define a declarative state for the distributed applications running across a cluster of nodes, based on a decentralized model that allows each Engine to be a uniform building block in a self-organizing, self-healing distributed system.


Read the source article at Yahoo UK & Ireland Finance

Read more


SUSE acquires OpenStack IaaS and Cloud Foundry PaaS assets from HPE

| Geekzone

SUSE has entered into an agreement with Hewlett Packard Enterprise (HPE) to acquire technology and talent that will expand SUSE’s OpenStack Infrastructure-as-a-Service (IaaS) solution and accelerate SUSE’s entry into the growing Cloud Foundry Platform-as-a-Service (PaaS) market.

The acquired OpenStack assets will be integrated into SUSE OpenStack Cloud, and the acquired Cloud Foundry and PaaS assets will enable SUSE to bring to market a certified, enterprise-ready SUSE Cloud Foundry PaaS solution for all customers and partners in the SUSE ecosystem. The agreement includes HPE naming SUSE as its preferred open source partner for Linux, OpenStack and Cloud Foundry solutions. In addition, SUSE has increased engagement with the Cloud Foundry Foundation, becoming a platinum member and taking a seat on the Cloud Foundry Foundation board. 

“The driving force behind this acquisition is SUSE’s commitment to providing open source software-defined infrastructure technologies that deliver enterprise value for our customers and partners,” said Nils Brauckmann, CEO of SUSE. “This also demonstrates how we’re building our business through a combination of organic growth and technology acquisition. Once again, this strategy sends a strong message to the market and the worldwide open source community that SUSE is a company on the move.”

Ashish Nadkarni, program director, Computing Platforms, for IDC, said, “This expanded partnership and transfer of technology assets has the potential to be a real win-win for SUSE and HPE, as well as customers of both companies. SUSE has proven time and again it can successfully work with its technology partners to help organizations glean maximum benefit from their investments in open source. SUSE is positioning itself very well as a high-growth company with the resources it needs to compete in key market segments.”

As part of the transaction, HPE has named SUSE as its preferred open source partner for Linux, OpenStack IaaS and Cloud Foundry PaaS. HPE’s choice of SUSE as its preferred open source partner further cements SUSE’s reputation for delivering high-quality, enterprise-grade open source solutions and services.

Abby Kearns, executive director of the Cloud Foundry Foundation, said, “SUSE has been a powerful player in the enterprise open source world for more than two decades, and I’m excited to see the impact that a SUSE Cloud Foundry distribution will have for enterprises and developers around the world. SUSE’s strategic vision for the convergence of Platform-as-a-Service and Container-as-a-Service technologies will also be a welcome addition to the strategic dialogue we have within the Cloud Foundry Foundation community.”

Read more


HPE core servers and storage under pressure

| The Register

HPE’s latest results show a company emerging slimmer and fitter through diet (cost-cutting) and exercise (spin-merger deals) but facing tougher markets in servers and storage – the new normal, as CEO Meg Whitman says.

A look at the numbers and the earnings call from the servers and storage points of view shows a company with work to do.

The server business saw revenue of $3.5bn in the quarter, down 7 per cent year-on-year and up 5 per cent quarter-on-quarter. High-performance compute (Apollo) and SGI servers did well. Hyper-converged is growing and has more margin than the core ISS (Industry Standard Servers). Synergy and mission critical systems also did well.

But the servers business was affected by strong pressure on the core ISS ProLiant racks, a little in the blade server business, and also low or no profitability selling Cloudline servers, the ones for cloud service providers and hyperscale customers.

In the earnings call, Meg Whitman discussed the ISS business, saying: “Other parts of the server business are doing really well. And I think that core ISS rack deterioration has a number of different things. One is in part our execution in the channel and pricing and things like that. And the second is the move to the public cloud.”

She also mentioned that there was increased competition from Huawei in servers.

Her answer is: “We need to shore up core ISS racks with improvements in the channel, improvements in quote to cash, and focus – more focus on the distributors and VARs for the volume-related ISS rack business.”

She thinks the ISS business can grow 1-2 per cent if this is done and, because profitable gear like storage gets attached to these servers, HPE is “gaining share profits in this business”.


Although HPE’s CEO said hyper-converged was doing well, there is some way to go. Gartner ranks HPE as the leader in the hyper-converged and integrated systems magic quadrant, with EMC second and Nutanix third.

The analysis company’s researchers said: “Hewlett Packard Enterprise offers multiple converged, hyper-converged, reference architectures and point systems of various design points. But as the volume market leader in many segments (including blade and rack servers), it is only logical that HPE should be a leading vendor in this market.”

But since Nutanix is purely a hyper-converged player, by that measure HPE is not a leader in hyper-converged systems with its HC 380. The Gartnerites point out that “HPE is a relative late starter in HCIS and is frequently absent from competitive hyper-convergence evaluations versus more established vendors.”

An August Forrester Wave report on hyper-converged systems put HPE in eighth position. Forrester’s researchers said: “HPE’s product is in its early stages, and… its position in the HCI segment should improve quickly over time.”

Nothing was said in the call about any merger or acquisition in this area. There have been rumours about HPE and SimpliVity getting together.


In the all-flash array (AFA) business, HPE grew 3PAR AFA revenues 100 per cent year-on-year to a $750m annual run rate, which compares with NetApp at $1bn and Pure at $631m. Our sense is that Dell-EMC leads this market, followed by NetApp, then HPE, with Pure in fourth place.

Whitman said: “All-flash now makes up 50 per cent of our 3PAR portfolio and interestingly still only comprises 10 per cent of the data centre. So we see more running room in our all-flash business. And… we’re introducing new deduplication technology that should provide some further uplift in all-flash array, because there has been a gap in our portfolio.”

Comparing HPE to other AFA suppliers we see Dell EMC with five AFA products: XtremIO, DSSD, all-flash VMAX and Unity, and an all-flash Isilon product. NetApp has three: EF series, SolidFire, and all-flash FAS. Pure has its FlashArray and is developing FlashBlade. HPE has the single all-flash 3PAR product. This looks to be insufficient to cover the developing AFA use cases such as high-speed analytics, scale-out cloud service provision and file access.

There is no sense from HPE that it recognises this as a problem area. We see here a reflection of a view that HPE has a proliferation of server products and a relative scarcity of successful storage products. Historically in HPE, server and storage business units have followed separate paths. With them both inside Antonio Neri’s Enterprise Systems organisation, any such separateness should diminish.


Read more


Hewlett-Packard revenues shrink, just like the company

| MarketWatch

The separation that split Hewlett-Packard Co. into two smaller companies a year ago has done nothing to turn either company into the nimbler, faster growing entities Meg Whitman hoped for.

That was clear on Tuesday, when both Hewlett Packard Enterprise Co. and HP Inc. reported their fiscal fourth-quarter results. HPE, focused on the corporate computing market selling servers, networking equipment and services, reported a 7% drop in revenue in the fourth quarter, and annual revenue of $50.1 billion was down 4% from fiscal 2015, adjusted for the split. HPE claimed that revenue was up 2% year-over-year, when adjusted for divestitures and currency.

The PC and printing focused company fared slightly better in the fourth quarter, with 2% growth in revenue, based on strong sales of some of the company’s new PC products, but overall for fiscal 2016, revenue fell 6% to $48.2 billion.

So both companies are suffering from shrinking revenue while undergoing massive layoffs and stock buybacks that have placated investors but done little to actually strengthen the companies, while racking up restructuring charges. Still, with the strong stock performance of the past year on their side, HP leaders spoke optimistically of their companies.

The course ahead is to get even smaller, in the case of HP Enterprise, with plans announced earlier this year to spin-merge its services and software businesses with CSC and Micro Focus respectively. Again, Whitman is promising that these deals will enable HP Enterprise to be “more nimble, provide cutting-edge solutions, play in higher-growth markets, and have an enhanced financial profile” sometime in 2017, when the deals are both complete.

But both HPs are still in too many legacy, slower-growing businesses to believe growth is around the corner. HP Inc.’s year-over-year 4% gain in overall PC sales, due to more competitive PCs, was overshadowed by the printing business, where sales of supplies (including printer ink) dropped 12%. HP executives pointed out that this decline had abated from a steeper drop of 18% in supplies in the previous third quarter, but slower shrinkage didn’t satisfy investors, and shares were down 2.3% in after-hours trading.

HP Enterprise is still burdened with a legacy server business that declined 6%, even as its high-performance computing business is growing with help on the way from the acquisition of SGI Corp. Whitman has bet on companies eventually moving to hybrid cloud structures, a mixture of on-premises and remote servers, but she admitted Tuesday that HPE is “definitely seeing impact” from potential clients choosing public cloud-computing environments.

If HP Inc. can’t get both its legacy consumer businesses to grow at the same time, and HP Enterprise can’t convince companies to at least partly eschew public cloud services like Amazon Web Services, growth will be nearly impossible for either company. That only leaves Whitman and Weisler with more spinouts, divestitures, layoffs and stock buybacks to distract from their shrinking revenue.

Read more


Microsoft embraces open source in the cloud and on-premises


Microsoft has offered multiple flavors of Linux on its Azure public cloud platform and infrastructure for several years now.

“Microsoft loves Linux,” Microsoft CEO Satya Nadella said during the 2014 announcement of new Azure services. “Twenty percent of Azure is already Linux. We will always have first-class support for Linux [distributions].”

Microsoft took that love another step last week. In a move that would have been stunning more than a decade ago, it joined The Linux Foundation — which sponsors the work of Linux creator Linus Torvalds and plays a central role in the promotion of open source software — as a platinum sponsor.

“As a cloud platform company, we aim to help developers achieve more using the platforms and languages they know,” Scott Guthrie, executive vice president, Microsoft Cloud and Enterprise Group, said in a statement last week. “The Linux Foundation is home not only to Linux, but many of the community’s most innovative open source projects. We are excited to join The Linux Foundation and partner with the community to help developers capitalize on the shift to intelligent cloud and mobile experiences.”

The reason, Microsoft’s Kumar says, is simple: In the messy, real world of enterprise IT, hybrid shops are the norm and customers don’t need or want vendors to force their hands when it comes to operating systems. Serving these customers means giving them flexibility.

That philosophy has spread from Microsoft’s cloud business to its on-premises infrastructure business as the company seeks to make support for hybrid environments a key differentiator of its cloud and on-premise offerings (an idea Nadella pushed as Microsoft’s executive vice president of Cloud and Enterprise before his ascension to CEO). Last week, Joseph Sirosh, corporate vice president of the Data Group at Microsoft, announced that the next release of SQL Server would, for the first time, support Linux.

“Now you can also develop applications with SQL Server on Linux, Docker or macOS (via Docker) and then deploy to Linux, Windows, Docker, on-premises or in the cloud,” Sirosh wrote in a blog post. “This represents a major step in our journey to making SQL Server the platform of choice across operating systems, development languages, data types, on-premises and in the cloud.”

Kumar adds that customers tell Microsoft, “I want to use SQL and don’t care about what’s underneath it. I don’t want to worry about it, I just want to know that whenever I want to install SQL, I have the choice to do that.”

All major features of the SQL Server relational database engine are coming to Linux, Sirosh said, including advanced features such as in-memory online transactional processing (OLTP), in-memory columnstores, Transparent Data Encryption, Always Encrypted and Row-Level Security. There will be native Linux installations with familiar RPM and APT packages for Red Hat Enterprise Linux, Ubuntu Linux and SUSE Linux Enterprise Server. He noted that the public preview of the next release of SQL Server, in both Windows and Linux flavors, will be available on Azure Virtual Machines and as images on Docker Hub.

“I’m excited about Microsoft as a company truly embracing choice,” Kumar says. “We’re clearly seeing the base getting energized in a big way. People are giving us a chance again.”

Read more


Hybrid Cloud Storage Use to Double in Next 12 Months


The use of hybrid cloud storage will accelerate rapidly over the next 12 months, according to research published today by Cloudian, Inc., a provider of cloud-compatible object storage systems.

Across 400 organisations surveyed in the UK and USA, 28% already use hybrid cloud storage, with a further 40% planning to implement within the next year. Only 19% have no plans to adopt.

Organisations are looking to hybrid cloud storage to support a variety of workloads. Backup is the most popular use case, with 64% of respondents reporting deployment or plans to deploy. Web infrastructure (52%), application dev/test (48%) and technical applications (43%) are also driving the adoption of hybrid cloud storage products and services.

The research reveals that larger organisations (2,500 employees or more) are adopting the approach most rapidly, with 82% planning to deploy in the next 12 months.

Decisions about whether to adopt hybrid cloud storage are being driven by multiple factors such as external and internal data governance rules. 59% of respondents report that not all of their data can go to the public cloud, and that more than half of their data must remain on site. Most commonly cited among the data types that must remain on premises are financial data and customer records. Reasons named most commonly are security, governance and compliance rules, driven by both internal policy and external regulation.

When considering a hybrid cloud storage strategy, concerns about interoperability between on-premises and public cloud storage (40%) are only exceeded by those around security (62%) and cost (55%). 76% of respondents moving to hybrid cloud storage have yet to decide which interface to adopt.


Read more


Storage Survey: Cloud Storage Use Up, But Integration Still Sought


Storage is one of the primary drivers of hybrid cloud adoption. According to a survey by object storage system provider Cloudian, 68% of IT decision makers surveyed in early November have already adopted, or plan to implement within a year, a form of public cloud storage to complement their existing on-premises storage.

The share that have already done so was 28%, with another 40% planning to do so within the next 12 months. The most frequent driver is use of cloud storage for backup purposes — in use by 64% or planned to be pressed into use in the next 12 months. By moving backup data into the cloud, the IT manager gets not just another copy of the data, but one held in a data center different from their own, and thus insulated from a disaster or failure that might strike their own.

On the other hand, 13% said they planned to adopt cloud storage but in a timeframe more than 12 months out, while 19% said they had no plans to adopt a cloud storage system to function in a hybrid fashion with an on-premises system.
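As a quick arithmetic check, the four adoption categories quoted above should partition the whole sample:

```python
# Sanity check on the survey figures quoted above: the four adoption
# categories should together cover 100% of respondents.
already_adopted = 28     # % already using public cloud storage
within_12_months = 40    # % planning to adopt within a year
beyond_12_months = 13    # % planning to adopt later than 12 months
no_plans = 19            # % with no plans to adopt

total = already_adopted + within_12_months + beyond_12_months + no_plans
print(total)  # → 100

# The 68% "adopted or planning within a year" figure is the first two
# categories combined.
print(already_adopted + within_12_months)  # → 68
```

The figures are internally consistent: the categories sum to 100%, and the headline 68% is simply the adopters plus the one-year planners.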

The survey concluded that organizations are using cloud storage for specific applications and purposes, such as file sharing, email, and data backup. “But they continue to struggle when it comes to more general purpose storage needs as it relates to cloud,” and only 18% are using an on-premises application that leverages cloud storage in some way, the report said.

In addition to data backup, respondents named other uses for a hybrid storage system: Web infrastructure was cited by 52%; application development and test, 48%; technical applications, 43%; media and entertainment, 21%; and medical data, 21%.

Security and cost concerns remain the primary brakes on adopting hybrid cloud storage systems at a faster pace. Sixty-two percent of respondents said security in the cloud was a top concern, with cost coming in second at 55%.

Fifty-nine percent said they had data that could not be migrated to the cloud now or in the future, and that data represented 47% of the total. It included corporate financial data, customer records, research data, email, file shares and other collaboration data.

Other concerns listed included the management complexity of a hybrid storage system, interoperability with other systems, and the skill sets needed to use cloud storage.

When it came to global thinking about on-premises plus cloud storage, Cloudian found most respondents hadn’t moved off the application-by-application approach to the cloud. There was no emerging consensus on what a hybrid system designed to work together for multiple purposes looks like, it acknowledged.

When it asked the respondents who used no cloud storage what interface they would want for a hybrid system, “the respondents simply did not know what they would choose.” That sample, at 25, was the smallest of the respondent groupings.

Read more


Rackspace Opens First Data Centre In Continental Europe

| Stock Market

Rackspace® has expanded its investment in continental Europe by announcing that it will open a new data centre in Germany. Rackspace’s decision to expand its operations in the region provides customers with a new option for managed IT infrastructure in the face of strict personal data protection laws across the Germany, Austria and Switzerland (DACH) territory.

The facility is designed to serve customers who seek managed private clouds and hosting environments, with a focus on fully managed VMware environments.

Rackspace will be working with one of its long-standing partners to build out the infrastructure of the new operation, which is expected to be fully operational in mid-2017.

To address increasing customer demand for managed cloud services, the plan to open the new facility in Frankfurt follows the recent appointment of Alex Fuerst as the leader of Rackspace operations in the DACH region. Alex has hired an in-region team focused on delivering managed private clouds and hosting environments, as well as managed cloud services for customers seeking help with the complexity and cost of managing AWS and Azure. Globally, Rackspace engineers have more than 500 AWS certifications and proven experience with architecting, deploying, operating and optimising both AWS and Azure customer environments. With this team of managed cloud specialists, Rackspace is poised to serve customers in the region with its expertise and broad portfolio of services for multi-cloud environments.

“I am excited to be able to bring this new Rackspace data centre online to serve our fast-expanding German customer base,” said Fuerst, who joined Rackspace in September 2016 after working in IT leadership roles in the DACH region. “We’re experiencing strong demand from DACH-centric customers, as well as U.S. and EMEA-based multinationals who are looking for managed private clouds and hosting environments, along with managed cloud services and expertise for AWS and Azure in continental Europe. This data centre will strengthen our multi-cloud capabilities on the European continent and pave the way for us to achieve our goal of becoming the leading managed cloud provider in Germany, Switzerland and Austria, which is already our third largest international market.”

“With the opening of our data centre in Germany, we can provide the highest level of availability, security, performance and management, and also help our customers address data protection requirements by providing them with multi-cloud deployment options. As the demand for managed services increases in the German-speaking region, companies of all sizes in all verticals are embracing multi-cloud approaches to IT, so that each of their workloads runs on the platform where it can achieve the best performance and cost efficiency,” Fuerst continued. “More and more of those companies are turning to Rackspace expertise and support for their critical IT services and data.”

With the addition of the data centre in Frankfurt, Rackspace will operate 12 data centres worldwide, including in London, Hong Kong, Sydney, Dallas, Chicago and Ashburn (near Washington, D.C.).

Read more


Software-Defined Storage Market Projected to Reach 22.56 Billion USD by 2021

| Stock Market

North America is expected to lead the Software-Defined Storage (SDS) market, as governments in the region have initiated many digitalization projects, making it the largest adopter of SDS solutions. The 167-page report categorizes the global SDS market by solutions (software-defined server, data security & compliance, controller, data management, and hypervisor), by services, by usage, by organization size, by application area and by geography.

According to the report “Software-Defined Storage Market by Component [Platforms/Solutions (Software-Defined Server, Data Security & Compliance, Controller, Data Management, and Hypervisor), Services], Usage, Organization Size, Application Area – Global Forecast to 2021”, the global market is expected to grow from USD 4.72 Billion in 2016 to USD 22.56 Billion by 2021, at a Compound Annual Growth Rate (CAGR) of 36.7%.
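The quoted growth rate is easy to sanity-check: compounding the 2016 figure at 36.7% for the five years to 2021 should land near the projected 2021 figure.

```python
# Verify the report's arithmetic: USD 4.72bn compounded at a 36.7% CAGR
# over the five years from 2016 to 2021.
start_usd_bn = 4.72
cagr = 0.367
years = 5

projected = start_usd_bn * (1 + cagr) ** years
print(round(projected, 2))  # → 22.53, close to the reported 22.56
```

The small gap from 22.56 is consistent with the report's CAGR being rounded to one decimal place.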

Exponentially growing data volumes across enterprises, the rise of the “software defined” concept, and the need for cost optimization in data management are some of the major factors driving the SDS market. Furthermore, the need to avoid downtime of storage infrastructure, and the competitive market environment around this innovative technology, are expected to provide opportunities for growth of the SDS market.

Data security and compliance software is expected to be the largest contributor in the global SDS market during the forecast period

Organizations are mandated to follow compliance policies and guidelines for storing and sharing data while securing business-critical information. The requirement for security and compliance functions in SDS solutions has increased demand for this software, which is expected to contribute the most to overall revenue generation for the SDS market during the forecast period.

The support and maintenance segment is expected to show significant growth rate during the forecast period

The demand for services is increasing significantly along with the growth of the SDS market. Support and maintenance services help organizations get the maximum benefit from their SDS software investment. Customers can get better assistance and maintenance for their SDS solution through various levels of support programs. The market for support and maintenance will keep growing owing to the need for consistent support in deploying and utilizing SDS solutions.

Additionally, we are seeing data reduction capabilities added to SDS solutions, making them extremely efficient in their use of storage while improving data density and optimizing data center footprint.

Read more