Data Efficiency in the News




Hybrid cloud technology lets users place each workload in the environment best suited to it. By combining public and private cloud services, organizations can maximize the benefits of cloud migration.

Why IBM’s Focus on Hybrid Cloud Is Opportune

Though the private cloud lags behind the public cloud in terms of adoption, it’s not far behind. In 2015, the hybrid and private cloud spaces registered 45% growth, compared to the 51% growth in the public cloud space.

This growth was evident in RightScale’s 2016 report, which stated that private cloud adoption had risen from 63% in 2015 to 77% in 2016. As more companies add private cloud technology, the likelihood of hybrid cloud adoption should continue to improve. The rise in private cloud adoption helped push hybrid cloud adoption from 58% in 2015 to 71% in 2016 on a year-over-year basis.

Scalability and flexibility are driving hybrid cloud adoption

According to Lifehacker, a recent Microsoft (MSFT) survey of 1,200 information technology leaders across the Asia-Pacific region, including Australia, showed that 40% of Australian organizations were either already using or preparing to use hybrid cloud solutions. This number is estimated to rise to 49% within the next 12 to 18 months.

It’s the scalability and flexibility offered by the hybrid cloud that enables an enterprise to develop an automated, secure, and integrated computing environment in a cost-effective manner.

Hybrid cloud combines features of two or more cloud models, whether private, community, or public. It provides the control and security of a private cloud along with the versatility, user friendliness, and cost-effectiveness of a public cloud. Organizations prefer it because, even when using cloud services, they retain control over their data. Thus, hybrid cloud technology offers more control, a more efficient cost structure, reduced risk, and better performance. These advantages likely account for much of the hybrid cloud’s increased adoption.

Read more


HPE sees Synergy in hybrid cloud infrastructure

| PC World Australia

HPE originally pitched its Synergy line of “composable” IT infrastructure as a way to bring the flexibility of cloud services to on-premises systems. Now it’s turning that story around, putting those same Synergy components — and some new ones — into the public cloud with the goal of simplifying hybrid IT management.

The new components of Synergy made their debut in London on Tuesday, at HPE Discover, an event for the company’s customers and partners.

Among the new offerings are a software update for the HPE Hyper Converged 380 server and a new version of HPE Helion CloudSystem. Both incorporate new cloud management functions intended to simplify the automation of repetitive tasks. There are also two new ways to pay for it all, HPE Dynamic Usage for Hyper Converged Systems and HPE Flexible Capacity Service, plus some deft financial engineering to move some of the business risk onto partners.

HPE’s goal is to allow IT departments to act as service providers for their organization, rather than maintaining infrastructure.

“The most efficient way to deal with the cloud point of view is to take it from an application perspective, start with the workload and derive the infrastructure to support it,” said Matt Foley, HPE’s director of cloud presales in Europe, the Middle East and Africa.

Read more


Rackspace Achieves AWS Premier Consulting Partner Status in the AWS Partner Network

| Marketwired

Rackspace® today announced that it has achieved Premier Partner status as a Consulting Partner within the Amazon Web Services® (AWS) Partner Network (APN). This designation is the highest level in the APN, recognizing APN Partners that have made significant investments to develop the technical resources and AWS expertise necessary to deploy and manage customer solutions on the AWS Cloud. Customers can tap into this valuable expertise through Fanatical Support® for AWS architects and engineers, who collectively hold more than 600 AWS professional and associate certifications across the globe.

To qualify for the AWS Premier Consulting Partner tier, Partners must meet requirements that demonstrate the scale of their AWS expertise, capabilities and engagement in the AWS Ecosystem. Rackspace has been an APN Advanced Consulting Partner since the launch of Fanatical Support for AWS in October 2015. Its global base of knowledge spans all five AWS technical certifications, including Solutions Architect Associate, Developer Associate, SysOps Administrator Associate, DevOps Engineer Professional, and Solutions Architect Professional. Rackspace has also demonstrated expertise across different types of AWS workloads and has achieved AWS Competencies for DevOps and Marketing & Commerce.

“We are proud to be recognized as a Premier Consulting Partner in the APN,” said Jeff Cotten, senior vice president of AWS at Rackspace. “Since the launch of Fanatical Support for AWS in late 2015, we have been focused on helping our customers maximize the value of their AWS investments by developing our expertise and capabilities on AWS. Our team has worked so hard to achieve Premier Partner status in such a short time, and we look forward to continuing to build on our ability to provide Fanatical Support for AWS to our customers.”

Read more


Debunking the multi-cloud myths

| Information Age

As cloud adoption continues to grow, so has organisations’ use of multi-cloud. But as this technology has emerged so have some myths surrounding it.

A recent report from RightScale has revealed that businesses are now using an average of six separate clouds.

Salesforce offered cloud as a service back in 1999, so the technology has been around for a while, despite many believing it is still a fairly recent innovation.

The majority of businesses are using the cloud to improve agility and flexibility across operations.

Given the benefits the cloud can bring, businesses are increasingly turning to more than one cloud: a multi-cloud.

Multi-cloud is an environment where applications are deployed across two or more cloud platforms.

It is evident that businesses benefit from multi-cloud (and the cloud in general): by choosing a configuration of cloud platforms and technologies tailored to their needs, they gain higher performance and better cost efficiency.

As this ‘newish’ trend of multi-cloud gains popularity, some organisations still choose not to use it because of the challenges they think adoption will bring.

Myths have started to emerge around what businesses will find challenging about multi-cloud services, which is preventing CIOs from understanding what the multi-cloud is, or how it can benefit the wider business.

The myths

“Multi-cloud has for too long been the sleeping giant of the cloud computing world, with many IT leaders misinterpreting its meaning and therefore believing it doesn’t exist – even within their own organisations.”

“With businesses often ending up employing multi-cloud by accident, due to other departments employing cloud services without their knowledge, it is crucial that multi-cloud is understood and managed. If it is left for too long, it can cause headaches further down the line in terms of security and compliance.”

You need to be a big business to really benefit from multi-cloud

The myth that multi-cloud is only for big businesses is perpetuated, understandably, by the assumption that bigger organisations contain a multitude of differing opinions on which tools employees want to use.

It is therefore a natural evolution for them to have more than one cloud service. With this comes the benefits of a multi-vendor strategy, such as cost savings, more innovation and risk management.

Such benefits are however, not just for the big players.

Using multi-cloud is less secure

More clouds, more problems? This myth centres on the fact that, with the increasing complexity of multiple clouds, comes a greater risk of security issues.

But this is not necessarily true if well managed. Instead, it’s worth looking at it from the opposing view: how can a multi-cloud strategy help you be more secure and compliant?

Isn’t it the same as hybrid cloud?

Not at all. Multi-cloud describes an increasingly common architecture and typically implies several key distinctions from the other commonly used term, ‘hybrid cloud’.

Although some commentators and analysts still use the terms interchangeably, hybrid cloud is actually a specific type of multi-cloud architecture.

The technical expertise required for multi-cloud is the biggest barrier to having a comprehensive strategy

It’s true that learning the ins and outs of the infrastructure and lingo of more than one cloud can be challenging, especially for a smaller company.

Meanwhile, bigger companies are faced with tremendous competition to retain the specialised engineers and architects who are versed in multi-cloud, meaning that even they often struggle to keep the required skills.

However, this isn’t a burden that IT departments need to shoulder alone and it doesn’t need to get in the way of benefiting from multiple clouds.

A first step should be auditing the specific cloud services employees are using.

Based on the outcome you can assess the level of expertise that already exists in the business and find out where the gaps are.

From there you can also look externally and determine whether or not you need cloud brokers or managed cloud providers to execute and manage it successfully.

This can remove the burden from teams, freeing up time to focus on activities that help drive the business forward.

Read more


Docker and Canonical Partner on CS Docker Engine for Millions of Ubuntu Users

| Yahoo UK & Ireland Finance

Docker and Canonical today announced an integrated Commercially Supported (CS) Docker Engine offering on Ubuntu, providing Canonical customers with a single path for support of the Ubuntu operating system and CS Docker Engine in enterprise Docker operations.

This commercial agreement provides for a streamlined operations and support experience for joint customers. Stable, maintained releases of Docker will be published and updated by Docker, Inc. as snap packages on Ubuntu, enabling direct access to the Docker, Inc. build of Docker for all Ubuntu users. Canonical will provide Level 1 and Level 2 technical support for CS Docker Engine, with Docker, Inc. providing Level 3 support. Canonical will also ensure global availability of secure, up-to-date Ubuntu images on Docker Hub.
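As a sketch of what the snap-based distribution path looks like for an Ubuntu user (the exact snap name and channel are assumptions here, since the announcement does not spell them out):

```shell
# Install a Docker engine snap on Ubuntu (package name assumed;
# snapd handles automatic refreshes of stable, maintained releases)
sudo snap install docker

# Confirm the engine is installed and reachable
docker version
```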

Ubuntu is widely used as a devops platform in container-centric environments. “The combination of Ubuntu and Docker is popular for scale-out container operations, and this agreement ensures that our joint user base has the fastest and easiest path to production for CS Docker Engine devops,” said John Zannos, Vice President of Cloud Alliances and Business Development, Canonical.

CS Docker Engine is a software subscription to Docker’s flagship product backed by business day and business critical support. CS Docker Engine includes orchestration capabilities that enable an operator to define a declarative state for the distributed applications running across a cluster of nodes, based on a decentralized model that allows each Engine to be a uniform building block in a self-organizing, self-healing distributed system.
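The “declarative state” orchestration described above corresponds to Docker’s swarm mode, introduced in Engine 1.12. A minimal sketch of how an operator declares a desired state and lets the engines converge on it (the service name and replica count are illustrative):

```shell
# Turn this engine into a swarm manager; other engines can join
# as uniform building blocks of the cluster
docker swarm init

# Declare a desired state: three replicas of a web service.
# The swarm schedules containers across nodes and, if one fails,
# starts a replacement to restore the declared state.
docker service create --name web --replicas 3 -p 80:80 nginx

# Watch the cluster converge toward the declared replica count
docker service ls
```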



Read more


SUSE acquires OpenStack IaaS and Cloud Foundry PaaS assets from HPE

| Geekzone

SUSE has entered into an agreement with Hewlett Packard Enterprise (HPE) to acquire technology and talent that will expand SUSE’s OpenStack Infrastructure-as-a-Service (IaaS) solution and accelerate SUSE’s entry into the growing Cloud Foundry Platform-as-a-Service (PaaS) market.

The acquired OpenStack assets will be integrated into SUSE OpenStack Cloud, and the acquired Cloud Foundry and PaaS assets will enable SUSE to bring to market a certified, enterprise-ready SUSE Cloud Foundry PaaS solution for all customers and partners in the SUSE ecosystem. The agreement includes HPE naming SUSE as its preferred open source partner for Linux, OpenStack and Cloud Foundry solutions. In addition, SUSE has increased engagement with the Cloud Foundry Foundation, becoming a platinum member and taking a seat on the Cloud Foundry Foundation board. 

“The driving force behind this acquisition is SUSE’s commitment to providing open source software-defined infrastructure technologies that deliver enterprise value for our customers and partners,” said Nils Brauckmann, CEO of SUSE. “This also demonstrates how we’re building our business through a combination of organic growth and technology acquisition. Once again, this strategy sends a strong message to the market and the worldwide open source community that SUSE is a company on the move.”

Ashish Nadkarni, program director, Computing Platforms, for IDC, said, “This expanded partnership and transfer of technology assets has the potential to be a real win-win for SUSE and HPE, as well as customers of both companies. SUSE has proven time and again it can successfully work with its technology partners to help organizations glean maximum benefit from their investments in open source. SUSE is positioning itself very well as a high-growth company with the resources it needs to compete in key market segments.”

As part of the transaction, HPE has named SUSE as its preferred open source partner for Linux, OpenStack IaaS and Cloud Foundry PaaS. HPE’s choice of SUSE as its preferred open source partner further cements SUSE’s reputation for delivering high-quality, enterprise-grade open source solutions and services.

Abby Kearns, executive director of the Cloud Foundry Foundation, said, “SUSE has been a powerful player in the enterprise open source world for more than two decades, and I’m excited to see the impact that a SUSE Cloud Foundry distribution will have for enterprises and developers around the world. SUSE’s strategic vision for the convergence of Platform-as-a-Service and Container-as-a-Service technologies will also be a welcome addition to the strategic dialogue we have within the Cloud Foundry Foundation community.”

Read more


HPE core servers and storage under pressure

| The Register

HPE’s latest results show a company emerging slimmer and fitter through diet (cost-cutting) and exercise (spin-merger deals) but facing tougher markets in servers and storage – the new normal, as CEO Meg Whitman says.

A look at the numbers and the earnings call from the servers and storage points of view shows a company with work to do.

The server business saw revenue of $3.5bn in the quarter, down 7 per cent year-on-year and up 5 per cent quarter-on-quarter. High-performance compute (Apollo) and SGI servers did well. Hyper-converged is growing and has more margin than the core ISS (Industry Standard Servers). Synergy and mission critical systems also did well.

But the servers business was affected by strong pressure on the core ISS ProLiant racks, a little in the blade server business, and also low or no profitability selling Cloudline servers, the ones for cloud service providers and hyperscale customers.

In the earnings call, Meg Whitman discussed the ISS business, saying: “Other parts of the server business are doing really well. And I think that core ISS rack deterioration has a number of different things. One is in part our execution in the channel and pricing and things like that. And the second is the move to the public cloud.”

She also mentioned that there was increased competition from Huawei in servers.

Her answer is: “We need to shore up core ISS racks with improvements in the channel, improvements in quote to cash, and focus – more focus on the distributors and VARs for the volume-related ISS rack business.”

She thinks the ISS business can grow 1-2 per cent if this is done, and because profitable gear like storage gets attached to these servers, HPE is gaining profitable share in this business.


Although HPE’s CEO said hyper-converged was doing well, there is some way to go. Gartner ranks HPE as the leader in its hyper-converged and integrated systems Magic Quadrant, with EMC second and Nutanix third.

The analysis company’s researchers said: “Hewlett Packard Enterprise offers multiple converged, hyper-converged, reference architectures and point systems of various design points. But as the volume market leader in many segments (including blade and rack servers), it is only logical that HPE should be a leading vendor in this market.”

Since Nutanix is purely a hyper-converged player, however, that ranking does not make HPE a leader in hyper-converged systems with its HC 380. The Gartnerites point out that “HPE is a relative late starter in HCIS and is frequently absent from competitive hyper-convergence evaluations versus more established vendors.”

An August Forrester Wave report on hyper-converged systems put HPE in eighth position. Forrester’s researchers said: “HPE’s product is in its early stages, and… its position in the HCI segment should improve quickly over time.”

Nothing was said in the call about any merger or acquisition in this area. There have been rumours about HPE and SimpliVity getting together.


In the all-flash array (AFA) business, HPE grew 3PAR AFA revenues 100 per cent year-on-year to a $750m annual run rate, which compares with NetApp at $1bn and Pure at $631m. Our sense is that Dell-EMC leads this market, followed by NetApp, then HPE, with Pure in fourth place.

Whitman said: “All-flash now makes up 50 per cent of our 3PAR portfolio and interestingly still only comprises 10 per cent of the data centre. So we see more running room in our all-flash business. And… we’re introducing new deduplication technology that should provide some further uplift in all-flash array, because there has been a gap in our portfolio.”

Comparing HPE to other AFA suppliers we see Dell EMC with five AFA products: XtremIO, DSSD, all-flash VMAX and Unity, and an all-flash Isilon product. NetApp has three: EF series, SolidFire, and all-flash FAS. Pure has its FlashArray and is developing FlashBlade. HPE has the single all-flash 3PAR product. This looks to be insufficient to cover the developing AFA use cases such as high-speed analytics, scale-out cloud service provision and file access.

There is no sense from HPE that it recognises this as a problem area. We see here a reflection of a view that HPE has a proliferation of server products and a relative scarcity of successful storage products. Historically in HPE, server and storage business units have followed separate paths. With them both inside Antonio Neri’s Enterprise Systems organisation, any such separateness should diminish.


Read more


Hewlett-Packard revenues shrink, just like the company

| MarketWatch

The separation that split Hewlett-Packard Co. into two smaller companies a year ago has done nothing to turn either company into the nimbler, faster growing entities Meg Whitman hoped for.

That was clear on Tuesday, when both Hewlett Packard Enterprise Co. (HPE) and HP Inc. (HPQ) reported their fiscal fourth-quarter results. HPE, focused on selling servers, networking equipment and services to the corporate computing market, reported a 7% drop in revenue in the fourth quarter, and annual revenue of $50.1 billion was down 4% from fiscal 2015, adjusted for the split. HPE claimed that revenue was up 2% year-over-year when adjusted for divestitures and currency.

The PC and printing focused company fared slightly better in the fourth quarter, with 2% growth in revenue, based on strong sales of some of the company’s new PC products, but overall for fiscal 2016, revenue fell 6% to $48.2 billion.

So both companies are suffering from shrinking revenue while undergoing massive layoffs and stock buybacks that have placated investors, incurred restructuring charges, and done little to actually strengthen the businesses. Still, with the strong stock performance of the past year on their side, HP leaders spoke optimistically of their companies.

For HP Enterprise, that course is to get even smaller, with plans announced earlier this year to spin off and merge its services and software businesses with CSC and Micro Focus, respectively. Again, Whitman is promising that these deals will enable HP Enterprise to be “more nimble, provide cutting-edge solutions, play in higher-growth markets, and have an enhanced financial profile” sometime in 2017, when both deals are complete.

But both HPs are still in too many legacy, slower-growing businesses to believe growth is around the corner. HP Inc.’s year-over-year 4% gain in overall PC sales, due to more competitive PCs, was overshadowed by the printing business, where sales of supplies (including printer ink) dropped 12%. HP executives pointed out that this decline had abated from a steeper drop of 18% in supplies in the previous third quarter, but slower shrinkage didn’t satisfy investors, and shares were down 2.3% in after-hours trading.

HP Enterprise is still burdened with a legacy server business that declined 6%, even as its high-performance computing business is growing with help on the way from the acquisition of SGI Corp. Whitman has bet on companies eventually moving to hybrid cloud structures, a mixture of on-premises and remote servers, but she admitted Tuesday that HPE is “definitely seeing impact” from potential clients choosing public cloud-computing environments.

If HP Inc. can’t get both its legacy consumer businesses to grow at the same time, and HP Enterprise can’t convince companies to at least partly eschew public cloud services like Amazon Web Services (AMZN), growth will be nearly impossible for either company. That leaves Whitman and Weisler with only more spinouts, divestitures, layoffs and stock buybacks to distract from their shrinking revenue.

Read more


Microsoft embraces open source in the cloud and on-premises


Microsoft has offered multiple flavors of Linux on its Azure public cloud platform and infrastructure for several years now.

“Microsoft loves Linux,” Microsoft CEO Satya Nadella said during the 2014 announcement of new Azure services. “Twenty percent of Azure is already Linux. We will always have first-class support for Linux [distributions].”

Microsoft took that love another step last week. In a move that would have been stunning more than a decade ago, it joined The Linux Foundation — which sponsors the work of Linux creator Linus Torvalds and plays a central role in the promotion of open source software — as a platinum sponsor.

“As a cloud platform company, we aim to help developers achieve more using the platforms and languages they know,” Scott Guthrie, executive vice president, Microsoft Cloud and Enterprise Group, said in a statement last week. “The Linux Foundation is home not only to Linux, but many of the community’s most innovative open source projects. We are excited to join The Linux Foundation and partner with the community to help developers capitalize on the shift to intelligent cloud and mobile experiences.”

The reason, Microsoft’s Kumar says, is simple: In the messy, real world of enterprise IT, hybrid shops are the norm and customers don’t need or want vendors to force their hands when it comes to operating systems. Serving these customers means giving them flexibility.

That philosophy has spread from Microsoft’s cloud business to its on-premises infrastructure business as the company seeks to make support for hybrid environments a key differentiator of its cloud and on-premises offerings (an idea Nadella pushed as Microsoft’s executive vice president of Cloud and Enterprise before his ascension to CEO). Last week, Joseph Sirosh, corporate vice president of the Data Group at Microsoft, announced that the next release of SQL Server would, for the first time, support Linux.

“Now you can also develop applications with SQL Server on Linux, Docker or macOS (via Docker) and then deploy to Linux, Windows, Docker, on-premises or in the cloud,” Sirosh wrote in a blog post. “This represents a major step in our journey to making SQL Server the platform of choice across operating systems, development languages, data types, on-premises and in the cloud.”

Kumar adds that customers tell Microsoft, “I want to use SQL and don’t care about what’s underneath it. I don’t want to worry about it, I just want to know that whenever I want to install SQL, I have the choice to do that.”

All major features of the SQL Server relational database engine are coming to Linux, Sirosh said, including advanced features such as in-memory online transactional processing (OLTP), in-memory columnstores, Transparent Data Encryption, Always Encrypted and Row-Level Security. There will be native Linux installations with familiar RPM and APT packages for Red Hat Enterprise Linux, Ubuntu Linux and SUSE Linux Enterprise Server. He noted that the public preview of the next release of SQL Server, in both Windows and Linux flavors, will be available on Azure Virtual Machines and as images on Docker Hub.
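As an illustration of the Docker Hub route Sirosh mentions, pulling and running the SQL Server preview image looked roughly like this at the time (the image name `microsoft/mssql-server-linux` and the required environment variables reflect the 2016 preview and are stated here as assumptions; later releases moved to Microsoft’s own registry):

```shell
# Pull the SQL Server on Linux preview image from Docker Hub
docker pull microsoft/mssql-server-linux

# Run it: the preview required accepting the EULA and setting a
# strong SA password via environment variables, and listened on
# the standard SQL Server port 1433
docker run -d --name sql1 \
  -e "ACCEPT_EULA=Y" \
  -e "SA_PASSWORD=YourStrong!Passw0rd" \
  -p 1433:1433 \
  microsoft/mssql-server-linux
```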

“I’m excited about Microsoft as a company truly embracing choice,” Kumar says. “We’re clearly seeing the base getting energized in a big way. People are giving us a chance again.”

Read more


Hybrid Cloud Storage Use to Double in Next 12 Months


The use of hybrid cloud storage will accelerate rapidly over the next 12 months, according to research published today by Cloudian, Inc., a provider of cloud-compatible object storage systems.

Across 400 organisations surveyed in the UK and USA, 28% already use hybrid cloud storage, with a further 40% planning to implement within the next year. Only 19% have no plans to adopt.

Organisations are looking to hybrid cloud storage to support a variety of workloads. Backup is the most popular use case, with 64% of respondents reporting deployment or plans to deploy. Web infrastructure (52%), application dev/test (48%) and technical applications (43%) are also driving the adoption of hybrid cloud storage products and services.

The research reveals that larger organisations (2,500 employees or more) are adopting the approach most rapidly, with 82% planning to deploy in the next 12 months.

Decisions about whether to adopt hybrid cloud storage are being driven by multiple factors such as external and internal data governance rules. 59% of respondents report that not all of their data can go to the public cloud, and that more than half of their data must remain on site. Most commonly cited among the data types that must remain on premises are financial data and customer records. Reasons named most commonly are security, governance and compliance rules, driven by both internal policy and external regulation.

When considering a hybrid cloud storage strategy, concerns about interoperability between on-premises and public cloud storage (40%) are only exceeded by those around security (62%) and cost (55%). 76% of respondents moving to hybrid cloud storage have yet to decide which interface to adopt.


Read more