Data Efficiency in the News

More than 50 Percent of Businesses Not Leveraging Public Cloud

| tmcnet.com

While more than 50 percent of respondents are not currently leveraging public cloud, 80 percent plan on migrating more within the next year, according to a new study conducted by TriCore Solutions, the application management experts. As new streams of data are continuing to appear, from mobile apps to artificial intelligence, companies in the future will rely heavily on cloud and digital transformation to minimize complexity.

Here are some key results from the survey:

  • Public Cloud Considerations: Cloud initiatives are underway for companies in the mid-market up through the Fortune 500, though IT leaders continue to struggle with what to migrate, when to migrate, and how best to execute the process. More than half of those surveyed plan to migrate web applications and development systems to the public cloud in the next year, prioritizing these over other migrations. More than two thirds have 25 percent or less of their infrastructure in the public cloud, showing that public cloud still has far to go before it becomes the prevailing environment that IT leaders must manage. With increasingly complex hybrid environments, managed service providers will become a more important resource to help facilitate the process.
  • Running Smarter Still on Prem: Whether running Oracle EBS, PeopleSoft, or another platform, companies rely on ERP systems to run their businesses. Only 20 percent of respondents expect to migrate ERP systems to the public cloud in the next year, indicating the importance of hybrid cloud environments that let companies manage business-critical operations on premises alongside other applications and platforms in the public cloud.
  • Prepping for Digital Transformation: With the increased amount of data in today’s IT environment – from machine data to social media data to transactional data and everything in between – the need for managed service providers to make sense of it all has never been more important. 53 percent of respondents plan on outsourcing their IT infrastructure in the future, and respondents anticipate a nearly 20 percent increase in applications being outsourced in the future, as well.

As worldwide spending on cloud continues to grow, and as the amount of data in today’s IT environment increases, IT leaders need to weigh the keys to IT success carefully when migrating to a cloud-based environment. Understanding how to help businesses unlock and leverage the vast data available to them will drive IT success for managed service providers in 2017 and beyond.

 

Read more

Worldwide Enterprise Storage Market Sees Modest Decline in Third Quarter, According to IDC

| idc.com

Total worldwide enterprise storage systems factory revenue was down 3.2% year over year and reached $8.8 billion in the third quarter of 2016 (3Q16), according to the International Data Corporation (IDC) Worldwide Quarterly Enterprise Storage Systems Tracker. Total capacity shipments were up 33.2% year over year to 44.3 exabytes during the quarter. Revenue growth increased within the group of original design manufacturers (ODMs) that sell directly to hyperscale datacenters. This portion of the market was up 5.7% year over year to $1.3 billion. Sales of server-based storage were relatively flat, at -0.5% during the quarter and accounted for $2.1 billion in revenue. External storage systems remained the largest market segment, but the $5.4 billion in sales represented a decline of 6.1% year over year.

“The enterprise storage market closed out the third quarter on a slight downturn, while continuing to adhere to familiar trends,” said Liz Conner, research manager, Storage Systems. “Spending on traditional external arrays resumed its decline and spending on all-flash deployments continued to see good growth and helped to drive the overall market. Meanwhile the very nature of the hyperscale business leads to heavy fluctuations within the market segment, posting solid growth in 3Q16.”

Read more

In-memory de-duplication technology to accelerate response for large-scale storage

| Phys.org

Fujitsu Laboratories Ltd. today announced the development of a high-speed in-memory data deduplication technology for all-flash arrays, which are large-scale, high-speed storage systems and use multiple flash devices such as solid-state drives. This technology enables the production of storage systems with up to twice the response speed when writing data, compared to previous methods.

In recent years, all-flash arrays have incorporated deduplication technology that consolidates duplicate data into a single copy before writing it to a flash device, in order to make the most of the limited capacity of flash devices. However, because the system must search for duplicate data across multiple flash devices over a network each time it writes data, write response speeds drop as storage devices grow in capacity and increase in speed.

Fujitsu Laboratories has developed a new method that accelerates response by executing deduplication after the data is written. Because the new method can in some cases write data to memory twice, increasing communications volume and lowering overall processing performance, Fujitsu Laboratories has also developed technology that automatically switches between the new and previous methods as operational conditions require.
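The write-then-deduplicate idea can be sketched as follows. This is an illustrative Python sketch under assumed names, not Fujitsu's implementation: writes are acknowledged before any duplicate search, and a deferred pass later collapses identical blocks into a single stored copy.

```python
import hashlib


class PostWriteDedupStore:
    """Hypothetical sketch of post-write deduplication: the duplicate
    search is deferred so it never sits on the write latency path."""

    def __init__(self):
        self.staged = {}    # block_id -> raw data, awaiting deduplication
        self.store = {}     # sha256 digest -> single stored copy
        self.blockmap = {}  # block_id -> digest, filled in by dedup_pass
        self._next_id = 0

    def write(self, data: bytes) -> int:
        """Fast path: stage the block and acknowledge immediately."""
        block_id = self._next_id
        self._next_id += 1
        self.staged[block_id] = data
        return block_id

    def dedup_pass(self) -> None:
        """Deferred pass: collapse staged blocks into unique copies."""
        for block_id, data in list(self.staged.items()):
            digest = hashlib.sha256(data).hexdigest()
            self.store.setdefault(digest, data)  # keep one copy per digest
            self.blockmap[block_id] = digest
            del self.staged[block_id]

    def read(self, block_id: int) -> bytes:
        if block_id in self.staged:  # written but not yet deduplicated
            return self.staged[block_id]
        return self.store[self.blockmap[block_id]]
```

In a real array the deferred pass is where the extra memory writes and communications volume arise, which is why an automatic fallback to inline deduplication can pay off under some workloads.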

This means that response speeds can be increased by up to two times, improving the response of virtual desktop services and reducing database processing times.

 

Using the newly developed technology, Fujitsu Laboratories achieved a lowest latency of about half that of the previous method in fio benchmarks, and was able to increase data write speeds to an all-flash array by up to two times. For example, applications such as virtual desktops and database processing require high write speeds and, because they access small files in enormous volumes, contain many duplicates. In such situations, user applications on the service could be sped up, improving the user experience. In addition, by applying this system to back-end storage for operations databases, operations systems could be sped up, enabling further consolidation of IT infrastructure.

Fujitsu Laboratories will continue development of technologies to further accelerate all-flash arrays going forward, aiming to incorporate them into Fujitsu Limited’s storage products from fiscal 2017 or later.

 

Read more

WW Enterprise Storage Market Down 3% in 3Q16 From 3Q15

| storagenewsletter.com

Total WW enterprise storage systems factory revenue was down 3.2% year over year and reached $8.8 billion in 3Q16, according to the IDC Worldwide Quarterly Enterprise Storage Systems Tracker.

Total capacity shipments were up 33.2% year over year to 44.3 EBs during the quarter.

Revenue growth increased within the group of original design manufacturers (ODMs) that sell directly to hyperscale datacenters. This portion of the market was up 5.7% year over year to $1.3 billion.

Sales of server-based storage were relatively flat, at -0.5% during the quarter and accounted for $2.1 billion in revenue. External storage systems remained the largest market segment, but the $5.4 billion in sales represented a decline of 6.1% year over year.

“The enterprise storage market closed out the third quarter on a slight downturn, while continuing to adhere to familiar trends,” said Liz Conner, research manager, storage systems. “Spending on traditional external arrays resumed its decline and spending on all-flash deployments continued to see good growth and helped to drive the overall market. Meanwhile the very nature of the hyperscale business leads to heavy fluctuations within the market segment, posting solid growth in 3Q16.”

Read more

Digitally Advanced Traditional Enterprises Are Eight Times More Likely to Grow Share

| Stock Market

Bain & Company and Red Hat (NYSE: RHT), the world’s leading provider of open source solutions, today released the results of joint research aimed at determining how deeply enterprises are committed to digital transformation and the benefits these enterprises are seeing. The research report, For Traditional Enterprises, the Path to Digital and the Role of Containers, surveyed nearly 450 U.S. executives, IT leaders and IT personnel across industries and found that businesses that recognize the potential for digital disruption are looking to new digital technologies – such as cloud computing and modern app development – to increase agility and deliver new services to customers while reducing costs. Yet, strategies and investments in digital transformation are still in their earliest stages.

For those survey respondents that have invested in digital, the technology and business results are compelling. Bain and Red Hat’s research demonstrates that those using new technologies to digitally transform their business experienced:

  • Increased market share. These enterprises are eight times more likely to have grown their market share, compared to those in the earliest stages of digital transformation.
  • Delivery of better products in a more timely fashion through increased adoption of emerging technologies – as much as three times faster than those in the earlier stages of digital transformation.
  • More streamlined development processes, more flexible infrastructure, faster time to market and reduced costs by using containers for application development.

Despite the hype, however, even the most advanced traditional enterprises surveyed still score well below start-ups and emerging enterprises that have embraced new technologies from inception (digital natives). According to the survey results, nearly 80 percent of traditional enterprises score below 65 on a 100-point scale that assesses how these organizations believe they are aligning digital technologies to achieve business outcomes. Ultimately, the report reveals that the degree of progress among respondents moving towards digital transformation varies widely, driven in part by business contexts, actual IT needs and overall attitudes towards technology. It also uncovers some common themes in the research.

As companies progress on their digital adoption journey, they typically invest in increasingly more sophisticated capabilities in support of their technology and business goals. The use of modern application and deployment platforms represents the next wave of digital maturity and is proving to be key in helping companies address their legacy applications and infrastructure.

Containers are one of the most high-profile of these development platforms and a technology that is helping to drive digital transformation within the enterprise. Containers are self-contained environments that allow users to package and isolate applications with their entire runtime dependencies – all of the files necessary to run on clustered, scale-out infrastructure. These capabilities make containers portable across many different environments, including public and private clouds.

While the opportunities created by these emerging technologies are compelling, the speed and path of adoption for containers is somewhat less apparent, according to the Bain and Red Hat report. The biggest hurdles standing in the way of widespread container use according to respondents are common among early stage technologies – lack of familiarity, talent gaps, hesitation to move from existing technology and immature ecosystems – and can often be overcome in time. Vendors are making progress to address more container-specific challenges, such as management tools, applicability across workloads, security and persistent storage, indicating decreasing barriers to adoption.

Read more

Focus on Hybrid Cloud Is Opportune

| marketrealist.com

Hybrid cloud technology provides its users with the opportunity to place their workloads into environments according to suitability. By combining public and private cloud services, users can maximize the benefits of cloud migration.

Why IBM’s Focus on Hybrid Cloud Is Opportune

Though the private cloud lags behind the public cloud in terms of adoption, it’s not far behind. In 2015, the hybrid and private cloud spaces registered 45% growth, compared to the 51% growth in the public cloud space.

This growth was evident in RightScale’s 2016 report, which stated that private cloud adoption had risen from 63% in 2015 to 77% in 2016. As more companies add private cloud technology, the likelihood of hybrid cloud adoption should continue to improve. The rise in private cloud adoption drove hybrid cloud adoption up from 58% in 2015 to 71% in 2016, on a YoY (year-over-year) basis.

Scalability and flexibility are driving hybrid cloud adoption

According to Lifehacker, a recent survey conducted by Microsoft (MSFT) of 1,200 information technology leaders across the Asia-Pacific (VPL) region, including Australia (EWA), showed that 40% of Australian organizations were either already using or readying themselves to use hybrid cloud solutions. This number is estimated to rise to 49% within the next one to one-and-a-half years.

It’s the scalability and flexibility offered by the hybrid cloud that enables an enterprise to develop an automated, secure, and integrated computing environment in a cost-effective manner.

Hybrid cloud offers a combination of the features of two or more cloud models—private, community, and public. The hybrid cloud provides the control and security of a private cloud along with the versatility, user friendliness, and cost-effectiveness of a public cloud. It’s preferred by organizations because even when using cloud services, they still retain control over their data. Thus, hybrid cloud technology offers more control, a more efficient cost structure, reduced risk, and better performance. These factors likely play a significant role in the hybrid cloud’s increased adoption.

Read more

HPE sees Synergy in hybrid cloud infrastructure

| PC World Australia

HPE originally pitched its Synergy line of “composable” IT infrastructure as a way to bring the flexibility of cloud services to on-premises systems. Now it’s turning that story around, putting those same Synergy components — and some new ones — into the public cloud with the goal of simplifying hybrid IT management.

The new components of Synergy made their debut in London on Tuesday, at HPE Discover, an event for the company’s customers and partners.

Among the new offerings are a software update for the HPE Hyper Converged 380 server, and a new version of HPE Helion CloudSystem. Both incorporate new cloud management functions intended to simplify the automation of repetitive tasks. There are also two new ways to pay for it all, HPE Dynamic Usage for Hyper Converged Systems, and HPE Flexible Capacity Service, and some deft financial engineering to move some of the business risks onto partners.

HPE’s goal is to allow IT departments to act as service providers for their organization, rather than maintaining infrastructure.

“The most efficient way to deal with the cloud point of view is to take it from an application perspective, start with the workload and derive the infrastructure to support it,” said Matt Foley, HPE’s director of cloud presales in Europe, the Middle East and Africa.

Read more

Rackspace Achieves AWS Premier Consulting Partner Status in the AWS Partner Network

| Marketwired

Rackspace® today announced that it has achieved Premier Partner status as a Consulting Partner within the Amazon Web Services® (AWS) Partner Network (APN). This designation is the highest level in the APN, recognizing APN Partners that have made significant investments to develop the technical resources and AWS expertise necessary to deploy and manage customer solutions on the AWS Cloud. Customers can tap into this valuable expertise through Fanatical Support® for AWS architects and engineers, who collectively hold more than 600 AWS professional and associate certifications across the globe.

To qualify for the AWS Premier Consulting Partner tier, Partners must meet requirements that demonstrate the scale of their AWS expertise, capabilities and engagement in the AWS Ecosystem. Rackspace has been an APN Advanced Consulting Partner since the launch of Fanatical Support for AWS in October 2015. Its global base of knowledge spans all five AWS technical certifications, including Solutions Architect Associate, Developer Associate, SysOps Administrator Associate, DevOps Engineer Professional, and Solutions Architect Professional. Rackspace has also demonstrated expertise across different types of AWS workloads and has achieved AWS Competencies for DevOps and Marketing & Commerce.

“We are proud to be recognized as a Premier Consulting Partner in the APN,” said Jeff Cotten, senior vice president of AWS at Rackspace. “Since the launch of Fanatical Support for AWS in late 2015, we have been focused on helping our customers maximize the value of their AWS investments by developing our expertise and capabilities on AWS. Our team has worked so hard to achieve Premier Partner status in such a short time, and we look forward to continuing to build on our ability to provide Fanatical Support for AWS to our customers.”

Read more

Debunking the multi-cloud myths

| Information Age

As cloud adoption continues to grow, so has organisations’ use of multi-cloud. But as this technology has emerged so have some myths surrounding it.

A recent report from Rightscale has revealed that businesses are now using an average of six separate clouds.

Salesforce offered cloud as a service back in 1999. The technology has been around for a while, despite many believing it is still a fairly recent innovation.

The majority of businesses are using the cloud to improve agility and flexibility across operations.

Such are the benefits the cloud can bring that businesses are increasingly turning to more than one: a multi-cloud.

Multi-cloud is an environment where applications are deployed across two or more cloud platforms.

It is evident that businesses benefit from its use (and from the cloud in general), gaining higher performance and cost efficiency by choosing a configuration of cloud platforms and technologies tailored to their needs.

As this ‘newish’ trend of multi-cloud gains popularity there are still those organisations that choose not to use it based on the challenges they think adopting multi-cloud will bring.

Myths have started to emerge around what businesses will find challenging about multi-cloud services, which is preventing CIOs from understanding what the multi-cloud is, or how it can benefit the wider business.

The myths

“Multi-cloud has for too long been the sleeping giant of the cloud computing world, with many IT leaders misinterpreting its meaning and therefore believing it doesn’t exist – even within their own organisations.”

“With businesses often ending up employing multi-cloud by accident due to other departments employing cloud services without their knowledge, it is crucial that multi-cloud is understood and managed. If it is left for too long, it can cause headaches further down the line in terms of security and compliance.”

You need to be a big business to really benefit from multi-cloud

The myth that multi-cloud is only for big businesses is perpetuated, quite rightly, by the assumption that in bigger organisations there is a multitude of differing opinions on what employees want to use.

It is therefore a natural evolution for them to have more than one cloud service. With this comes the benefits of a multi-vendor strategy, such as cost savings, more innovation and risk management.

Such benefits are however, not just for the big players.

Using multi-cloud is less secure

More clouds, more problems? This myth centres on the fact that, with the increasing complexity of multiple clouds, comes a greater risk of security issues.

But this is not necessarily true if well managed. Instead, it’s worth looking at it from the opposing view: how can a multi-cloud strategy help you be more secure and compliant?

Isn’t it the same as hybrid cloud?

Not at all. Multi-cloud helps to describe an increasingly common architecture and typically implies several key distinctions from the other commonly used term, ‘hybrid cloud’.

Although some commentators and analysts still use the terms interchangeably, hybrid cloud is actually a specific type of multi-cloud architecture.

The technical expertise required for multi-cloud is the biggest barrier to having a comprehensive strategy

It’s true that learning the ins and outs of the infrastructure and lingo of more than one cloud can be challenging, especially for a smaller company.

Meanwhile, bigger companies are faced with tremendous competition to retain the specialised engineers and architects who are versed in multi-cloud, meaning that even they often struggle to keep the required skills.

However, this isn’t a burden that IT departments need to shoulder alone and it doesn’t need to get in the way of benefiting from multiple clouds.

A first step should be auditing specific cloud services employees are using.

Based on the outcome you can assess the level of expertise that already exists in the business and find out where the gaps are.

From there you can also look externally and determine whether or not you need cloud brokers or managed cloud providers to execute and manage it successfully.

This can remove the burden from teams, freeing up time to focus on activities that help drive the business forward.

Read more

Docker and Canonical Partner on CS Docker Engine for Millions of Ubuntu Users

| Yahoo UK & Ireland Finance

Docker and Canonical today announced an integrated Commercially Supported (CS) Docker Engine offering on Ubuntu, providing Canonical customers with a single path for support of the Ubuntu operating system and CS Docker Engine in enterprise Docker operations.

This commercial agreement provides for a streamlined operations and support experience for joint customers. Stable, maintained releases of Docker will be published and updated by Docker, Inc., as snap packages on Ubuntu, enabling direct access to the Docker, Inc. build of Docker for all Ubuntu users. Canonical will provide Level 1 and Level 2 technical support for CS Docker Engine, backed by Docker, Inc. providing Level 3 support. Canonical will ensure global availability of secure Ubuntu images on Docker Hub.

Ubuntu is widely used as a devops platform in container-centric environments. “The combination of Ubuntu and Docker is popular for scale-out container operations, and this agreement ensures that our joint user base has the fastest and easiest path to production for CS Docker Engine devops,” said John Zannos, Vice President of Cloud Alliances and Business Development, Canonical.

CS Docker Engine is a software subscription to Docker’s flagship product backed by business day and business critical support. CS Docker Engine includes orchestration capabilities that enable an operator to define a declarative state for the distributed applications running across a cluster of nodes, based on a decentralized model that allows each Engine to be a uniform building block in a self-organizing, self-healing distributed system.
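The declarative, self-healing model described above can be illustrated with a small sketch (hypothetical Python, not Docker's actual engine code): each engine compares the declared desired state against what is actually running and emits the corrective actions needed to converge.

```python
def reconcile(desired: dict, actual: dict) -> list:
    """Illustrative sketch of declarative reconciliation: given desired
    replica counts per service and the counts actually running, return
    the (action, service, count) steps that would make them converge."""
    actions = []
    for service, want in sorted(desired.items()):
        have = actual.get(service, 0)
        if have < want:
            actions.append(("start", service, want - have))  # scale up
        elif have > want:
            actions.append(("stop", service, have - want))   # scale down
    # Anything running that is no longer declared should be stopped.
    for service, have in sorted(actual.items()):
        if service not in desired:
            actions.append(("stop", service, have))
    return actions
```

Running such a loop continuously on every node is what makes the system self-organizing: a crashed replica simply shows up as a gap between desired and actual state on the next pass.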

 

Read more