Data Efficiency in the News


Ubuntu OpenStack, Ceph Come to ARM Servers


ARM officials took a step forward in their effort to build a software ecosystem around the company's data center push when Canonical announced that its Ubuntu OpenStack and Ceph offerings are now commercially available on servers powered by ARM’s 64-bit chip architecture.

Officials with both companies made the announcement Oct. 17, giving ARM more support in its strategy to become the primary alternative to Intel’s x86-based processors in data center systems. Canonical officials said there is increasing demand from users of its open-source Ubuntu cloud and storage software for more options in the data center hardware they’re running. The Ubuntu Linux operating system already runs on the ARM architecture.

“We have seen our [telecommunications] and enterprise customers start to radically depart from traditional server design to innovative platform architectures for scale-out compute and storage,” Mark Baker, product manager for OpenStack at Canonical, said in a statement. “In partnering with ARM, we bring more innovation and platform choice to the marketplace.”

Baker said the “next generation of scale-out applications are causing our customers to completely revisit compute and storage architectures with a focus on scale and automation. The ARM and Canonical ecosystems offer more choice in data center solutions with a range of products that can be optimized to run standard server software and the next generation of applications.”

The focus of the new Ubuntu effort will be on scale-out computing environments in the data center and cloud. The two companies will work with Ubuntu certified system-on-a-chip (SoC) companies, OEMs and original-design manufacturers (ODMs) to encourage the development of production-grade servers, storage platforms and networking gear that run on the 64-bit ARMv8-A architecture and are offered with Ubuntu Advantage support, officials said.

In a statement, Lakshmi Mandyam, senior marketing director of ARM’s server program, said the chip designer wanted to make sure to have “the best OpenStack and Ceph storage solutions and enterprise-grade support available. The commercial availability of Ubuntu OpenStack and Ceph is another milestone that demonstrates open-source software on ARM is ready for deployment now. The ARM and Canonical ecosystems can now simply write once and deploy anywhere on ARM-based servers.”

“I’m comfortable with where we are at this point,” Mandyam said last year. “There are a lot of proofs-of-concept going on with ARM.”

The announcement by Canonical and ARM comes a week before the chip designer kicks off its TechCon 2016 show in Santa Clara, California.

ARM officials have said they are confident that adoption will begin to ramp up next year. Hewlett Packard Enterprise, Dell, Lenovo and supercomputer maker Cray are among the OEMs that are testing ARM-based chips in servers.

Read more


Gartner Reveals Top Predictions for IT Organizations and Users in 2017 and Beyond

| Technology Research

Gartner, Inc. today revealed its top predictions for 2017 and beyond. Gartner’s top predictions for 2017 examine three fundamental effects of continued digital innovation: experience and engagement, business innovation, and the secondary effects that result from increased digital capabilities.

“Gartner’s top strategic predictions continue to offer a provocative look at what might happen in some of the most critical areas of technology evolution. At the core of future outcomes is the notion of digital disruption, which has moved from an infrequent inconvenience to a consistent stream of change that is redefining markets and entire industries,” said Daryl Plummer, managing vice president, chief of research and Gartner Fellow. “Last year, we said digital changes were coming fast. This year the acceleration continues and may cause secondary effects that have wide-ranging impact on people and technology.”

By 2020, 100 million consumers will shop in augmented reality.
The popularity of augmented reality (AR) applications, such as Pokémon GO, will help bring AR into the mainstream, prompting more retailers to incorporate it into the shopping experience. As mobile device usage becomes an ingrained behavior, further blurring the lines between the physical and digital worlds, brands and their retail partners will need to develop mechanisms to leverage this behavior to enhance the shopping experience. Using AR applications to layer digital information — text, images, video and audio — on top of the physical world represents one such route to deeper engagement, both in-store and in other locations. For example, a consumer pointing the IKEA catalog app at a room in his home can “place” furniture where he’d like it to go. This real-world element differentiates AR apps from those offering virtual reality (VR).

By 2020, 30 percent of web browsing sessions will be done without a screen.

New audio-centric technologies, such as Google Home and Amazon’s Echo, are making access to dialogue-based information ubiquitous and spawning new platforms based on “voice-first” interactions. By eliminating the need to use one’s hands and eyes for browsing, vocal interactions extend the utility of web sessions to contexts such as driving, cooking, walking, socializing, exercising and operating machinery. As a result, the share of waking hours devoid of instant access to online resources will approach zero.

By 2019, 20 percent of brands will abandon their mobile apps.

Many brands are finding that the level of adoption, customer engagement and return on investment (ROI) delivered by their mobile applications are significantly less than the expectations that underpinned their app investment. New approaches are emerging that have a lower barrier to discovery and install, and offer levels of engagement that approach those of applications at a fraction of the investment, support and marketing cost. Many companies will evaluate these experiences against their under-performing applications and opt to reduce their losses by allowing their apps to expire.  

By 2020, algorithms will positively alter the behavior of more than 1 billion global workers.
Contextualization algorithms have advanced rapidly to include a variety of behavioral interventions drawn from fields such as psychology, social neuroscience and cognitive science. Human beings tend to be emotionally charged and factually drained, which can cause them to act irrationally. Algorithms can positively alter that behavior by augmenting workers' intelligence with a large collective memory bank containing knowledge that has been socialized and put to the test. This will help workers “remember” anything, or be informed just in time of knowledge they have never even experienced, leaving them free to complete the task at hand objectively and to better appreciate life as it unfolds. Use of algorithms can raise alarms of “creepiness”; however, when used to effect positive outcomes, it can bring about changes across multiple industries.

By 2022, a blockchain-based business will be worth $10 billion.
Blockchain technology is established as the next revolution in transaction recording. A blockchain ledger provides an immutable, shared view of all transactions between engaging parties. Parties can therefore immediately act on a committed blockchain record, secure in the knowledge that it cannot be changed. Any kind of value exchange can happen in minutes, not days. Blockchain applications can free up cash, reduce transaction costs, and accelerate business processes. While blockchain development is still immature, it is attracting product and capital investment.

By 2021, 20 percent of all activities an individual engages in will involve at least one of the top-seven digital giants.
The current top-seven digital giants by revenue and market capitalization are Google, Apple, Facebook, Amazon, Baidu, Alibaba and Tencent. As the physical, financial and healthcare world becomes more digital, many of the activities an individual engages in will be connected. This convergence means that any activity could include one of the digital giants. Mobile apps, payment, smart agents (e.g., Amazon Alexa), and digital ecosystems (e.g., Apple HomeKit, WeChat Utility and City Services) will make the digital giants part of many of the activities we do.

Through 2019, every $1 enterprises invest in innovation will require an additional $7 in core execution.
For many enterprises, adopting a bimodal IT style to jump-start innovation has been a priority and a critical first step. Close alignment of Mode 1 and Mode 2 teams is crucial to the realization of digital business goals. Unfortunately, the deployment costs of the Mode 2 “ideated solution” are not necessarily considered during ideation, and for most, the Mode 1 costs are not factored into the initial funding. Designing, implementing, integrating, operationalizing, and managing the ideated solution can cost significantly more than the initial innovation itself. Thus, Gartner anticipates that for every $1 spent on the digital innovation/ideation phase, enterprises will spend on average $7 deploying the solution.

Through 2020, IoT will increase data center storage demand by less than 3 percent.
The Internet of Things (IoT) has enormous potential for data generation across the roughly 21 billion endpoints expected to be in use in 2020. Of the roughly 900 exabytes worth of data center hard-disk drive (HDD) and solid-state drive (SSD) capacity forecast to ship in 2020, IoT discrete sensor storage will represent only 0.4 percent, with storage from multimedia sensors consuming another 2 percent, for a rounded total of 2.3 percent. This indicates that IoT can scale and deliver important data-driven business value and insight, while remaining manageable from a storage infrastructure standpoint. 

By 2022, IoT will save consumers and businesses $1 trillion a year in maintenance, services and consumables.
The IoT holds enormous promise in reducing the cost of maintenance and consumables. The challenge lies in providing a secure, robust implementation that can deliver savings over one or two decades, without driving management costs that absorb any savings made. This could be an inexpensive monitoring system based on simple sensors that report defining characteristics to analytical servers. The analytics are used to spot patterns in the fleet data, and recommend maintenance based on actual usage and condition, not based on elapsed time or estimated condition. At the other extreme, there is the rise of the digital twin. The digital twin captures near real-time data feeds from its sensor-enhanced real-world twin, and uses this along with other data sources (e.g., weather, historian data, algorithms, smart machine analysis) to update its simulation to reflect the physical state of the twin. 

By 2020, 40 percent of employees can cut their healthcare costs by wearing a fitness tracker.

Companies will increasingly appoint fitness program managers to work closely with human resource leaders to include fitness trackers in wellness programs as part of a broader employee engagement initiative. Healthcare providers can save lives and downstream costs by acting on the data from wearable fitness trackers that show health risks to the user. Wearables provide a wealth of data to be analyzed either in real-time or in retrospect with the potential for doctors and other healthcare professionals to have access to both contextual and historical information, if the patient agrees to share it. 

Gartner clients can read more in the report “Top Strategic Predictions for 2017 and Beyond: Surviving the Storm-Winds of Digital Disruption.”


Read more


VMware Decides to Move Data Center Cloud Over to AWS


VMware and Amazon Web Services, major IT players that often aren’t mentioned in the same sentence because many of their products compete in the same markets, revealed a new partnership Oct. 13 that will result in all of VMware’s cloud infrastructure software being hosted on AWS.

For the IT and software development sectors, the deal means that VMware mainstays—its software-defined data center products such as vCenter, NSX, vSphere, VSAN and others—will run on AWS instead of VMware’s own cloud. Like any other cloud deployment, the partnership enables VMware to focus on developing its products and not have to deal with the issues around hosting them, which has never been its primary business.

VMware Cloud on AWS will be run, marketed and supported by VMware, like most typical cloud deployments. However, the service will be integrated with AWS’s own cloud portfolio, which provides computing, databases, analytics and a few different levels of storage, among other features.

VMware Cloud on AWS is a jointly architected service that represents a significant investment in engineering, operations, support and sales resources from both companies. It will run on dedicated AWS infrastructure.

Mark Lohmeyer, VMware’s vice president of products in the Cloud Platform Business Unit, listed the following as the key benefits of the new service:

  • Best-in-class hybrid cloud capabilities: Features enterprise-class application performance, reliability, availability and security with the VMware technologies optimized to run on AWS.
  • Operationally consistent with vSphere: With VMware Cloud on AWS, a private data center integrated with the AWS public cloud can be operated using the same vCenter UIs, APIs and CLIs that IT managers already know.
  • Seamless integration with AWS services: Virtual machines running in this environment will have access to AWS’ broad set of cloud-based services, including storage, database, analytics and more.
  • Seamless workload portability: Full VM compatibility and total workload portability between the data center and the AWS cloud is part of the deal.
  • Elastically scalable: The service will let users scale capacity according to their needs; capacity can be scaled up and down by adding or removing hosts.
  • No patching or upgrades: The service will remove the burden of managing the software patch, update and upgrade life cycle for the user.
  • Subscription-based consumption: Customers will be able to purchase dedicated clusters that combine VMware software and AWS infrastructure, either on-demand or as a subscription service.

Read more

Next Generation Data Storage Technologies Market Forecast Up to 2024


Next generation data storage technology encompasses technologically advanced data storage products and solutions designed to deal with growing file sizes and huge amounts of unstructured data. It manages large volumes of data securely and enables reliable, fast and cost-efficient recovery of data. It has enabled scalable storage and handling of the large data volumes generated by big enterprises.

The factors favoring the growth of the next generation storage technologies market include the ubiquity of input and output devices in every sector and the ever-increasing need for managing, analyzing and storing huge amounts of data. Consequently, demand for next generation data storage technologies is expected to increase at a quick rate over the forecast period. This growth is expected to be backed by growing demand for advanced, time-saving technologies including automated systems, smart technologies, online shopping and the Internet of Things, which require handling of the large volumes of data generated by enterprises.

There are various challenges restraining the growth of the next generation data storage technologies market, including technological complexity, repair and restore issues, and security concerns. Furthermore, a high level of data consistency is required in data storage. Future growth in the market is projected to come from the emerging need for data storage in small and medium enterprises.

The next generation data storage technologies market is segmented on the basis of technology and application. By technology, the market is classified into all-flash storage arrays, hybrid arrays, cloud-based disaster recovery, holographic data storage and heat-assisted magnetic recording. Of these, the hybrid array is a form of hierarchical storage management that combines solid-state drives and hard-disk drives for input/output speed improvements. Holographic data storage is a high-capacity data storage technology, whereas hybrid arrays and all-flash arrays are standard data storage techniques.

By application, the next generation data storage technologies market is divided into enterprise data storage, big data storage and cloud-based storage.

North America dominates the next generation data storage technologies market. Asia Pacific countries including China, Japan and India are expected to grow at a significant rate compared to other regions. The presence of a large number of IT companies in the Asia Pacific region is one of the key factors driving growth of the next generation data storage technologies market there. Asia Pacific countries are expected to make large investments in the data storage sector, providing their existing infrastructures with new data storage technologies and solutions to improve production processes. Japan, one of the most technologically advanced nations, is anticipated to be a big market for next generation data storage technologies; the country already uses these data storage technologies across its various industry verticals.

Some of the key players in the next generation data storage technology market are Dell Inc., Avago Technologies, EMC Corporation, Hewlett-Packard Development Company, L.P., HGST, Inc., Hitachi Data Systems, IBM Corporation, NetApp, Inc., Drobo, Inc. and Micron Technology Corporation.

Read more


WW Cloud IT Infrastructure Revenue Up 14.5% to $7.7 Billion in 2Q16 – IDC


According to the International Data Corporation‘s Worldwide Quarterly Cloud IT Infrastructure Tracker, vendor revenue from sales of infrastructure products (server, storage, and Ethernet switch) for cloud IT, including public and private cloud, grew by 14.5% year over year to $7.7 billion in 2Q16, ahead of renewed hyperscale growth expected in 2H16.

The overall share of cloud IT infrastructure sales climbed to 34.9% in 2Q16, up from 30.6% a year ago. Revenue from infrastructure sales to private cloud grew by 14.0% to $3.1 billion, and to public cloud by 14.9% to $4.6 billion. In comparison, revenue in the traditional (non-cloud) IT infrastructure segment decreased 6.1% year over year in the second quarter. Private cloud infrastructure growth was led by Ethernet switch at 49.4% year-over-year growth, followed by storage at 19.7%, and server at 8.9%. Public cloud growth was also led by Ethernet switch at 61.8% year-over-year growth, followed by server at 25.1% while storage revenue for public cloud declined 6.2% year over year. In traditional IT deployments, server declined the most (7.5% year over year) with Ethernet switch and storage declining 2.2% and 2.0%, respectively.

“As expected, the hyperscale slowdown continued in the second quarter of 2016,” said Kuba Stolarski, research director for computing platforms, IDC. “However, deployments to mid-tier and small cloud service providers showed strong growth, along with private cloud buildouts. In general, the second quarter did not have as difficult a compare to the prior year as the first quarter did, and this helped improve growth results across the board compared to last quarter. In 2H16, IDC expects to see strengthening in public cloud growth as key hyperscalers bring new datacenters online around the globe, continued strength in private cloud deployments, and declines in traditional, non-cloud deployments.”

Read more


Unified Deduplication at SNIA's Software Development Conference


Last week I attended the SNIA Software Development Conference (SDC) in Santa Clara, CA.  Some of the most interesting sessions included discussions about new erasure codes, new techniques for hardware accelerated data reduction, and several advancements developing storage stacks compatible with the latest technologies from Microsoft.  I led a session on Unified Deduplication at SDC. 

In a room packed to the brim with storage developers, my SDC talk focused on the differences between deduplication for backup and deduplication for primary storage.  Until recently, there weren’t any products that effectively addressed the needs of both markets.  In my presentation I discussed the core technologies behind deduplication, how workflow requirements differ between backup and primary use cases, and how to unify the two environments.  The gist of the presentation is that today’s backup software and special-purpose appliances are poorly suited to deliver deduplication for primary storage. Primary storage deduplication could be an outstanding solution for backup if it only had a mechanism for handling the data alignment patterns found in backup streams.  Because primary deduplication solutions, such as Permabit’s VDO device mapper target for Linux, generally work on fixed-size chunks (VDO uses 4 KB chunks), they have until now been unable to deliver deduplication benefits when dealing with unaligned streams of data.

With the introduction of the Optimizer file system capability in VDO6, unaligned streams of data can be deduplicated. Optimizer acts as a layered file system, sitting above standard Linux file systems such as ext4 and xfs.  When it sees a familiar format, such as a tar or zip archive, it automatically segments the data on intelligent boundaries, padding it so that it is 4 KB aligned.  If the file type cannot be identified, a generic rolling hash algorithm is used to locate the boundaries and pad the output. The resulting data and associated 4 KB-aligned metadata get written out into a file in the underlying file system.  When the 4 KB blocks hit the deduplication system below, the likelihood of locating matching blocks is increased, improving the amount of data reduction.
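For readers who want a concrete picture of the rolling-hash step, here is a minimal sketch in Python. It is a generic illustration of content-defined chunking, not Permabit's Optimizer code; the 48-byte window and 13-bit boundary mask are assumed values chosen for the example.

```python
BLOCK = 4096            # fixed dedup block size (VDO uses 4 KB blocks)
WINDOW = 48             # rolling-hash window size (an assumed value)
MASK = (1 << 13) - 1    # cut when low 13 hash bits are zero (~8 KB avg chunk)
PRIME = 31
POW = pow(PRIME, WINDOW, 1 << 32)   # PRIME**WINDOW mod 2**32

def boundaries(data: bytes):
    """Yield cut points chosen by a polynomial rolling hash over WINDOW bytes."""
    h = 0
    for i, byte in enumerate(data):
        h = (h * PRIME + byte) & 0xFFFFFFFF
        if i >= WINDOW:
            # Remove the byte that slid out of the window.
            h = (h - data[i - WINDOW] * POW) & 0xFFFFFFFF
            if (h & MASK) == 0:
                yield i + 1
    yield len(data)

def chunk_and_align(data: bytes):
    """Split at content-defined boundaries and pad each chunk to 4 KB."""
    chunks, start = [], 0
    for cut in boundaries(data):
        chunk = data[start:cut]
        if chunk:
            pad = (-len(chunk)) % BLOCK
            chunks.append(chunk + b"\x00" * pad)   # 4 KB-aligned output
        start = cut
    return chunks
```

Because the cut points are derived from the data itself, an insertion or deletion only disturbs the chunk it lands in; the boundaries resynchronize afterward, so a fixed-block deduplicator still finds matches downstream. A real implementation would also enforce minimum and maximum chunk sizes and record each chunk's pre-padding length as metadata so the original stream can be reconstructed.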

To demonstrate the benefits of Optimizer, I backed up my own 19.3 GB of Microsoft Office documents to a Linux system running VDO6 without Optimizer and saw 16% savings from compression. Then I backed up the same data to a VDO6 system running Optimizer and saw impressive savings of 68%!


When multiple full backup images are stored, the savings increase substantially.  A second full backup increases the savings to 84%, a third to 92%, a fourth to 96% and so on.  You can see the results summarized in the table below.  Even in environments with a high change rate, deduplication is a proven solution which yields 20:1 savings on backup in traditional business environments.
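The arithmetic behind that progression is straightforward: each additional near-identical full backup adds logical data but almost no new physical data, so the stored fraction shrinks roughly as 1/n. The sketch below is an idealized model, not Permabit's published methodology (the reported third and fourth figures climb slightly faster than this model, since real backup streams also gain from cross-version compression):

```python
def dedup_savings(first_backup_savings: float, n_backups: int) -> float:
    """Idealized total savings after n near-identical full backups.

    If the first backup physically stores (1 - s) of its logical size
    and each later full backup deduplicates almost entirely against it,
    the stored fraction shrinks as 1/n and savings approach 100%.
    """
    return 1.0 - (1.0 - first_backup_savings) / n_backups

# Starting from the 68% first-backup savings reported above:
for n in (1, 2, 4, 8):
    print(f"{n} full backup(s): {dedup_savings(0.68, n):.0%} savings")
# -> 68%, 84%, 92%, 96%
```

By the same arithmetic, the 20:1 figure cited for traditional backup environments corresponds to physically storing 5% of the logical data, i.e. 95% savings.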


But the most compelling fact is that VDO6 doesn’t just solve the backup problem; it addresses primary storage workloads as well.  The same modern server hardware can deliver up to 650,000 mixed random IOPS and backup rates of over 2 GB/s!  At those speeds, a single storage server could easily address two workloads simultaneously; for example, data warehousing and end-point protection (for desktops and mobile devices) could run on the same system.  That’s true flexibility for the data center!

Read more


Worldwide Converged Systems Market Revenue Up 12.1% Y/Y to $2.9 Billion in 2Q16


According to the International Data Corporation‘s Worldwide Quarterly Converged Systems Tracker, the worldwide converged systems market increased revenue 12.1% year over year to $2.9 billion during 2Q16.

The market generated 1,693 PB of new storage capacity shipments during the quarter, which was up 31.8% compared to the same period a year ago.

“More and more, we are seeing end users looking beyond the hardware specs and performance metrics; end users in this market are looking for vendors that are adept at building and maintaining deep customer relationships,” said Kevin M. Permenter, senior research analyst, computing platforms. “Converged system vendors that are able to build these deep relationships with their customers are positioned for growth.”

Converged Systems Segments
IDC distinguishes between four product categories: integrated systems, certified reference systems, integrated platforms, and hyperconverged systems.

  • Integrated systems are pre-integrated, vendor-certified systems containing server hardware, disk storage systems, networking equipment, and basic element/systems management software.
  • Like integrated systems, certified reference systems are pre-integrated, vendor-certified systems containing server hardware, disk storage systems, networking equipment, and basic element/systems management software. Certified reference systems, however, are designed with systems from multiple technology vendors.
  • Integrated platforms are integrated systems that are sold with additional pre-integrated packaged software and customized system engineering optimized to enable such functions as application development software, databases, testing, and integration tools.
  • Hyperconverged systems collapse core storage and compute functionality into a single, highly virtualized solution. A key characteristic of hyperconverged systems that differentiates these solutions from other integrated systems is their ability to provide all compute and storage functions through the same server-based resources.

During 2Q16, the combined integrated infrastructure and certified reference systems markets generated revenues of $1.6 billion, which represented a year-over-year increase of 4.7% and 54.8% of the total market revenue. EMC was the largest supplier of the combined certified reference systems and integrated infrastructure categories with $699.58 million in revenue, or 43.0% share of this market segment. All VCE product brands, which became the EMC converged platforms division, are reported under EMC starting with the first quarter of 2016 – all historical data from 2012 to 2015 will remain under the VCE vendor.

Integrated platform revenue declined 3.5% Y/Y during 2Q16, generating $864.09 million in revenues. This amounted to 29.1% of the total market revenue. Oracle was the top-ranked supplier of integrated platforms in the quarter, generating revenues of $481.97 million and capturing a 55.8% share of the category.

Hyperconverged sales grew 137.5% Y/Y during 2Q16, generating $480.62 million in revenue. This amounted to 16.2% of the total market revenue. The above hyperconverged systems revenues, like the other product categories listed here, exclude revenue from support and maintenance contracts.

Read more

Why NetApp’s Stock Is Worth $28


NetApp (NTAP) has seen weak demand for storage hardware over the last few years, corresponding to weak IT spending across the globe. NetApp’s storage product revenues have consistently fallen over the last four years, a trend mirrored across many large IT hardware, telecom hardware and storage hardware vendors. Competing storage systems manufacturers EMC (NYSE:EMC), Hewlett-Packard Enterprise (HPE), Hitachi Data Systems and IBM (IBM) have also witnessed low demand for storage hardware. As a result, storage systems manufacturers are shifting their focus to fast-growing market domains such as flash-based storage arrays, converged systems (which include servers, storage and networking equipment in one box) and software-defined storage to stay relevant. Moreover, it has become imperative for hardware vendors to enhance their focus on software solutions and post-sales hardware maintenance & services, given that they are higher-margin businesses and have had high customer demand over the years.

Below we take a look at key growth drivers for the company that justify our $28 price estimate for NetApp, which is around 15-20% lower than the current market price. NetApp’s stock price is up by over 30% since the beginning of the year.

Storage vendors are increasingly facing competition from so-called white box storage vendors. Over the last few years, customers have been shifting their preference to low-cost original design manufacturer (ODM) storage boxes, which is cutting into the addressable market for large vendors. As a result, NetApp’s share in the external storage systems market has fallen from over 13% in 2013 to 11.1% in 2015. This trend could continue in the coming years, with smaller vendors gaining share from large manufacturers.

Low product sales have led to discounted selling prices, which ultimately drove down product margins significantly. The adjusted gross margin for the product division has fallen from about 55.6% in 2011 to around 50.3% in 2015. This could fall further to around 47.3% in 2016.

In addition to driving the top line, the hardware maintenance and services division has also contributed positively to improving the company’s profitability. The product division’s gross margins fell by over 5 percentage points from 2011 through 2015 due to pricing pressure from smaller vendors. On the other hand, the services division’s gross margin improved by over 5 percentage points. In the long run, the services division could continue to become more profitable for the company as a large aggregate client base could lead to a higher refresh rate for maintenance contract renewals.

However, the sustained weakness in NetApp’s core product division and over-dependence on one revenue stream could be a risk going forward. As a result, we maintain our $28 price estimate for NetApp’s stock. You can modify the interactive charts in the article above to see how much the change in individual drivers such as gross margins or market share impacts the price estimate for NetApp’s stock.

Read more


Avnet to Sell Technology Solutions Business Unit to Tech Data for $2.6 Billion


Avnet, Inc., a global technology distributor, entered into an agreement to sell its Technology Solutions operating group to Tech Data Corporation in a stock and cash transaction valued at approximately $2.6 billion.

Under the terms of the agreement, Avnet will receive $2.4 billion in cash and 2.8 million shares of Tech Data common stock, currently valued at approximately $200 million.

The sale of this business provides both Avnet and Tech Data with opportunities to focus on core strategies and scale their respective businesses, ultimately delivering greater profitability to their shareholders.

“We believe the acquisition of Technology Solutions by Tech Data is the best decision for our employees, customers, suppliers and shareholders. This transaction presents us with the best strategic path for Avnet’s future success and profitability, and puts Technology Solutions in position to achieve breakthrough business results with Tech Data,” said William Amelio, CEO, Avnet. “Moving forward, Avnet will focus its resources and investments on becoming a leader in design chain and supply chain services not only for our current customers and suppliers, but also for new markets. We will drive targeted investments in embedded solutions, Internet of Things (IOT) and critical digital platforms. By investing in these high growth areas, we can expand the breadth of our portfolio and attract new customers worldwide who depend on us to deliver world-class solutions.”

Avnet’s Technology Solutions operating group is an IT solutions distributor serving customers and suppliers in more than 80 countries. It provides next generation solutions, marketing, training, resources and services that span the cloud to the data center and encompass the entire IT lifecycle. The group works with value-added resellers to make it easier and more affordable to enter and excel in high-growth technology and vertical markets locally and around the world.

“This transformative transaction will position us as a premier global IT distributor with the most diverse end-to-end solutions from the data center to the living room,” said Bob Dutkowsky, CEO, Tech Data. “Tech Data has competed with and admired Avnet Technology Solutions for many years. We’re thrilled to start this journey together and are confident that our customers, vendor partners, employees, and shareholders will appreciate and benefit from the value that we will bring to the market. We look forward to welcoming the Technology Solutions team to Tech Data and are excited for the opportunities that this combination creates.”

Read more


IBM Power Systems and Red Hat Extend Collaboration for Next-Generation Cloud Platforms

| IBM - United States

IBM (NYSE: IBM) and Red Hat, Inc. (NYSE: RHT), a leading provider of open source solutions, today announced enhancements and growth in their long-standing alliance to better help clients embrace hybrid cloud. Through joint engineering and deeper product collaboration, the two companies plan to deliver solutions built on key components of Red Hat’s portfolio of open source products, including Red Hat Enterprise Linux, Red Hat Virtualization, and Red Hat Enterprise Linux High Availability offerings. This move will help position IBM Power Systems as a featured component of Red Hat’s hybrid cloud strategy spanning platform infrastructure located both on and off premises.

IBM and Red Hat have a long tradition of innovation to advance product offerings across IBM platforms. Through expanded collaboration both in upstream technologies and product development, the companies aim to enable greater compatibility between their respective platforms, bringing Red Hat’s offerings to clients who previously worked in distributed environments. Specifically, IBM and Red Hat are working together to build functionality and jointly engineer solutions across IBM Power Systems, productized in the Red Hat portfolio, by:

  • Enabling Red Hat solutions on IBM’s next-generation Power Systems;
  • Introducing new high performance computing (HPC) capabilities for leading edge research deployments;
  • Developing high availability capabilities for Power Systems.

“Red Hat believes that the next generation of applications and hybrid cloud services will be powered by modern, hyperscale hardware and software that span both public clouds, like IBM Cloud, and on-premise platforms,” said Jim Totton, vice president and general manager, Platforms Business Unit, Red Hat. “Red Hat and IBM are expanding their long-standing alliance to address this opportunity. Through joint engineering and deeper product collaboration, we are excited to deliver world-class solutions built on Red Hat’s portfolio of enterprise open source solutions and IBM’s Power Systems platform.”

“Clients choose open source capabilities to achieve new levels of agility and flexibility in their hybrid cloud environments, but they need access to optimal support,” said Scott Crowder, CTO, IBM Systems. “Clients have long turned to Red Hat and IBM to support their enterprise computing needs. Now, we are expanding that relationship with Red Hat to provide new systems designed for enterprise-grade open source solutions that go far beyond what commodity infrastructure has offered.”

Read the source article at IBM – United States

Read more