Data Efficiency in the News

Open source no longer scares the enterprise

| ComputerWeekly.com

Open source breaks the rules on corporate procurement, but developers never play by the rules and now open source has sneaked in through the back door.

A study by Vanson Bourne for Rackspace reports that businesses are making big savings by using open source.

In the survey of 300 organisations, three out of five respondents cited cost savings as the top benefit, reducing average cost per project by £30,146.

With most IT projects at the lower end of the cost scale, such savings are significant.

About half of the organisations in the study reported greater innovation because of open source – and 46% said they used open source because of the competitive opportunities.

In fact, 30% cited the ability to respond more quickly to market trends as a driver for adoption. Almost half (45%) said it enabled them to get products and services to market sooner – with project lifecycles reduced by an average of six months.

Businesses are used to dealing with the major IT providers and some open source software companies have successfully mimicked commercial models to sell open source to the enterprise.

“Red Hat sells the perception of decreased risk, and it looks similar to proprietary software sales,” said Lindberg. “This was what most corporate procurement people are used to.”

But he added that, in his experience, open source tended to creep in from the bottom through unofficial routes. “It doesn’t require permission or payment. You can simply start using it to deliver business value. It is about people just trying to be more efficient at doing their jobs.”

According to Lindberg, it is a huge indictment of the traditional commercial software business model that no one wants to use the very expensive software stacks that corporate IT used to deploy. “People say they can do things better, faster, cheaper and more efficiently using community-oriented open source software,” he said.

 

Read more

3 Reasons Why An OpenStack Private Cloud May Cost You Less Than Amazon Web Services (AWS)

| forbes.com

IT organizations are moving to the public cloud in droves to take advantage of cost savings and efficiency improvements over traditional on-premises datacenters. The public cloud offers the promise of on-demand self-service for developers and business owners, pooling of resources to improve utilization, and the ability to scale applications very quickly. Companies like Amazon.com, Google and Microsoft have developed robust public cloud solutions, strong developer communities and broad vendor ecosystem support for their offerings. These companies and other public cloud vendors are reaping the rewards of the mass exodus out of the datacenter and into the cloud.

One example of a private cloud solution gaining significant traction is OpenStack, which has become a de facto standard for open-source based private clouds. With the OpenStack Summit taking place in Barcelona this week, it is an interesting time to reflect on how robust OpenStack has become since the project’s inception more than six years ago. OpenStack is now backed by some of the world’s leading technology infrastructure providers, including Cisco Systems, Dell, EMC, Hewlett Packard Enterprise, IBM, Intel and Lenovo.

According to the latest OpenStack User Survey released last week, the share of OpenStack deployments in production is 20% greater than a year ago, with 71% of clouds in production or full operational use. In addition, 72% of those surveyed said their number one business driver for deploying OpenStack was to save money over alternative infrastructure choices. Many companies have already proven out the return on investment that OpenStack can provide. For example, TD Bank claims that it achieved 25% to 40% cost savings on its platforms and virtual machines over its previous solution by deploying OpenStack.

As private cloud solutions like OpenStack become more widely adopted, now is the time for IT to take a hard look at why a private cloud approach may make more sense for some workloads than a wholesale move to a public cloud like Amazon Web Services (AWS). Here are three reasons to consider:

  1. Cost Models: Public cloud pricing models are generally optimized for development workloads that have a lifespan of months, not years. The public cloud may also be well suited to workloads with choppy demand, where IT needs the flexibility to scale resources up and down, while workloads with linear demand may be better served by private cloud. In addition, many organizations find that network bandwidth costs for public clouds can add up quickly for high-traffic workloads. The specific breakeven point between public cloud and private cloud will vary in each environment. However, as IT organizations crunch the numbers for their bandwidth-intensive production workloads, private cloud often comes out on top (a rough cost-model sketch follows this list).
  2. Flexibility: Long-term flexibility may be limited with a public cloud-focused strategy. Over the next several years, many companies will look to adopt multi-cloud strategies that include a mix of private cloud and multiple public cloud options to ensure they have the “right cloud” for each of their workloads. It is important to consider how easy it may be in the future to move applications from one cloud to another and how locked in you may be to a specific public cloud. A strategy centered on a specific public cloud vendor’s tool stack may limit interoperability with other clouds and limit IT’s ability to move away from certain public cloud offerings as workload demands change. In addition, many IT organizations looking to move out of the public cloud are finding that it can be very costly to move applications and data from one cloud environment to another.
  3. “As-a-Service” Private Clouds: There are ways to get the efficiency benefits of public cloud without having to make the leap. The public cloud does provide a hands-off approach to managing IT resources, which lets IT focus on more value-added activities to drive the business. Public cloud also provides operating-expense-based financing models, which can be beneficial for companies not looking to pay a large upfront capital expense for equipment. With this in mind, vendors like Rackspace Hosting and Mirantis have come to market with solutions that provide private cloud capabilities “as-a-service”. By deploying private cloud as-a-service, IT can run workloads on premises or at a co-location facility, which gives all the benefits of a private cloud (data sovereignty, security, control) with a public cloud-like consumption model. Service offerings can also include capacity planning, cost monitoring, solution optimization and resource management for the entire product lifecycle. For the right workloads, private cloud as-a-service may cost less than the public cloud. Rackspace also offers an “OpenStack Everywhere” approach, which gives IT choices on where to deploy OpenStack, whether in their own on-premises datacenter, a third-party datacenter, a colocation facility or a Rackspace datacenter.
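
The cost-model point in item 1 above is ultimately an arithmetic exercise: metered compute hours plus per-GB egress charges on one side, amortized hardware plus fixed operating expense on the other. The sketch below shows that comparison in Python; every rate and quantity in it is a hypothetical placeholder rather than actual AWS or OpenStack pricing, and as the article notes, the breakeven point will differ for every environment.

```python
# Illustrative only: a rough public-vs-private breakeven sketch for a
# bandwidth-heavy workload. All prices and sizes are hypothetical placeholders.

def monthly_public_cloud_cost(instance_hours, hourly_rate, egress_gb, egress_rate_per_gb):
    """Pay-as-you-go compute plus metered network egress."""
    return instance_hours * hourly_rate + egress_gb * egress_rate_per_gb

def monthly_private_cloud_cost(capex, amortization_months, monthly_opex):
    """Amortized hardware cost plus fixed operating expense; egress is flat-rate."""
    return capex / amortization_months + monthly_opex

if __name__ == "__main__":
    public = monthly_public_cloud_cost(
        instance_hours=20 * 24 * 30,   # 20 always-on instances for a month
        hourly_rate=0.10,              # $/instance-hour (placeholder)
        egress_gb=200_000,             # 200 TB/month of outbound traffic
        egress_rate_per_gb=0.09,       # $/GB (placeholder)
    )
    private = monthly_private_cloud_cost(
        capex=150_000,                 # cluster purchase price (placeholder)
        amortization_months=36,        # 3-year amortization
        monthly_opex=3_500,            # power, space, share of support staff
    )
    print(f"Public cloud:  ${public:,.0f}/month")
    print(f"Private cloud: ${private:,.0f}/month")
```

With these placeholder figures, the egress bill is what tips the balance toward private cloud, mirroring the article's point about bandwidth-intensive production workloads.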

 

IT organizations may want to think twice before making a move to the public cloud, as the return on investment for certain workloads may be greater with a private cloud solution like OpenStack. But getting up and running in production on OpenStack isn’t always straightforward. It is only recently that OpenStack has reached a tipping point where it is well equipped for deployment by a broad base of IT organizations. Most IT organizations will want to engage with industry partners who have OpenStack expertise to do it right. Both established and up-and-coming vendors fill the perceived gaps in the upstream code with products that improve the OpenStack deployment process and the ongoing operational experience. Integrators and service providers are also delivering OpenStack consulting and support expertise to help enterprise users deploy and manage their OpenStack environments.

Read more

​Where OpenStack cloud is today and where it’s going tomorrow

| ZDNet

The future looks bright for OpenStack — according to 451 Research, OpenStack is growing rapidly to become a $5-billion-a-year cloud business. But obstacles still remain.

OpenStack Summit drew 5,000-plus people who believe that OpenStack is the future of the cloud, and 451 Research thinks they may be onto something. The research company expects revenue from OpenStack business models to exceed $5 billion by 2020, growing at a 35 percent compound annual growth rate (CAGR).

451 observed that so far OpenStack-based revenue has been overwhelmingly from service providers offering multi-tenant Infrastructure-as-a-Service (IaaS). Looking ahead, though, 451 believes OpenStack’s future success will come from the private cloud space and in providing hybrid-cloud orchestration for public cloud integration. Better still, for OpenStack companies, 451 sees private cloud revenue exceeding public cloud by 2019.

451 Research also predicted that OpenStack will grow across software-defined networking (SDN), network function virtualization (NFV), mobile, and Internet of Things (IoT) for both service providers and enterprises. This is in addition to its existing use cases in big data and lines of business. The keynotes, again, supported this conclusion. Representatives from Huawei, NEC, and Nokia all sang OpenStack’s praises in business and telecom.

“This year OpenStack has become a top priority and credible cloud option, but it still has its shortcomings,” said Al Sadowski, 451 Research’s research VP. For example, while OpenStack is still growing in popularity for enterprises interested in deploying private cloud-native applications, its appeal is limited for legacy applications and for companies that are already comfortable with AWS or Microsoft Azure.

In addition, while several marquee enterprises, such as Wal-Mart, use OpenStack as the central component of cloud transformations, others are still leery of the perceived complexity associated with configuring, deploying, and maintaining OpenStack-based architectures.

They’re not wrong. OpenStack is still difficult to deploy. That’s why companies such as Red Hat, Canonical, HPE, and Mirantis are making a living from OpenStack distributions and integration services. As ZDNet editor Larry Dignan pointed out in a recent article, systems integrators still have a role to play in the cloud.

Read more

Hyperconverged Infrastructure Is Now A Data Center Mainstay

| informationweek.com

Hyperconverged infrastructure, where networking, compute, and storage are assembled in a commodity hardware box and virtualized together, is no longer the odd man out. Compared with converged infrastructure — a hardware-oriented combination of networking and compute — hyperconverged brings three data center elements together in a virtualized environment.

Hyperconverged infrastructure at one time was criticized as overkill and as handing off too many configuration decisions to a single manufacturer. But IT managers and CIOs have abandoned that critique as more and more hyperconverged units are integrated into the data center with minimal configuration headaches and operational setbacks.

The 451 Research Voice of the Enterprise survey found that 40% of enterprises now use hyperconverged units as a standard building block in the data center, and analysts expect that number to climb rapidly over the next two years.

For that 40% of users: “74.4% of organizations currently using hyperconverged are using the solutions in their core or central datacenters, signaling this transition,” according to the report.

Christian Perry, research manager at 451 and lead author of the report, wrote that “loyalties to traditional, standalone servers are diminishing in today’s IT ecosystems as managers adopt innovative technologies that eliminate multiple pain points.”

For large enterprises of 10,000 employees or more, 41.3% reported that they were planning to change their IT staff makeup as a result of hyperconvergence. Over a third — 35.5% — of enterprises responded that they had added more virtual machine specialists due to the adoption of converged systems.

According to the authors, “This is more than double the number of organizations actively adding specialists in hardware-specific areas” (such as server administrators or storage and network managers).

One area, however, remains surprisingly unchanged.

Containers have yet to make a major appearance in the infrastructure’s makeup, and “remain nascent,” in Perry’s phrase, in data center management. Nearly 51% reported that none of their servers were running containers, while 22.3% told analysts that they are running containers on 10% or fewer of their x86 servers.

The 451 researchers don’t expect those low percentages to last.

IT staffs will eventually take advantage of containers’ “lightweight nature” to further adoption of the DevOps IT model and frequent software updates. But such adoption will require personnel, perhaps the same virtualization managers, to be added to staff at a high rate to manage the technology, the report noted.

VMware, for one, is attempting to include container management inside its more general vSphere virtual machine management system.

Read more

Permabit and AHA Partnership Improves Data Center CAPEX and OPEX

| cbs8.com

Permabit Technology Corporation, leaders in data reduction, and AHA Products Group (AHA) today announced a technology partnership that will enable hyperscale data centers to reduce CAPEX/OPEX, increase performance, increase storage capacity, and extend the life-cycle of their flash memory storage systems.

Many new applications generate petabytes of ephemeral data requiring compression throughput of 40 Gbps or more. In these applications, CPU overhead from data compression can lead to significant reductions in performance and increases in CAPEX and OPEX. Through this partnership, Permabit’s HIOPS Compression® will enable customers to utilize AHA’s GZIP Compression/Decompression Accelerators to further extend performance and efficiency.

Compared with a 20-core server dedicated entirely to GZIP compression or decompression, the AHA374 GZIP accelerator simultaneously provides 8X the compression throughput and 2X the decompression throughput. Compared to LZO/LZ4 data compression, the AHA374 GZIP accelerator increases storage capacity by almost 50%.
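
The capacity claim comes down to compression ratio: GZIP (DEFLATE) generally compresses harder than LZO/LZ4, which trade ratio for speed, so the same flash holds more logical data. The snippet below is a minimal illustration of that trade-off using Python's standard zlib module and the third-party lz4 package (pip install lz4); the sample payload and the ratios it prints are illustrative only and are not the AHA374 benchmark.

```python
# Compare a GZIP-class (DEFLATE) ratio against an LZ4-class ratio on a
# repetitive, text-like payload that stands in for compressible storage blocks.
import zlib
import lz4.frame

def ratio(original: bytes, compressed: bytes) -> float:
    return len(original) / len(compressed)

if __name__ == "__main__":
    sample = b"timestamp=2016-10-25 level=INFO msg='request served' bytes=4096\n" * 4096

    gzip_like = zlib.compress(sample, level=6)   # DEFLATE, the algorithm behind GZIP
    lz4_like = lz4.frame.compress(sample)        # speed-oriented LZ4

    print(f"DEFLATE/GZIP ratio: {ratio(sample, gzip_like):.1f}x")
    print(f"LZ4 ratio:          {ratio(sample, lz4_like):.1f}x")
    # A higher ratio means less physical flash consumed for the same logical
    # data, which is the capacity gain the article attributes to GZIP over LZO/LZ4.
```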

“As our 3rd generation data compression product, we’ve focused on high quality hardware and plug-and-play ZLIB/GZIP software libraries. Combined with our first class customer support, integration is straightforward and painless for the end customer,” said Jeff Hannon, VP of AHA Engineering. “We are thrilled to add Permabit’s HIOPS Compression to our list of integrators that includes KX systems and  Velocimetrics, among many others.”

“Permabit and AHA are working together to ensure that VDO, the only complete data reduction solution for the Linux storage stack, works seamlessly with AHA GZIP compression accelerators to maximize compression rates and increase data center density,” said Louis Imershein, VP Product for Permabit and author of the Data Efficiency magazine on Flipboard. “By integrating HIOPS Compression with AHA Accelerators we are continuing to expand Permabit’s VDO software in the Original Design Manufacturer (ODM) market segment beyond our OEM and OS (open-source) implementations. This is another example of the flexibility, breadth and depth of Permabit Albireo data reduction capabilities.”

 

Read more

Red Hat Named a Visionary in Gartner’s 2016 Magic Quadrant for Distributed File Systems and Object Storage

| Business Wire

Red Hat, Inc. (NYSE: RHT), the world’s leading provider of open source solutions, today announced that Gartner, Inc. has positioned Red Hat in the “Visionaries” quadrant of Gartner’s October 2016 Magic Quadrant for Distributed File Systems and Object Storage for its storage solutions, Red Hat Ceph Storage and Red Hat Gluster Storage.

Gartner’s Magic Quadrants are based on rigorous analysis of a vendor’s completeness of vision and ability to execute. This is the first year Gartner has published a Magic Quadrant for Distributed File Systems and Object Storage.

Red Hat Ceph Storage and Red Hat Gluster Storage are both highly agile solutions that are built to handle different types of demanding workloads. Red Hat Ceph Storage is a robust, unified storage platform designed for object storage and cloud infrastructures at petabyte-scale. Red Hat Gluster Storage is a powerful distributed file system that offers flexible storage services across the datacenter’s footprints – bare metal, virtual machines, private clouds, public clouds, and containerized environments. Both offerings can be deployed on industry-standard hardware. They are supported by an active community of open source developers and partners and used by companies around the world.

Read more

Ubuntu OpenStack, Ceph Come to ARM Servers

| eweek.com

ARM officials took a step forward in building the software ecosystem around the company’s data center push when Canonical announced that its Ubuntu OpenStack and Ceph offerings are now commercially available on servers powered by ARM’s 64-bit chip architecture.

Officials with both companies made the announcement Oct. 17, giving ARM more support in its strategy to become the primary alternative to Intel’s x86-based processors in data center systems. Canonical officials said there is increasing demand from users of its open-source Ubuntu cloud and storage software for more options in the data center hardware they’re running. The Ubuntu Linux operating system already runs on the ARM architecture.

“We have seen our [telecommunications] and enterprise customers start to radically depart from traditional server design to innovative platform architectures for scale-out compute and storage,” Mark Baker, product manager for OpenStack at Canonical, said in a statement. “In partnering with ARM, we bring more innovation and platform choice to the marketplace.”

Baker said the “next generation of scale-out applications are causing our customers to completely revisit compute and storage architectures with a focus on scale and automation. The ARM and Canonical ecosystems offer more choice in data center solutions with a range of products that can be optimized to run standard server software and the next generation of applications.”

The focus of the new Ubuntu effort will be on scale-out computing environments in the data center and cloud. The two companies will work with Ubuntu certified system-on-a-chip (SoC) companies, OEMs and original-design manufacturers (ODMs) to encourage the development of production-grade servers, storage platforms and networking gear that run on the 64-bit ARMv8-A architecture and are offered with Ubuntu Advantage support, officials said.

In a statement, Lakshmi Mandyam, senior marketing director of ARM’s server program, said the chip designer wanted to make sure to have “the best OpenStack and Ceph storage solutions and enterprise-grade support available. The commercial availability of Ubuntu OpenStack and Ceph is another milestone that demonstrates open-source software on ARM is ready for deployment now. The ARM and Canonical ecosystems can now simply write once and deploy anywhere on ARM-based servers.”

“I’m comfortable with where we are at this point,” Mandyam said last year. “There are a lot of proofs-of-concept going on with ARM.”

The announcement by Canonical and ARM comes a week before the chip designer kicks off this year’s TechCon 2016 show in Santa Clara, California.

ARM officials have said they are confident that adoption will start to ramp starting next year. Hewlett Packard Enterprise, Dell, Lenovo and supercomputer maker Cray are among the OEMs that are testing ARM-based chips in servers.

Read more

Gartner Reveals Top Predictions for IT Organizations and Users in 2017 and Beyond

| Technology Research

Gartner, Inc. today revealed its top predictions for 2017 and beyond. Gartner’s top predictions for 2017 examine three fundamental effects of continued digital innovation: experience and engagement, business innovation, and the secondary effects that result from increased digital capabilities.

“Gartner’s top strategic predictions continue to offer a provocative look at what might happen in some of the most critical areas of technology evolution. At the core of future outcomes is the notion of digital disruption, which has moved from an infrequent inconvenience to a consistent stream of change that is redefining markets and entire industries,” said Daryl Plummer, managing vice president, chief of research and Gartner Fellow. “Last year, we said digital changes were coming fast. This year the acceleration continues and may cause secondary effects that have wide-ranging impact on people and technology.”

By 2020, 100 million consumers will shop in augmented reality.
The popularity of augmented reality (AR) applications, such as Pokémon GO, will help bring AR into the mainstream, prompting more retailers to incorporate it into the shopping experience. As mobile device usage becomes an ingrained behavior, further blurring the lines between the physical and digital worlds, brands and their retail partners will need to develop mechanisms to leverage this behavior to enhance the shopping experience. Using AR applications to layer digital information — text, images, video and audio — on top of the physical world, represents one such route to deeper engagement, both in-store and in other locations. For example, a consumer pointing the IKEA catalog app at a room in his home can “place” furniture where he’d like it to go. This real-world element differentiates AR apps from those offering virtual reality (VR).

By 2020, 30 percent of web browsing sessions will be done without a screen.

New audio-centric technologies, such as Google Home and Amazon’s Echo, are making access to dialogue-based information ubiquitous and spawning new platforms based on “voice-first” interactions. By eliminating the need to use one’s hands and eyes for browsing, vocal interactions extend the utility of web sessions to contexts such as driving, cooking, walking, socializing, exercising and operating machinery. As a result, the share of waking hours devoid of instant access to online resources will approach zero.

By 2019, 20 percent of brands will abandon their mobile apps.

Many brands are finding that the level of adoption, customer engagement and return on investment (ROI) delivered by their mobile applications is significantly lower than the expectations that underpinned their app investment. New approaches are emerging that have a lower barrier to discovery and installation, and offer levels of engagement approaching those of applications at a fraction of the investment, support and marketing cost. Many companies will evaluate these experiences against their under-performing applications and opt to reduce their losses by allowing their apps to expire.

By 2020, algorithms will positively alter the behavior of more than 1 billion global workers.
Contextualization algorithms have advanced rapidly to include a variety of behavioral interventions drawn from psychology, social neuroscience and cognitive science. Human beings tend to be emotionally charged and factually drained, which makes them prone to irrational decisions. Algorithms can positively alter that behavior by augmenting human intelligence with a large collective memory bank of knowledge that has been socialized and put to the test. This will help workers “remember” anything, or be informed of just-in-time knowledge they have never experienced, leaving them free to complete the task at hand objectively and to better appreciate life as it unfolds. Use of algorithms can raise alarms of “creepiness”; however, when used to effect positive outcomes, they can bring about change across multiple industries.

By 2022, a blockchain-based business will be worth $10 billion.
Blockchain technology is established as the next revolution in transaction recording. A blockchain ledger provides an immutable, shared view of all transactions between engaging parties. Parties can therefore immediately act on a committed blockchain record, secure in the knowledge that it cannot be changed. Any kind of value exchange can happen in minutes, not days. Blockchain applications can free up cash, reduce transaction costs, and accelerate business processes. While blockchain development is still immature, it is attracting product and capital investment.
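
The “immutable, shared view” property is easy to see in miniature: each record carries the hash of the record before it, so any edit to a committed entry breaks every subsequent link. The Python sketch below is a toy hash-chained ledger that demonstrates only that tamper-evidence; it omits the consensus, networking and incentive layers a real blockchain needs.

```python
# A toy hash-chained ledger illustrating tamper-evidence. Not a real blockchain.
import hashlib
import json

def record_hash(record: dict) -> str:
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append(chain: list, payload: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    record = {"prev_hash": prev, "payload": payload}
    record["hash"] = record_hash({"prev_hash": prev, "payload": payload})
    chain.append(record)

def verify(chain: list) -> bool:
    """Recompute every link; any edited record breaks the chain."""
    prev = "0" * 64
    for record in chain:
        expected = record_hash({"prev_hash": prev, "payload": record["payload"]})
        if record["prev_hash"] != prev or record["hash"] != expected:
            return False
        prev = record["hash"]
    return True

ledger = []
append(ledger, {"from": "alice", "to": "bob", "amount": 10})
append(ledger, {"from": "bob", "to": "carol", "amount": 4})
print(verify(ledger))                      # True: chain is intact
ledger[0]["payload"]["amount"] = 1000      # attempt to rewrite history
print(verify(ledger))                      # False: tampering is detectable
```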

By 2021, 20 percent of all activities an individual engages in will involve at least one of the top seven digital giants.
The current top seven digital giants by revenue and market capitalization are Google, Apple, Facebook, Amazon, Baidu, Alibaba and Tencent. As the physical, financial and healthcare worlds become more digital, many of the activities an individual engages in will be connected. This convergence means that any activity could include one of the digital giants. Mobile apps, payment, smart agents (e.g., Amazon Alexa) and digital ecosystems (e.g., Apple HomeKit, WeChat Utility and City Services) will make the digital giants part of many of the activities we do.

Through 2019, every $1 enterprises invest in innovation will require an additional $7 in core execution.
For many enterprises, adopting a bimodal IT style to jump-start innovation has been a priority and a critical first step. Close alignment of Mode 1 and Mode 2 teams is crucial to realizing digital business goals. Unfortunately, the deployment costs of the Mode 2 “ideated solution” are not necessarily considered during ideation, and for most, the Mode 1 costs are not factored into the initial funding. Designing, implementing, integrating, operationalizing and managing the ideated solution can cost significantly more than the initial innovation. Thus, Gartner anticipates that for every $1 spent on the digital innovation/ideation phase, enterprises will spend on average $7 deploying the solution.

Through 2020, IoT will increase data center storage demand by less than 3 percent.
The Internet of Things (IoT) has enormous potential for data generation across the roughly 21 billion endpoints expected to be in use in 2020. Of the roughly 900 exabytes worth of data center hard-disk drive (HDD) and solid-state drive (SSD) capacity forecast to ship in 2020, IoT discrete sensor storage will represent only 0.4 percent, with storage from multimedia sensors consuming another 2 percent, for a rounded total of 2.3 percent. This indicates that IoT can scale and deliver important data-driven business value and insight, while remaining manageable from a storage infrastructure standpoint. 
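
As a quick back-of-the-envelope check, the shares quoted above can be converted into absolute capacities; the inputs are the article's rounded percentages, so the outputs are approximations.

```python
# Convert the Gartner shares quoted above into absolute capacities.
total_shipped_eb = 900            # HDD + SSD capacity forecast to ship in 2020, in exabytes
discrete_sensor_share = 0.004     # 0.4 percent
multimedia_sensor_share = 0.02    # 2 percent

discrete_eb = total_shipped_eb * discrete_sensor_share        # ~3.6 EB
multimedia_eb = total_shipped_eb * multimedia_sensor_share    # ~18 EB

print(f"Discrete IoT sensor storage: {discrete_eb:.1f} EB")
print(f"Multimedia sensor storage:   {multimedia_eb:.1f} EB")
print(f"Combined: {discrete_eb + multimedia_eb:.1f} EB of {total_shipped_eb} EB shipped")
```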

By 2022, IoT will save consumers and businesses $1 trillion a year in maintenance, services and consumables.
The IoT holds enormous promise in reducing the cost of maintenance and consumables. The challenge lies in providing a secure, robust implementation that can deliver savings over one or two decades, without driving management costs that absorb any savings made. This could be an inexpensive monitoring system based on simple sensors that report defining characteristics to analytical servers. The analytics are used to spot patterns in the fleet data, and recommend maintenance based on actual usage and condition, not based on elapsed time or estimated condition. At the other extreme, there is the rise of the digital twin. The digital twin captures near real-time data feeds from its sensor-enhanced real-world twin, and uses this along with other data sources (e.g., weather, historian data, algorithms, smart machine analysis) to update its simulation to reflect the physical state of the twin. 
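
The maintenance model described above, servicing equipment when observed condition demands it rather than when the calendar says so, can be sketched in a few lines. The class below is a hypothetical illustration only: the asset name, vibration threshold and readings are invented, and a production digital twin would fold in the weather, historian and fleet-analytics feeds the article mentions.

```python
# Minimal sketch of condition-based maintenance driven by a "digital twin".
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DigitalTwin:
    asset_id: str
    vibration_limit_mm_s: float = 7.1        # hypothetical alert threshold
    readings: list = field(default_factory=list)

    def ingest(self, vibration_mm_s: float, hours_run: float) -> None:
        """Capture a near real-time sensor sample from the physical twin."""
        self.readings.append((vibration_mm_s, hours_run))

    def maintenance_recommended(self) -> bool:
        """Recommend service on observed condition, not on a fixed calendar."""
        if not self.readings:
            return False
        recent = [v for v, _ in self.readings[-10:]]
        return mean(recent) > self.vibration_limit_mm_s

pump = DigitalTwin(asset_id="pump-042")
for sample in (5.2, 6.8, 7.5, 8.1, 8.9):
    pump.ingest(vibration_mm_s=sample, hours_run=1.0)
print(pump.asset_id, "needs service:", pump.maintenance_recommended())
```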

By 2020, 40 percent of employees can cut their healthcare costs by wearing a fitness tracker.

Companies will increasingly appoint fitness program managers to work closely with human resource leaders to include fitness trackers in wellness programs as part of a broader employee engagement initiative. Healthcare providers can save lives and downstream costs by acting on the data from wearable fitness trackers that show health risks to the user. Wearables provide a wealth of data to be analyzed either in real-time or in retrospect with the potential for doctors and other healthcare professionals to have access to both contextual and historical information, if the patient agrees to share it. 

Gartner clients can read more in the report “Top Strategic Predictions for 2017 and Beyond: Surviving the Storm-Winds of Digital Disruption.”

 

Read more

VMware Decides to Move Data Center Cloud Over to AWS

| eweek.com

VMware and Amazon Web Services, major IT players that often aren’t mentioned in the same sentence because many of their products compete in the same markets, revealed a new partnership Oct. 13 that will result in all of VMware’s cloud infrastructure software being hosted on AWS.

For the IT and software development sectors, the deal will mean VMware mainstays such as all its software-defined data center products—vCenter, NSX, vSphere, VSAN and others—will run on AWS instead of VMware’s own cloud. Like any other cloud deployment, the partnership enables VMware to focus on developing its products rather than dealing with the issues around hosting them, which has never been its primary business.

VMware Cloud on AWS will be run, marketed and supported by VMware, like most typical cloud deployments. However, the service will be integrated with AWS’s own cloud portfolio, which provides computing, databases, analytics and a few different levels of storage, among other features.

VMware Cloud on AWS is a jointly architected service that represents a significant investment in engineering, operations, support and sales resources from both companies. It will run on dedicated AWS infrastructure.

Mark Lohmeyer, VMware’s vice president of products in the Cloud Platform Business Unit, listed the following as the key benefits of the new service:

Best-in-class hybrid cloud capabilities: Features enterprise-class application performance, reliability, availability and security with the VMware technologies optimized to run on AWS.

Operationally consistent with vSphere: With VMware Cloud on AWS, a private data center integrated with the AWS public cloud can be operated using the same vCenter UIs, APIs and CLIs that IT managers already know.

Seamless integration with AWS services: Virtual machines running in this environment will be able to use AWS’ broad set of cloud-based services, including storage, database, analytics and more.

Seamless workload portability: Full VM compatibility and total workload portability between the data center and the AWS cloud is part of the deal.

Elastically scalable: The service will let users scale capacity according to their needs.  Capacity can be scaled up and down by adding or removing hosts.

No patching or upgrades: The service will remove the burden of managing the software patch, update and upgrade life cycle for the user.

Subscription-based consumption: Customers will be able to purchase dedicated clusters that combine VMware software and AWS infrastructure, either on-demand or as a subscription service.

Read more

Next Generation Data Storage Technologies Market Forecast Up to 2024

| openPR.com

Next generation data storage technology includes technologically advanced data storage products and solutions designed to deal with increasing file sizes and huge amounts of unstructured data. It manages large data sets securely and enables reliable, fast recovery of data in a cost-efficient manner, and it has enabled scalable storage and handling of the large data volumes generated by big enterprises.

Factors favoring the growth of the next generation storage technologies market include the ubiquity of input and output devices in every sector and the ever-increasing need to manage, analyze and store huge amounts of data. Consequently, demand for next generation data storage technologies is expected to increase rapidly over the forecast period. This growth is expected to be backed by growing demand for time-saving technologies, including automated systems, smart technologies, online shopping and the internet of things, all of which require handling of the large data volumes generated by enterprises.

Various challenges restrain the growth of the next generation data storage technologies market, including technological complexity, repair and restore issues, and security concerns. Furthermore, a high level of data consistency is required in data storage. Future growth in the market is projected to come from the emerging need for data storage in small and medium enterprises.

The next generation data storage technologies market is segmented on the basis of technology and application. By technology, the market is classified into all-flash storage arrays, hybrid arrays, cloud-based disaster recovery, holographic data storage and heat-assisted magnetic recording. Of these, a hybrid array is a form of hierarchical storage management that combines solid state drives and hard disk drives to improve input and output speed. Holographic data storage is a high-capacity storage technology, whereas hybrid arrays and all-flash arrays are established data storage approaches.

By application, the next generation data storage technologies market is divided into enterprise data storage, big data storage and cloud-based storage.

North America dominates the next generation data storage technologies market. Asia Pacific countries, including China, Japan and India, are expected to grow at a significant rate compared with other regions. The presence of a large number of IT companies in the Asia Pacific region is one of the key factors driving growth of the market there. Asia Pacific countries are expected to make large investments in the data storage sector to equip their existing infrastructure with new data storage technologies and solutions that improve the production process. Japan, one of the most technologically advanced nations, is anticipated to be a big market for next generation data storage technologies; the country is already using these storage technologies across its various industry verticals.

Some of the key players in the next generation data storage technology market are Dell Inc., Avago Technologies, EMC Corporation, Hewlett-Packard Development Company, L.P., HGST, Inc., Hitachi Data Systems, IBM Corporation, NetApp, Inc., Drobo, Inc. and Micron Technology Corporation.

Read more