Data Efficiency in the News


Hybrid Cloud Services Market Growing at 19.8% CAGR during 2016-2020

| Press release distribution

Tue Aug 23 2016 The report, Global Hybrid Cloud Services Market 2016-2020, has been prepared based on an in-depth market analysis with inputs from industry experts. Hybrid cloud covers both private/onsite and public/offsite cloud resources. In this cloud, virtual machines, workloads, and applications operate seamlessly across different IT environments: private clouds located in enterprise or service provider data centers, as well as external public clouds. Hybrid clouds integrate storage, compute, security, applications, networking, and management into a common platform.

The analyst forecasts the global hybrid cloud services market to grow at a CAGR of 19.8% during the period 2016-2020. One of the key trends for market growth will be new vendors foraying into the market space. The exponential growth of cloud-based services has encouraged startups to enter the cloud domain alongside their on-premise services. Earlier, companies faced rigorous challenges in terms of costs, monitoring, and migration to the cloud. By implementing hybrid cloud services, however, SMEs can continue running their legacy applications while the majority of services are hosted in the cloud, backed by backup and high-availability features. As cloud services continue to mature, cloud management services face more advanced problems related to performance, availability, data latency, and costs. A large number of vendors are now shifting to hybrid solutions because of the promising opportunity these solutions represent, which is also driving product innovation.

According to the hybrid cloud services market report, the rising dependence of mid-sized enterprises on cloud services will be a key driver for market growth. Hybrid cloud offers a strategic way to consume IT solutions across on-premise environments and the cloud. These services are increasingly gaining popularity among enterprises of all sizes. Through hybrid cloud, companies can apply best practices from traditional on-premise IT infrastructure, including governance, management, and standardization, regardless of location. It provides an extended platform combining public and private clouds with dedicated servers to deliver seamless, customized performance. It helps companies reap maximum benefit from each component, allowing them to focus on their core businesses.

Key players in the global hybrid cloud services market: AWS, Microsoft, Rackspace, and VMware. Other prominent vendors in the market are: Avnet, BMC Software, CA Technologies, Cherwell Software, Cisco, Dimension Data, Hornbill Services, HP, IBM, LANDesk, MuleSoft, NTRGlobal, Oracle, and Unisys.

Further, the report states that data center latency will be a challenge for the market. Latency when retrieving data from public cloud infrastructure has long been a problem for cloud users. Most of the time, data is stored in remote locations in the cloud infrastructure and must be fetched and transmitted to the end-user location, which can create jitter in data availability. Enterprises therefore tend to keep only infrequently used data in the public cloud. Public cloud service providers have enabled data caching to make data available to end users faster.
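The caching described here can be sketched as a simple read-through cache: on a miss, the object is fetched from the slow remote store and kept locally for subsequent reads. The `fetch_remote` function and the in-memory "remote store" below are hypothetical stand-ins for a real object-store client, not any vendor's API.

```python
import time

# Hypothetical stand-in for a remote object store with high access latency.
REMOTE_STORE = {"report.csv": b"col1,col2\n1,2\n"}

def fetch_remote(key):
    """Simulate a slow fetch from public cloud storage."""
    time.sleep(0.01)  # stand-in for a network round-trip
    return REMOTE_STORE[key]

class ReadThroughCache:
    """Serve repeated reads locally; go to the remote store only on a miss."""
    def __init__(self):
        self._local = {}
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._local:
            self.hits += 1
        else:
            self.misses += 1
            self._local[key] = fetch_remote(key)
        return self._local[key]

cache = ReadThroughCache()
cache.get("report.csv")   # first read: slow, fetched from the remote store
cache.get("report.csv")   # second read: served from the local cache
print(cache.hits, cache.misses)  # → 1 1
```

Real providers layer this idea at many levels (edge caches, CDN nodes, gateway appliances), but the hit/miss trade-off is the same.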

Read more


JetStor Storage System Powered by Raidix V4.4


Raidix LLC has unveiled a new edition of its storage system, version 4.4.

The solution further improves the performance of standard corporate workloads, such as database access and transactional operations. Raidix is a high-performance storage technology built on patented erasure coding methods and innovations from the company's in-house research lab. The company is committed to resilient storage with high throughput and low latency. Its solutions are employed in enterprise, media and entertainment, video surveillance, HPC, and other data-rich industries.

What is new in version 4.4?

  • Random Access Optimization (RAO)
    RAO is a new feature that delivers significant performance gains and infrastructure savings for enterprise customers by using data deduplication. RAO may be applied to any particular volume. The functionality caters to random operations, such as database and transactional interactions. It enables fast resolution of business tasks and boosts data processing for enterprise applications (CRM, ERP, corporate email, etc.).

  • Underlying technology
    Data deduplication for space savings and easier virtualization; thin provisioning of system resources to extend logical disk capacity

  • Advanced redundancy
    Version 4.4 revamps multi-path input/output (MPIO) by adding support for the built-in Microsoft DSM (Device Specific Module) in place of the previously used in-house DSM driver.
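Thin provisioning, mentioned in the list above, presents a large logical volume while allocating physical blocks only as data is actually written. The sparse-file sketch below (sizes and filenames are illustrative) shows the same principle at the filesystem level: a 1 GiB logical file that initially consumes almost no physical space.

```python
import os
import tempfile

LOGICAL_SIZE = 1 << 30  # 1 GiB of advertised ("thin") capacity

# Create a sparse file: setting the logical size without writing data
# lets the filesystem defer physical block allocation until first write.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.truncate(LOGICAL_SIZE)

st = os.stat(path)
logical = st.st_size            # what the "volume" advertises
physical = st.st_blocks * 512   # what is actually allocated on disk
print(logical, physical)
os.remove(path)
```

A thin-provisioned block volume works the same way one layer down: the array advertises the full logical capacity and maps physical extents on demand.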

The company provides standalone storage appliances as well as scale-out NAS/shared storage solutions. The scale-out edition scales horizontally while maintaining a single namespace. The system supports heterogeneous client operating systems via SAN and shares the same data via NAS. It is compatible with third-party software and runs without a hitch on a multitude of hardware configurations.

Read more

Nimble Storage Broadens Support to OpenStack Community With Mirantis Certification

| Stock Market

SAN JOSE, CA — (Marketwired) — 07/25/16 — Nimble Storage (NYSE: NMBL), the leader in predictive flash storage, today announced an expansion of its commitment to the cloud by achieving the Mirantis Unlocked validation for its Nimble Cinder Driver, providing easy-to-deploy, scalable and highly available storage options for customers using OpenStack infrastructure with the Nimble Predictive Flash platform.

Integration of the Nimble Cinder Driver with Mirantis OpenStack allows customers to deploy OpenStack clouds with fast and reliable access to data. The combination of Mirantis OpenStack and the Nimble Predictive Flash platform delivers the foundation for enterprise customers to radically simplify cloud infrastructure deployments while meeting the demands of a dynamic and expanding user base. The Nimble Predictive Flash platform combines the power of InfoSight Predictive Analytics with a Unified Flash Fabric consolidation architecture to deliver flash performance and unparalleled availability by predicting and preventing issues.
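As a rough illustration, enabling a Cinder backend typically comes down to a few lines in `cinder.conf`. The section name, driver class path, and credential keys below are assumptions for illustration only; the authoritative values come from Nimble's and Mirantis' own deployment documentation.

```ini
# Hypothetical cinder.conf fragment -- backend name, driver path, and
# credential keys are illustrative, not taken from vendor documentation.
[DEFAULT]
enabled_backends = nimble-backend

[nimble-backend]
volume_backend_name = nimble
volume_driver = cinder.volume.drivers.nimble.NimbleISCSIDriver
san_ip = 192.0.2.10
san_login = admin
san_password = <secret>
```

Once the backend is registered, OpenStack tenants provision volumes through the normal Cinder API or dashboard without needing to know which array sits underneath.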

Read more


Software-Defined Storage Market to Reach $7 Billion by 2020 at 32% CAGR


The global software-defined storage (SDS) market 2016-2020 report says the rise of OpenStack will be a key trend for market growth, as OpenStack open-source cloud computing platforms, deployed in the form of Infrastructure as a Service (IaaS), help organizations manage their storage workloads in data centers.

These platforms are designed to control large pools of storage, compute, and networking resources in data centers through OpenStack APIs. Resources are managed through a dashboard that gives administrators control while empowering users to provision resources through a web interface.

Global SDS market to surpass $7 billion by 2020.

The growth of this market is spurred by the effective management of unstructured data. Analytics solutions, when integrated with SDS solution for big data management, reduce costs and boost business agility. The integration of big data with network file systems and rapid provisioning of analytics applications streamlines the management of unstructured data for business intelligence.

During 2015, the Americas accounted for around 55% of the overall market share, dominating the global SDS market. The rising demand for innovative IT architecture, fluctuating traffic patterns in networking infrastructure, and the rise of mobility technologies will fuel the growth of the SDS market in the Americas during the forecast period.

The analyst forecasts the global SDS market to grow at a CAGR of 31.62% during the period 2016-2020. According to the report, one of the key drivers for market growth will be cost reduction and efficiency.

Software-defined technology is poised to disrupt the traditional enterprise IT infrastructure model. Companies are under immense pressure to replace legacy IT infrastructure with innovative models that can cut costs. SDS provides a lean business model and minimizes costs by automating process controls and replacing traditional hardware with software.

The banking, financial services, and insurance (BFSI) segment accounted for around 18% of the overall market revenue, making it the key revenue-generating vertical in the global software-defined storage market. The use of SDS in a BFSI environment gives analysts sufficient time to plan and manage their data to comply with evolving government regulations. SDS also provides storage enhancement through new application offerings. It enables efficient storage allocation through well-defined governing policies and eases the process of access provision through well-defined security policies.

The following companies are the key players in the SDS market: EMC, HP, IBM, and VMware.

Other prominent vendors in the market are 6Wind, Arista Networks, Avaya, Big Switch Networks, Brocade, Cisco, Citrix, DataCore, Dell, Ericsson, Fujitsu, HDS, Juniper Networks, NEC, NetApp, Nexenta, Nutanix, Pertino, Pivot3, Plexxi and SwiftStack.

Read more



Hyper-Convergence

| Webopedia

Hyper-convergence (or hyperconvergence) is a type of infrastructure system that is largely software-defined, with tightly integrated compute, storage, networking and virtualization resources. This stands in contrast to a traditional converged infrastructure, where each of these resources is typically handled by a discrete component that serves a singular purpose.

Benefits of Hyper-Converged Infrastructure Systems

Hyper-converged infrastructure systems are designed to offer the following benefits:


Vendors of Hyper-Converged Infrastructure Systems

Some of the well-known established hyper-converged infrastructure vendors include Nutanix, SimpliVity and Scale Computing. VMware recently entered the hyper-converged infrastructure market as well with the launch of its EVO:RAIL and EVO:RACK hyper-converged offerings, which serve, respectively, as an all-in-one virtualization solution and a full-scale software-defined data center (SDDC) in a box.

Hyper-Converged System Vendors

Hyper-converged systems are typically delivered in a commodity hardware box supported by a single vendor. Some companies offer additional technologies in their hyper-converged infrastructure systems beyond compute, storage and networking resources, such as data deduplication, compression and WAN optimization.


Read more


3 critical questions to ask about hybrid cloud

| AT&T Networking Exchange

Cloud is no longer the future of computing. It’s here. And it’s rapidly changing the way you and other companies do business.

One of the trends that stands out to me is the movement to a hybrid cloud model. A hybrid cloud model uses multiple clouds for service: private environments, which mix on-premises and data center-supported resources, and public environments supported by third parties.

Industry statistics seem to support this trend, too. According to a February 2016 survey by Tech Pro Research, 36 percent of the enterprises that responded stated that they currently use a hybrid cloud model. Another 32 percent were considering it.

In its FutureScape: Worldwide Cloud 2016 Predictions, IDC projected that 80 percent of enterprises will commit to a hybrid cloud model by 2017. The report also predicted that by 2018, 65 percent of companies’ IT assets will be located offsite in colocation, hosting, and cloud data centers.

Hybrid clouds offer many advantages, among them agility and flexibility. Enterprises can manage their workloads on an application-by-application basis. This enables them to plan for a growing mobile workforce and allot resources to Internet of Things (IoT) applications and big data analytics.

Concerns about the cloud

But when I talk with customers about hybrid clouds, two concerns come up time and again: security and control.

A lot of IT decision-makers have told me they have security concerns. They worry about accessing cloud services via the public internet, and they don’t want to share multi-tenant environments with competitors. They know how complex it can be to protect a network that includes many mobile and IoT devices.

They’re also wary of giving up direct control of their data, applications, and infrastructure. But managing traditional IT infrastructure in a data center can be costly—real estate costs, vendor accreditation, annual certifications, and more.  Not to mention, cloud-based services are becoming more differentiated. If you’re not using them, your company could be falling behind.

If you share these concerns, don’t scrap your hybrid cloud strategy. Instead, ask these three questions. The answers can help you understand what services you need to develop a successful hybrid cloud strategy and which providers may best support you.

1. What network and data security policies do our cloud providers offer?

Security should be top of mind.

Start with the technology and systems that protect the cloud infrastructure. Then look at how data transmissions are encrypted and how that encryption defends data in transit.
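One concrete baseline for the data-in-transit question is TLS with certificate and hostname verification enabled, and a modern minimum protocol version. A minimal sketch using Python's standard library:

```python
import ssl

# A default context enables certificate verification and hostname checking,
# the baseline for protecting data in transit to a cloud provider.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # → True
print(ctx.check_hostname)                    # → True

# A client can also refuse protocol downgrades by pinning a minimum version.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

Whatever provider you evaluate, the question is whether its client tooling and endpoints enforce these properties by default, or leave them to the customer.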

2. What performance objectives can our cloud providers meet?

Uptime statistics reveal the health of your provider’s cloud. Plus, many critical business systems, such as customer relationship management solutions, rely on fast back-and-forth processing. They may not run smoothly over the public Internet.

3. Can we give our mobile workers secure remote access to our corporate cloud?

An access solution that compromises the mobile experience may be as bad as one that undermines security. If your security controls are hard to navigate, people may work around them to save time.

AT&T uses leading network, technology, and service expertise to deliver IT infrastructure as a service anytime, virtually anywhere, quickly and efficiently. We have invested in our data center business and AT&T Colocation services, allowing clients to extend their private networks more securely and economically.

Read more


Software Defined Deduplication is Critical to the Cloud

| Storage Swiss

The goal of any cloud initiative is to create a cost-effective, flexible environment. These architectures typically store large data sets for long periods of time, so one of the challenges to being cost-effective is the physical cost of storage. Deduplication is critical to extracting maximum value from a cloud-first initiative, but the cloud requires a different, more flexible software-defined implementation.

Why Do We Still Need Deduplication?

While the cost per GB of hard disk and even flash storage continues to plummet, when purchased in the quantities needed to meet a typical cloud architecture's capacity demands, storage remains the most expensive aspect of the design. And it's not just the per-GB cost; it is also the physical space each additional storage node consumes. Too many nodes can force the construction of a new data center, which is a much bigger cost concern than the price per GB of storage.

Deduplication provides a return on the investment by ensuring that the architecture stores only unique data. That not only reduces the capacity requirement but also shrinks the physical storage footprint.

Organizations will have different cloud strategies. A few may only use the public cloud. Some may only use private cloud architectures. Most, however, will take a hybrid approach, leveraging the public cloud when it makes sense and a private cloud when performance or data retention concerns force them to. In the hybrid model data should flow seamlessly and frequently between public and private architectures.

If the same deduplication technology is implemented in both the private and public cloud architectures, its understanding of the data can be leveraged to limit the amount of data that has to be transferred, making the network connection between the two more efficient because only unique data segments need to be sent.
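The segment-level transfer described here can be sketched with chunk hashing: each side indexes the hashes of segments it already holds, and only segments whose hashes are missing on the far side cross the network. Fixed-size chunks and SHA-256 are used below purely for simplicity; real systems typically use content-defined, KB-sized chunks.

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunks for illustration only

def chunks(data):
    return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]

def digest(chunk):
    return hashlib.sha256(chunk).hexdigest()

def segments_to_send(data, remote_hashes):
    """Return only the chunks whose hashes the remote side does not hold."""
    return [c for c in chunks(data) if digest(c) not in remote_hashes]

# The private cloud already holds one copy of this dataset...
existing = b"AAAABBBBCCCC"
remote_hashes = {digest(c) for c in chunks(existing)}

# ...so when a mostly identical dataset is replicated to the public cloud,
# only the genuinely new segment needs to be transferred.
updated = b"AAAABBBBDDDD"
to_send = segments_to_send(updated, remote_hashes)
print(to_send)  # → [b'DDDD']
```

Here two thirds of the updated dataset never crosses the wire, which is exactly the network saving a shared deduplication layer provides between the two halves of a hybrid cloud.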


Storage costs may eventually get low enough that deduplication is, well, redundant. But that day is not coming any time soon. Even if storage costs drop to that point, deduplication will not become obsolete: the greater density a deduplicated storage node achieves reduces the physical footprint of the cloud storage cluster. A hybrid cloud model also benefits from the network savings obtained by not transferring redundant data. Most critical, though, is that the technology be software-defined so that it can provide this functionality regardless of hardware or location.

Read more


In The “Second Wave” Of Cloud Computing, Hybrid Cloud Is The Innovator’s Choice


In the cloud business, there’s plenty of “tech talk” about cloud technology.

I don’t discount its value, but my view of cloud is a little different because my job deals with clients’ success in adopting the cloud. As a result, I get a daily view of what CIOs in organizations, department leaders and other decision makers experience when adopting cloud technology.

Here’s what I hear from other clients:

  • They believe in the cloud so much that they’re literally betting their businesses on it.
  • They’ve decided a hybrid cloud strategy is the right approach.
  • They are thinking less about infrastructure and more about innovation and speed — basically, the “second wave” of cloud computing, which includes analytics for structured and unstructured data, as well as security capabilities.

They also understand that cloud isn’t a destination, but rather a platform for innovation. It’s where they can dream big, start small, experiment and scale when successful. In these organizations, CIOs and Chief Technology Officers become advocates of “the art of the possible.”

Hybrid is the palette they're painting with, as best expressed by the analysts at Frost & Sullivan.

“At their core, successful hybrid cloud strategies support the delivery of high-value applications and services to the business, while at the same time driving cost and inefficiency out of the IT infrastructure,” the study said.

Fine, but how does adopting a hybrid cloud strategy support business success, particularly as we enter the era of cognitive computing?

Successful organizations provide the answer. They aren’t adopting cloud technology for its own sake. Instead, they’re pursuing a business strategy that’s equally about transformation and industry disruption.

They believe that cloud and cognitive technology will cause changes in customers’ experiences, vastly improve business processes and operations, and improve insight and innovation across all aspects of their companies.

In this second wave of cloud, where hybrid is the strategy of choice, it’s no longer only about cheap computing and storage. Instead, cloud has become the platform for innovation and business value. It is the IT delivery model that impacts an entire organization.

Read more


Hybrid Cloud will Enable Streams of Data to Flow Freely

| David H Deans

The world of business is changing. Disruptive shifts in power are forcing everyone to question the established status quo. In particular, savvy chief executives are tasking IT organizations to help create compelling customer experiences, support new business models and adopt agile operational processes.

That’s why the vast majority of IT and business leaders are joining forces on the organizations’ most pressing commercial needs and wants. They seek an open and flexible business technology foundation that’s an enabler, rather than an inhibitor, to meaningful and substantive workflow progress.

Innovation: Opportunities vs Challenges

Meanwhile, CIOs are still compelled to manage the legacy IT infrastructure, while supporting new demands for flexible systems that enable digital innovation. Leading organizations are already blending traditional IT and cloud infrastructures to achieve better business outcomes.

They’re planning to adopt more new technology, such as cloud-enabled video services, enterprise mobility apps, social business, IoT and big data analytics. They’re very busy efficiently supporting current business objectives with existing infrastructure, and yet they must ensure that IT drives strategic value for the company.

What’s more, they’re using Hybrid Cloud to springboard to next generation activities that allow them to capture new markets. But there are many others that are still assessing their immediate needs, while considering all the options. Embracing cloud is a work-in-progress. The journey begins with an understanding of the basics.

Cloud Services: Enabler of Disruption

Cloud is now viewed as an impetus for innovation and collaboration. CEOs identify business technology as the number-one factor they see impacting the success of their organization. For these forward-thinking leaders, technology isn’t just part of the infrastructure needed to implement a business strategy. It’s what makes entirely new disruptive strategies possible.

And without that technology in place to spark continual innovation, CEOs fear being left behind. As a result, CIOs foresee a huge shift in their own roles and responsibilities, as they evolve from an IT service provider to a business innovation enabler.

Often, the private versus public cloud question is not viewed as an either/or decision. Informed organizations are frequently opting to utilize both platforms – what’s typically called an Open Hybrid Cloud services environment.

Hybrid Cloud: Pathway to the Future

Hybrid cloud computing dramatically increases your ability to create, deploy and integrate new digital services quickly — allowing your organization to keep pace with a shifting economic landscape and increasingly competitive marketplace.

We define hybrid cloud as the secure consumption and integration of services from two or more sources, including private cloud, public cloud and/or traditional IT infrastructure.

A hybrid cloud allows access to data, applications and services where they are most optimally placed — whether on a public cloud, private cloud or on-premise within an existing IT infrastructure, such as the traditional enterprise data center.

A hybrid cloud approach typically encompasses a wide range of choices including service providers, delivery configurations and billing models. It’s designed with the flexibility to change and integrate environments, data storage and services as needed.

Once implemented, a hybrid cloud offers many benefits, including the ability to compose, orchestrate and manage diverse IT workloads, thereby exploiting the portability of stored data assets and associated software applications.


Read more


Bridging the Gap: Hybrid Cloud – Blurring the Lines Between Public and Private Cloud Capabilities

| DailyFT

The public cloud is often seen as something that sits outside the enterprise, but its capabilities can be brought in-house.

The benefits of cloud are now widely known: faster time to market, streamlined processes and flexible infrastructure costs. The public cloud model also provides rapid access to business applications and shared physical infrastructure owned and operated by a third party. It is quick to provision, and the utility billing model employed in a public cloud means companies only pay for the services they use. And, with expenses spread across a number of users, costs are kept under control.

This works especially well for certain business applications that support HR, Sales, Marketing and Support.  It is also ideal for training, development and testing – where there are sporadic bursts of activity.

Private cloud, on the other hand, offers a bespoke infrastructure dedicated to an individual business, either run on-premises or hosted within a data centre run by the cloud provider. This provides most of the benefits of the cloud – an agile, scalable and efficient cloud infrastructure – but with greater levels of control and security for the business than is offered by the public cloud, and as a result, often has a slightly higher level of cost.

A private cloud is often perceived to offer the best option for mission critical applications, or those that demand a higher level of customisation – something that can be more difficult to achieve in a public cloud environment. It can also reduce latency issues, as services are accessed over internal networks rather than the internet.

Bearing these factors in mind, a private cloud tends to work well for large and complex applications or organisations and those with stricter obligations around data and regulation.

Historically, customers have been faced with the dilemma of which model to use – public or private.  They’ve had to make a decision, one application at a time. This is mainly because public and private have had very different setups.

There has not been an easy way to seamlessly pick up workloads and move them back and forth between the private and public cloud. Each 'burst' or 'cross-over' from on-premise to on-cloud (or vice versa) requires different provisioning code, security profiles, network configurations, testing and automation tools. It's just too difficult!

Fortunately, when considering the move to cloud, it doesn’t have to be an either/or decision anymore: hybrid cloud enables companies to utilise a mixture of both, and is giving organisations new strategic options. It is about providing the exact same infrastructure, security policies and toolsets, and, at the very last stage, choosing a deployment option – either on-premise or on-cloud.

One of the key benefits of operating a hybrid cloud is that it enables users to move applications and workloads between environments depending on demand, business needs and other variables. This mixed approach means businesses can rapidly respond to operational developments — for example using public cloud to quickly and cost-effectively develop and test new applications, before moving them back behind the firewall as they go into production.

It also means more (if not all) of a company’s applications are now ready to take advantage of the benefits of being deployed on a cloud – even if it’s the private cloud to start with.

This is now possible thanks to an evolution in the cloud computing space — the Public Cloud Machine — which uses the same software and hardware as the public cloud to bring the capabilities on-premise, meaning businesses can exploit the power of public cloud infrastructure while having the extra control that in-house data centres provide.

Essentially, it means organisations can address specific business or regulatory requirements, as well as those around data control and data location, while being able to tap into the perceived benefits of the public cloud: agility and pay-as-you go billing.

The hybrid cloud is set to become a business-as-usual expectation from companies.  Oracle is leading with the Public Cloud Machine, getting customers ahead of the curve.

By being able to blur the lines between where one cloud begins and another ends, companies can gain the ultimate flexibility of cloud, become more agile than their competitors and be in a better position to rapidly respond to changing needs and an increasingly competitive environment.


Read more