Data Efficiency in the News


Global Data Center Storage Market Is Expected to Reach US$ 29 Billion by 2021

| SBWire

The growth of the data center storage market worldwide is notable because the market is not only growing but also witnessing a paradigm shift across its segments and sub-segments. Increasing internet penetration and the use of multiple devices for professional and personal purposes have led to exponential data growth. Regulatory requirements that oblige enterprises to retain data for longer periods have further enlarged data center storage footprints. Vendors are pursuing a number of innovations to address storage challenges such as security, device power consumption, processing, scalability, and compatibility among different vendors' data center equipment.

The analysts at the publisher have analysed growth on both the demand and supply sides of the data center storage market. They expect the market to reach almost US$48.6 billion in 2021, growing at a CAGR of 14.4%. Most of the growth is expected to come from the SAN segment, followed by NAS. The report also covers a detailed analysis of the market size and growth forecast for the SAN and NAS segments.

Data Center Storage Market- Trends, Drivers and Restraints
The most prominent emerging trend in data centers is the use of flash storage, also known as SSD storage. The market will see continued adoption of both flash and traditional storage systems, with flash likely to add more revenue in the coming years. Each type of system has its own advantages and disadvantages when it comes to implementation. Vendors in the storage market are offering hybrid SAN-NAS solutions, through which businesses can merge block- and file-based data onto a common array. Increased use of software-defined data center (SDDC) storage and the adoption of combined SAN-NAS systems are among the emerging trends covered in detail in the report. In addition, the emergence of ubiquitous data reduction technologies will make data centers denser and more efficient.

Furthermore, the boom in big data analytics has triggered widespread technological progress in the data center storage arena, generating demand for storage devices to hold data and for servers to run real-time applications. Aggressive campaigns to convince SMEs and SMBs to move from on-premises to cloud-based storage have also created huge demand for data center storage. Data center downtime remains one of the critical challenges every facility faces, whether in the form of outages caused by power shortages, seismic activity, fluid leaks, or security threats.

Data Center Storage Market- Geographic Analysis
The report includes market analysis of different regions such as North America, Latin America, APAC, Europe and MEA, outlining the major market share holders and the market size of each region. The North American market has shown significant growth in shipments of SAN products compared to other geographies, largely because almost 30% of large data centers are upgrading their storage for the next 10 years, adding almost 3-4 times more capacity than they currently have available. APAC accounts for more than 28% market share and is expected to be the market leader by 2021 with a share of almost 34%. This growth is primarily due to the construction of new data centers in Southeast Asia and China, such as Google's data centers in Singapore and Microsoft's data center in India. SANs are the major revenue contributors in these regions.

Newly constructed data centers in the Nordic region account for most of the growth in IT equipment in Europe. The Nordics have emerged as the preferred location for new data center construction thanks to free cooling and low OPEX. The Latin America and MEA markets will take another 3-4 years to show significant growth in terms of both shipment and deployment.

Data Center Storage Market – Key Vendors and Market Share
This market research profiles the major companies in the global data center storage market and provides the competitive landscape and market share of the key players. The report covers the entire market outlook, including the value chain operating within the market.

The Major Vendors include:
Dell EMC, NetApp, HP, IBM, Dell, Hitachi Data Systems.

The Emerging Vendors are:
Huawei, Fujitsu, DataDirect Networks, Nimble Storage, NEC.

Other Prominent Vendors included in the report are:
American Megatrends, Lenovo, Nfina, Nimbus Data, Overland Storage, Oracle, Pure Storage, Promise Technology, Quanta Computer, Netgear, Tegile, Tintri, Toshiba, Violin Memory, X-IO Technologies, Supermicro.

Read more


Permabit pulls on Red Hat, opens arms for a Linux cuddle

| The Register

Crimson headcover kernel gets dedupe and compression

The Mad Hatter of Linux is getting Alice in Wonderland style physical space virtualisation with thin provisioning, compression and deduplication courtesy of Permabit.

Building on its June 2016 Linux availability announcement, Permabit and Red Hat have a go-to-market partnership based on the former's Albireo deduplication and HIOPS compression technology being added as a kernel module to Red Hat Linux. Up until now, dedupe and compression have largely been storage array features, later appearing in software-only storage running on servers and direct-attached disks, SSDs or JBODs.

Against that background Permabit has had somewhat limited success as a supplier to OEMs of its Virtual Data Optimizer (VDO) dedupe and compression technology, with potential customers largely preferring to build their own dedupe tech. Its most prominent OEM is probably HDS for file storage, via its BlueArc acquisition. Now that RHEL, via Permabit's VDO, has its own kernel-level dedupe and compression, any attached storage can get the benefit of it.

Permabit CEO Tom Cook is especially keen on the colo angle here. Take a cloud service provider or general colocation operator fitting up its facility with racks of Linux-running servers and storage trays. If it can reduce its storage capacity needs by, say, 25 per cent for a year, and then 25 per cent the next year and so on, that removes a significant slug of cost from its annual budgets; that's the way Cook sees it, and he has spreadsheet models and charts to back up his case.

Here’s a chart for a Linux Ceph storage setup, assuming a 2.5:1 data reduction rate and suggesting savings of $370,000 over 5 years with Permabit data reduction installed:

[Chart: Permabit Linux Ceph savings]
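The arithmetic behind this kind of savings model is easy to reproduce. Below is a minimal sketch in Python; the starting capacity, per-TB cost and annual growth rate are illustrative assumptions, not figures from the chart, and only the 2.5:1 reduction ratio comes from the article:

```python
# Illustrative five-year storage-spend model with and without data reduction.
# Assumptions (not from the article): $250 per raw TB deployed,
# 100 TB starting capacity, 30% annual data growth, 2.5:1 reduction.
COST_PER_TB = 250.0   # USD per raw terabyte deployed
GROWTH = 1.30         # 30% annual data growth
REDUCTION = 2.5       # data reduction ratio from the article

def five_year_spend(start_tb, reduction=1.0):
    """Total hardware spend over 5 years, buying capacity for each year's data."""
    total = 0.0
    data = start_tb
    for _ in range(5):
        total += (data / reduction) * COST_PER_TB  # raw TB needed after reduction
        data *= GROWTH
    return total

baseline = five_year_spend(100)
reduced = five_year_spend(100, REDUCTION)
print(f"Baseline: ${baseline:,.0f}; with 2.5:1 reduction: ${reduced:,.0f}")
print(f"Five-year savings: ${baseline - reduced:,.0f}")
```

With these made-up inputs the model yields savings of roughly 60% of the baseline spend; plugging in a real facility's capacity, cost and growth figures is how a chart like the one above is built.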

Permabit’s VDO runs anywhere RHEL runs – in physical servers, in virtual ones and in the public cloud – and enables Red Hat to compete against suppliers of deduping server operating systems, virtual server/storage systems, OpenStack and deduping storage arrays, according to Permabit. It typically provides 2.5:1 data reduction for unstructured data and up to 10:1 reduction for VM images.

VDO works with Ceph and Gluster and it’s payable via a subscription license starting at $199/year for 16TB. It’s available through Permabit resellers and system integrators. ®
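Taking the quoted subscription price together with the typical reduction ratios above, it is straightforward to estimate the license cost per logical terabyte actually stored. A quick back-of-the-envelope sketch (only the $199/16TB price and the 2.5:1 and 10:1 ratios come from the article):

```python
# Effective license cost per logical terabyte under data reduction.
# A $199/year subscription covers 16 TB of physical storage; with an
# R:1 reduction ratio, that physical pool holds R * 16 TB of logical data.
LICENSE_COST = 199.0   # USD per year, per article
PHYSICAL_TB = 16.0     # physical TB covered by one subscription

def cost_per_logical_tb(reduction_ratio):
    logical_tb = PHYSICAL_TB * reduction_ratio
    return LICENSE_COST / logical_tb

print(f"Unstructured data (2.5:1): ${cost_per_logical_tb(2.5):.2f}/TB/yr")
print(f"VM images (10:1):          ${cost_per_logical_tb(10):.2f}/TB/yr")
```

At the article's typical ratios the license works out to under $5 per logical terabyte per year for unstructured data, and considerably less for highly redundant data such as VM images.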

Read more


Microsoft Fortifies Commitment to Open Source, Becomes Linux Foundation Platinum Member

| Home | The Linux Foundation

The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, today announced that Microsoft has joined the organization as a Platinum member during Microsoft’s Connect(); developer event in New York.

From cloud computing and networking to gaming, Microsoft has steadily increased its engagement in open source projects and communities. The company is currently a leading open source contributor on GitHub and earlier this year announced several milestones that indicate the scope of its commitment to open source development. The company released the open source .NET Core 1.0; partnered with Canonical to bring Ubuntu to Windows 10; worked with FreeBSD to release an image for Azure; and after acquiring Xamarin, Microsoft open sourced its software development kit. In addition, Microsoft works with companies like Red Hat, SUSE and others to support their solutions in its products.

“As a cloud platform company we aim to help developers achieve more using the platforms and languages they know,” said Scott Guthrie, Executive Vice President, Microsoft Cloud and Enterprise Group. “The Linux Foundation is home not only to Linux, but many of the community’s most innovative open source projects. We are excited to join The Linux Foundation and partner with the community to help developers capitalize on the shift to intelligent cloud and mobile experiences.”

Microsoft already contributes to several Linux Foundation projects, including Node.js Foundation, OpenDaylight, Open Container Initiative, R Consortium and Open API Initiative.

John Gossman, Architect on the Microsoft Azure team, will join The Linux Foundation Board of Directors.

“Microsoft has grown and matured in its use of and contributions to open source technology,” said Jim Zemlin, Executive Director of The Linux Foundation. “The company has become an enthusiastic supporter of Linux and of open source and a very active member of many important projects. Membership is an important step for Microsoft, but also for the open source community at large, which stands to benefit from the company’s expanding range of contributions.”

To view a full roster of Linux Foundation members, please visit http://www.linuxfoundation.org/about/members.

About The Linux Foundation

The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at www.linuxfoundation.org.

Read more


On-Premises Can Be Just as Effective as On-Cloud

| itbusinessedge.com

There are plenty of good reasons to push the enterprise infrastructure model onto the cloud. It is cheaper and more flexible, and it is becoming increasingly secure and reliable as both hardware and software architectures evolve.

But there are plenty of good reasons to maintain a certain amount of on-premises infrastructure as well, and as time goes by, the deployment and management of that infrastructure will become vastly less complicated and expensive than what you find in today’s bloated data center.

Even Amazon is starting to rethink the all-cloud strategy that has driven its system development over the years. The company undoubtedly still thinks it provides the optimum environment for the bulk of enterprise workloads, but with releases like the recent Amazon Linux AMI (Amazon Machine Image), it also recognizes that some infrastructure will remain in the enterprise, at least for the time being. The system, according to E-Commerce Times, allows users to deploy the Linux AMI on internal resources, essentially providing tighter integration between local infrastructure and Amazon data centers. In this way, even high-performance applications can maintain a stable and secure link across hybrid deployments which, according to Amazon, would be most useful when developing and testing new apps and workloads at home before deploying them in the cloud.

In addition, with Linux there is the ability to deploy data reduction on-premises, in the cloud, or both. Simply employ Permabit VDO for Data Centers in both environments and you can reduce data costs, increase data density and minimize network bandwidth consumption.

Whether on-cloud or on-premises, however, the final call will be made by the enterprise, which will now have to take into consideration its workload and application requirements, cost factors, performance goals, and a host of other elements for each and every deployment.


Read more


Rackspace Announces Support for Red Hat Ceph Storage 2

| Marketwired

Rackspace today announced support for Red Hat Ceph Storage 2, expanding its Rackspace Private Cloud powered by Red Hat portfolio of services. This new version from Red Hat is based on the 10.2 Jewel community release of Ceph and addresses key stability requirements for enterprise customers.

Data storage growth is both a reality and a burden for today’s enterprises. With the move to digital and data intensive workloads, enterprises are storing an increasing amount of data.

With Rackspace Private Cloud powered by Red Hat support for the Red Hat Ceph Storage 2 release, customers can gain the performance benefits of software-defined storage without taking on the burden of having to deploy, operate, monitor and troubleshoot their OpenStack storage environment.

This new release is the most stable version yet of Ceph and provides numerous features and benefits to enterprise users including:

  • Exabyte-level scalability
  • Open APIs
  • Security
  • Reliability and availability
  • Multi-Datacenter Support
  • Performance
  • Cost effectiveness

In addition, Permabit data reduction can be deployed with Ceph to effectively reduce storage consumption and cost. Data reduction can increase data density, enabling more efficient use of Rackspace resources and lowering costs for each customer deployment.
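To see why data reduction has an outsized effect in a Ceph cluster, note that Ceph typically replicates each object, so every logical terabyte saved avoids multiple raw terabytes. A rough sketch, assuming Ceph's common default of 3x replication and the 2.5:1 reduction ratio Permabit cites elsewhere for unstructured data (both figures are assumptions about a particular deployment, not Rackspace's):

```python
# Raw Ceph capacity needed for a given amount of user data,
# with and without data reduction applied ahead of replication.
# Assumptions: 3x replicated pools, 2.5:1 data reduction.
REPLICATION = 3
REDUCTION = 2.5

def raw_tb_needed(user_tb, reduction=1.0):
    """Raw cluster capacity required to hold user_tb of logical data."""
    return user_tb / reduction * REPLICATION

no_vdo = raw_tb_needed(100)              # 100 TB of data -> 300 TB raw
with_vdo = raw_tb_needed(100, REDUCTION) # same data -> 120 TB raw
print(f"Without reduction: {no_vdo:.0f} TB raw; with reduction: {with_vdo:.0f} TB raw")
```

Because replication multiplies every stored byte, the 2.5:1 reduction translates into a 180 TB raw-capacity saving in this hypothetical 100 TB example.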


Read more


Permabit and Calsoft announce reseller agreement for storage and cloud solutions

| PR Newswire

Permabit Technology Corporation, the open source data reduction experts, and Calsoft Pvt. Ltd., a leading IT services company specializing in the storage, networking, virtualization and cloud business verticals, have entered into a reseller agreement for Permabit VDO. The agreement will enable Calsoft to deliver industry-leading data reduction solutions via its consulting and systems integration services for storage and cloud solutions.

Calsoft will deploy Permabit VDO in its storage solutions for software-defined data centers and storage in addition to cloud solutions for public, private and hybrid clouds.

“We selected Permabit because it is a solid data reduction offering that spans OEM and ODM deployments in Linux. Permabit data reduction complements our existing efforts in storage, cloud development and integration services,” said Anupam Bhide, CEO of Calsoft Pvt. Ltd.  “With Permabit VDO we will be able to deploy across Red Hat, Ubuntu and CentOS based solutions enabling our clients to optimize data center efficiency, increasing data density while avoiding costly expansion.”

The leader in data reduction technology, Permabit Technology Corporation recently announced the latest release of its Virtual Data Optimizer (VDO) software, VDO 6 – the only modular data reduction solution available for the Linux block storage stack.  VDO delivers the company’s patented deduplication, HIOPS Compression™ and thin provisioning in a commercial software package for Linux for enterprise hybrid cloud data centers and cloud service providers.

“Calsoft excels in software development and the delivery of integration services for complex data center storage and cloud solutions,” said Tom Cook, Permabit CEO and President. “We are thrilled to welcome Calsoft to Permabit’s reseller network and are fully committed to their success.”

To learn more about Permabit VDO Data Reduction software visit:
http://permabit.com/products-overview/albireo-virtual-data-optimizer-vdo/

About Calsoft

Calsoft is a leading software product engineering services company specializing in the Storage, Networking, Virtualization and Cloud business verticals. Calsoft provides End-to-End Product Development, Quality Assurance, Sustenance, Solution Engineering and Professional Services expertise to assist customers in achieving their product development and business goals. For more information, visit http://www.calsoftinc.com

Follow Calsoft on Twitter: https://twitter.com/CalsoftInc

And/or on LinkedIn: https://www.linkedin.com/company/calsoft

About Permabit:

Permabit pioneers the development of data reduction software that provides data deduplication, compression, and thin provisioning. Our innovative products enable customers to get to market quickly with solutions that cut effective cost, accelerate performance, and gain a competitive advantage. Just as server virtualization revolutionized the economics of compute, Permabit software is transforming the economics of storage today.

Permabit is headquartered in Cambridge, Massachusetts with operations in California, Korea and Japan. For more information, visit www.permabit.com.

Follow Permabit on Twitter: https://twitter.com/Permabit

And/or on LinkedIn: https://www.linkedin.com/company/permabit

Read more


Cloud adoption keeps moving ahead

| Computerworld

Companies continue their migration of both applications and computing infrastructure to the cloud at a steady pace. They have moved 45% of their applications and computing infrastructure to the cloud already, and they expect well over half of their IT environment to be cloud-based by 2018, according to a recent IDG Enterprise survey of 925 IT decision makers.

On average, IT decision-makers (ITDMs) plan to allocate more than a quarter of their total IT budgets to cloud spending, but organizations with fewer than 1,000 employees are making significantly different choices than larger enterprises in how they spend that money.

The four main drivers moving IT decision makers to cloud computing are:

  1. Lower total cost of ownership
  2. Replacing on-premise legacy systems
  3. Enabling business continuity
  4. Speed of development

That’s not to say there aren’t concerns about the move to cloud; ITDMs name the following as their top three worries in each of the major cloud models:

Public cloud

  • 43% Where data is stored
  • 41% Security
  • 21% Vendor lock-in

Private cloud

  • 24% Vendor lock-in
  • 22% Lack of appropriate skills
  • 21% Security concerns

Hybrid cloud

  • 24% Security
  • 19% Where data is stored
  • 18% Lack of appropriate skills

In addition, the need to optimize public cloud, hybrid cloud and private cloud data and its costs cannot be overstated. Applying data reduction to a Linux implementation in any of these deployment models will drive down TCO and improve operating efficiency. Data reduction can now be deployed in almost any Linux OS!

Read more


Top Priority for IT Investments: Improve Service to Quickly Meet Business Needs

| Stock Market

The research, conducted across 125 IT decision makers in the US, revealed the number one priority for IT investments: Improve service to quickly meet business needs. Reducing risk was the second major priority, and the third, as stated by respondents, was realizing higher levels of performance to support mission-critical applications.

“The findings of this research clearly indicate that the number one priority for IT decision makers is ensuring that IT becomes an enabler of business and not a hindrance,” said Joshua Yulish, president and CEO of TmaxSoft, Inc. “To this end, IT must provide open systems that afford greater flexibility and speed at a lower cost. Businesses are looking to not only improve service to respond to business needs, but innovate faster and realize higher levels of performance to support key objectives.”

Dave Lasseter, VP Power Systems Sales at Mainline, an IBM and TmaxSoft partner, added: “These findings mirror what we are seeing in the market today. IT must take the initiative in delivering solutions and services that support innovation and enable the business to adapt to changes in strategy, market conditions, and regulatory requirements.”

Key findings include:

  • The top priority among 24% of respondents was improving service to dynamically respond to business needs.
  • The second most-cited top priority was ensuring uptime, cited by 21%, and the third was the need to reduce administrative cost and burden by consolidating systems, cited by 19% of respondents.
  • The top-rated second priority for IT decision makers was reducing risk (identified by 21% of the sample), followed by realizing higher levels of performance to support mission-critical applications (18% of respondents).

Open systems are not the only requisite. There is also a need for Linux-based data reduction that can deliver enterprise-wide operating and storage efficiency. This will result in lower data bandwidth needs, improved data density (fewer servers and storage devices) and a reduced data center footprint, all of which improve operating efficiency.


Read more


OpenStack expands both its customer reach and deployment size

| ZDNet

In 451 Research’s recent report on OpenStack adoption among enterprise private cloud users, they found that 72 percent of OpenStack-based clouds are between 1,000 and 10,000 cores and three-fourths choose OpenStack to increase operational efficiency and app deployment speed.

They also found that OpenStack is not just for large enterprises. Two-thirds of respondents (65 percent) are in organizations of between 1,000 and 10,000 employees.

The survey also uncovered that OpenStack-powered clouds have moved beyond small-scale deployments. Approximately 72 percent of OpenStack enterprise deployments are between 1,000 to 10,000 cores in size. Additionally, 5 percent of OpenStack clouds among enterprises top the 100,000-core mark. So, while OpenStack may be expanding its reach into smaller companies, it’s being used for larger deployments.

Curiously, OpenStack users are adopting containers at a faster rate than the rest of the enterprise market, with 55 percent of OpenStack users also using containers, compared to 17 percent of other cloud users. What’s odd about this, as Mark Shuttleworth, founder of Canonical and Ubuntu, pointed out to me at an OpenStack Summit meeting, is that OpenStack is not especially well-suited for containers.

Well, not yet anyway. But, it will be with companies both moving the technology forward and customers demanding it.

OpenStack is also moving along to real enterprise workloads rather than just testing and development work. These include infrastructure services (66 percent), business applications and big data (60 percent and 59 percent, respectively), and web services and ecommerce (57 percent).

You’ll also find OpenStack clouds running in a wide variety of businesses. While 20 percent cited the technology industry, manufacturing (15 percent), retail/hospitality (11 percent), professional services (10 percent), healthcare (7 percent), insurance (6 percent), transportation (5 percent), communications/media (5 percent), wholesale trade (5 percent), energy and utilities (4 percent), education (3 percent), financial services (3 percent), and government (3 percent) were all represented.

Why are so many businesses across so many industries adopting OpenStack? Simple. Increasing operational efficiency and accelerating innovation/deployment speed are the top business drivers for enterprise adoption of OpenStack, at 76 and 75 percent, respectively. Supporting DevOps is a close second, at 69 percent. Reducing cost and standardizing on OpenStack APIs follow, at 50 and 45 percent, respectively. Alongside operational efficiency, data efficiency is also becoming table stakes in today’s data center: not just to reduce effective storage costs but also to increase data density, which reduces or eliminates data center expansion. The bottom line is that OpenStack and its efficiency impact help the business bottom line, and that is why adoption is increasing.

“Our research in aggregate indicates enterprises globally are moving beyond using OpenStack for science projects and basic test and development to workloads that impact the bottom line,” said Al Sadowski, 451 Research’s research vice president. “This is supported by our OpenStack Market Monitor which projects an overall market size of over $5 billion in 2020 with APAC, namely China, leading the way in terms of growth.”

Mark Collier, COO of the OpenStack Foundation, agreed, “The research [is] telling us that OpenStack is not merely an interesting technology, but it’s a cornerstone technology. Companies are using OpenStack to do work that matters to their businesses, and they’re using it to support their journey to a changing landscape in which rapid development and deployment of software is the primary means of competitive advantage.”

Read more