Deploying VDO Data Reduction on Red Hat Atomic Host

| By: (56)

All of the buzz about containers is a bit surprising to many people who’ve watched operating system technology evolve over the years. After all, many of the core concepts behind running isolated applications on a shared OS have been around on UNIX for over 20 years. So what’s so exciting? Well, to understand the container revolution you first have to look at Virtual Machines (VMs) and their impact on the…

Read more


Data Center Optimization: How to Do More Without More Money

| Data Center Knowledge

Data centers are pushing the boundaries of the possible, using new paradigms to operate efficiently in an environment that continually demands more power, more storage, more compute capacity… more everything. Operating efficiently and effectively in the land of “more” without more money requires increased data center optimization at all levels, including hardware and software, and even policies and procedures.

Although cloud computing, virtualization and hosted data centers are popular, most organizations still have at least part of their compute capacity in-house. According to a 451 Research survey of 1,200 IT professionals, 83 percent of North American enterprises maintain their own data centers. Only 17 percent have moved all IT operations to the cloud, and 49 percent use a hybrid model that integrates cloud or colocation hosts into their data center operations.

The same study says most data center budgets have remained stable, although the heavily regulated healthcare and finance sectors are increasing funding throughout data center operations. Among enterprises with growing budgets, most are investing in upgrades or retrofits to enable data center optimization and to support increased density.

As server density increases and the data center footprint shrinks, any gains may be taken up by the additional air handling and power equipment, including uninterruptible power supplies and power generators. In fact, data center energy usage is expected to increase by 81 percent by 2020, according to CIO magazine.

Identifying and decommissioning unused servers is often a challenge during a data center optimization project, as is right-sizing provisioning.

Virtualization makes it easy to spin up resources as needed, but it also makes tracking those resources harder. The result is that unused servers may be running because no one is certain they’re not being used. A study by the Natural Resources Defense Council and Anthesis reports that up to 30 percent of servers are unused, but still running.

A similar principle extends to storage. While data deduplication (removing duplicate copies of data) is widely used, overcrowded storage remains an issue for small to medium-sized enterprises (SMEs). Deduplication can free much-needed storage space. For example, data deduplication combined with compression can shrink data storage consumption by up to 85%. This not only addresses the budget issues mentioned above but also improves data density, much like the server density mentioned earlier. Imagine saving money with less storage while increasing your data density at the same time. Looks like a win-win!
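
To put that 85% figure in perspective, here is a back-of-the-envelope sketch in Python of what it means for capacity planning. The 100 TB workload size is a hypothetical example, not a figure from the article.

```python
# Back-of-the-envelope arithmetic for the savings claimed above.
# The 85% reduction rate comes from the article; the 100 TB workload
# is a hypothetical example for illustration only.

def effective_ratio(reduction_rate: float) -> float:
    """Convert a fractional reduction rate (e.g. 0.85) to an N:1 ratio."""
    return 1.0 / (1.0 - reduction_rate)

logical_data_tb = 100.0   # hypothetical: 100 TB of application data
reduction_rate = 0.85     # "up to 85%" reduction, per the article

physical_tb = logical_data_tb * (1.0 - reduction_rate)
print(f"reduction ratio: {effective_ratio(reduction_rate):.1f}:1")  # ~6.7:1
print(f"physical capacity needed: {physical_tb:.0f} TB instead of "
      f"{logical_data_tb:.0f} TB")
```

In other words, an 85% reduction rate is roughly a 6.7:1 ratio, so the same hardware holds more than six times as much data.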

If data center optimization is concerned with saving money, managers also should examine their purchasing programs. NaviSite, for example, found cost efficiencies in volume projects by scrutinizing large commodity items like cabinets, racks, cabling and plug strips, and by eliminating middlemen whenever possible. For big purchases, go directly to the manufacturers in China and seek out innovative young technology vendors, working with them to design specifications that significantly lower the price.

Data center optimization, clearly, extends beyond hardware to become a system-wide activity. It is the key to providing more power, more capacity and more storage without requiring more money.

This article is quite long; you may want to read the full source article by clicking on the link below:

 

Read more


Enterprise Storage: Extensive Growth Opportunities by 2026

| openPR.com

With the increased focus on virtualization and cost of operations; simplicity and convergence; and the cloud, enterprises are moving from traditional enterprise storage systems to software-defined storage and cloud storage to provide cost-effective, real-time storage services. As a result, the traditional enterprise storage systems market has declined over the past few years.

Most enterprises are implementing cloud-based storage systems because of their low cost and greater agility, and some companies follow a hybrid cloud strategy in which traditional and cloud storage are used together. This approach fuels demand for both traditional enterprise storage systems and cloud storage systems, allowing critical workloads to be managed securely.

Enterprises are seeking more efficient storage systems, as the increasing focus on digitization creates huge amounts of data and fuels demand for innovative storage solutions. Smaller enterprises tend to drive the cloud storage market, while large enterprises drive the hybrid storage approach.

A significant tool for containing storage costs in any cloud or hybrid cloud is data reduction technology, which can be easily deployed in any cloud environment. Permabit VDO delivers data reduction of up to 85% in public, private or hybrid clouds.

The rise in the volume of structured and unstructured data, along with the need to back up and archive files at reduced cost, also propels market growth for enterprise storage systems.

By offering better prices, reducing infrastructure and management costs, and providing enhanced security features, the enterprise storage systems market is expected to see growth in the future.

The enterprise storage systems market is segmented on the basis of storage type and region.

Read more


Data Reduction Technologies Reduce the Costs of Cloud Deployment

| Stock Market

Enterprise IT organizations use cloud architectures to rapidly deploy resources and lower costs. By incorporating data reduction technologies in their architectures, organizations will accelerate deployment and reduce IT expenditures, say experts at Permabit Technology Corporation.

Data reduction is ideal for today’s enterprises choosing cloud-based deployments. With data reduction, organizations increase their agility and reduce costs, since the technology shrinks the footprint of data in transit and at rest. When data reduction is deployed at the operating system level, it can be used with public cloud services or deployed in a company’s own private cloud.

“Organizations are under pressure to deliver digital transformation while reducing IT costs and are looking more and more to cloud as an answer,” said Tom Cook, Permabit CEO. “Our Virtual Data Optimizer (VDO) is the best and easiest way to deploy data reduction in every cloud deployment model.”

Permabit VDO provides the three key data reduction technologies needed to maximize storage savings: thin provisioning, data deduplication and compression. Implemented as a driver for the Linux device mapper, VDO operates at the same level in the Linux kernel as core data management services such as virtualization, data protection and encryption. VDO data reduction just works, regardless of whether the storage layers above are providing object-, block-, compute- or file-based access.
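
As a rough conceptual illustration of how inline block-level deduplication and compression interact, consider the toy Python model below. It is a sketch only: VDO itself is a Linux kernel module, and its actual on-disk structures and algorithms are nothing like this simplified dictionary-based store.

```python
# A deliberately simplified model of inline block-level data reduction:
# incoming 4 KB blocks are deduplicated by content hash, and unique
# blocks are compressed before being "written". Conceptual sketch only;
# this is not how VDO is actually implemented.
import hashlib
import zlib

BLOCK_SIZE = 4096

class TinyDedupStore:
    def __init__(self):
        self.blocks = {}          # content hash -> compressed block
        self.logical_bytes = 0

    def write(self, data: bytes) -> None:
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE].ljust(BLOCK_SIZE, b"\0")
            self.logical_bytes += BLOCK_SIZE
            digest = hashlib.sha256(block).digest()
            if digest not in self.blocks:          # dedup: store once
                self.blocks[digest] = zlib.compress(block)

    @property
    def physical_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())

store = TinyDedupStore()
store.write(bytes(BLOCK_SIZE) * 8)        # eight identical blocks -> one stored
store.write(b"unique payload " * 300)     # mostly unique, still compressible
print(store.logical_bytes, "logical bytes ->",
      store.physical_bytes, "physical bytes")
```

The design point the sketch illustrates is why operating at the block layer makes the reduction transparent: anything above it simply reads and writes blocks, unaware that duplicates were collapsed and the survivors compressed.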

 

Read more


Federal Agencies Optimize Data Centers by Focusing on Storage using Data Reduction

| fedtechmagazine.com

In data centers, like any piece of real estate, every square foot matters.

“Any way we can consolidate, save space and save electricity, it’s a plus,” says the State Department’s Mark Benjapathmongkol, a division chief of the agency’s Enterprise Server Operation Centers.

In searching out those advantages, the State Department has begun investing in solid-state drives (SSDs), which provide improved performance while occupying substantially less space in data centers.

In one case, IT leaders replaced a disk storage system with SSDs and gained almost three racks’ worth of space, Benjapathmongkol says. Because SSDs are smaller and denser than hard disk drives (HDDs), IT staff don’t need to deploy extra hardware to meet speed requirements, resulting in massive space and energy savings.

Options for Simplifying Storage Management

Agencies can choose from multiple technology options to manage their storage more effectively and efficiently, says Greg Schulz, founder of independent analyst firm Server StorageIO. These options include SSDs and cloud storage; storage features such as deduplication and compression, which eliminate redundancies so data consumes less capacity; and thin provisioning, which makes better use of available space, Schulz says.
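
For readers unfamiliar with thin provisioning, the toy Python model below sketches the core idea: a volume advertises a large logical size but consumes physical space only for blocks that are actually written. The class name and sizes are illustrative, not drawn from any product mentioned here.

```python
# A minimal model of thin provisioning: space is consumed on write,
# not at volume creation time. Illustrative sketch only.

BLOCK = 4096

class ThinVolume:
    def __init__(self, logical_blocks: int):
        self.logical_blocks = logical_blocks
        self.allocated = {}               # logical block number -> data

    def write(self, lbn: int, data: bytes) -> None:
        if not 0 <= lbn < self.logical_blocks:
            raise ValueError("write beyond logical size")
        self.allocated[lbn] = data        # physical space allocated only here

    def read(self, lbn: int) -> bytes:
        return self.allocated.get(lbn, bytes(BLOCK))  # unwritten reads as zeros

vol = ThinVolume(logical_blocks=1_000_000)   # advertises ~4 GB
vol.write(0, b"x" * BLOCK)
vol.write(42, b"y" * BLOCK)
print(f"logical: {vol.logical_blocks * BLOCK} bytes, "
      f"allocated: {len(vol.allocated) * BLOCK} bytes")
```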

Consider the Defense Information Systems Agency. During the past year, the combat support agency has modernized its storage environment by investing in SSDs. Across DISA’s nine data centers, about 80 percent of information is stored on SSD arrays and 20 percent is running on HDDs, says Ryan Ashley, DISA’s chief of storage.

SSDs have allowed the agency to replace every four 42U racks with a single 42U rack, resulting in 75 percent savings in floor space as well as reduced power and cooling costs, he says.

Deduplication Creates Efficiencies

Besides space savings and the fact that SSDs are faster than HDDs, SSDs bring additional storage efficiencies. These include new management software that automates tasks, such as provisioning storage when new servers and applications are installed, Ashley says.

The management software also allows DISA to centrally manage storage across every data center. In the past, the agency used between four and eight instances of management software in individual data centers.

“It streamlines and simplifies management,” Ashley says. Automatic provisioning reduces human error and ensures the agency follows best practices, while central management eliminates the need for the storage team to switch from tool to tool, he says.

DISA also has deployed deduplication techniques to eliminate storing redundant copies of data. IT leaders recently upgraded the agency’s backup technology from a tape system to a disk-based virtual tape library. This type of approach can accelerate backup and recovery and reduce the amount of hardware needed for storage.

It also can lead to significant savings because DISA keeps backups for several weeks, meaning it often owns multiple copies of the same data. Thanks to deduplication, the agency can store more than 140 petabytes of backup data on just 14PB of hardware, a 10:1 reduction.

“It was a huge amount of floor space that we opened up by removing thousands of tapes,” says Jonathan Kuharske, DISA’s deputy of computing ecosystem.

Categorize Data to Go Cloud First

To comply with the government’s “Cloud First” edict, USAID began migrating to cloud services, including infrastructure and software services, about seven years ago.

Previously, USAID managed its own data centers and tiered its storage. But the agency moved its data to cloud storage three years ago, says Lon Gowen, USAID’s chief strategist and special advisor to the CIO, allowing USAID to provide reliable, cost-effective IT services to its 12,000 employees across the world. The agency, which declined to offer specific return-on-investment data, currently uses a dozen cloud providers.

“We carefully categorize our data and find service providers that can meet those categories,” says Gowen, noting categories include availability and security. “They just take care of things at an affordable cost.”

For its public-facing websites, the agency uses a cloud provider that has a content distribution network and can scale to handle sudden spikes in traffic.

In late 2013, a typhoon lashed the Philippines, killing at least 10,000 people. In the days following the disaster, President Obama announced that USAID was sending supplies, including food and emergency shelter. Because the president mentioned USAID, about 40 million people visited the agency’s website. If USAID had hosted its own site, it would have crashed. But the cloud service provider handled the traffic, Gowen says.

“Our service provider can scale instantaneously to 40 million users, and when visitors drop off, we scale back,” he says. “It’s all handled.”

 

Such transitions are becoming commonplace. Improving storage management is a pillar of the government’s effort to optimize data centers. To meet requirements from the Federal Information Technology Acquisition Reform Act (FITARA), the Data Center Optimization Initiative requires agencies to transition to cost-effective infrastructure.

While agencies are following different paths, the result is nearly identical: simpler and more efficient storage management, consolidation, increased reliability, improved service and cost savings. The U.S. Agency for International Development, for example, has committed to cloud storage.

“Our customers have different needs. The cloud allows us to focus on categorizing our data based on those needs, like fast response times, reliability, availability and security,” says Gowen. “We find the service providers that meet those category requirements, and then we let the service providers focus on the details of the technology.”

To read the complete article, click on the link below:

 

Read more


Cloud Economics Drive the IT Infrastructure of Tomorrow

| CloudPost

The cloud continues to dominate IT as businesses make their infrastructure decisions based on cost and agility. Public cloud, where shared infrastructure is paid for and utilized only when needed, is the most popular model today. However, more and more organizations are addressing security concerns by creating their own private clouds. As businesses deploy private cloud infrastructure, they are adopting techniques used in the public cloud to control costs. Gone are the traditional arrays and network switches of the past, replaced with software-defined data centers running on industry standard servers.

Features that improve efficiency make the cloud model more effective by reducing costs and increasing data transfer speeds. One feature that is particularly effective in cloud environments is inline data reduction, a technology that lowers the costs of data in transit and at rest. In fact, data reduction delivers unique benefits to each model of cloud deployment.
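
As a minimal sketch of why reducing data in transit lowers costs, the Python snippet below compresses a redundant payload before it crosses the wire and verifies a lossless round trip. zlib stands in here for whatever inline reduction layer a real cloud deployment would use, and the payload is invented for illustration.

```python
# Sketch: compressing a payload before transfer reduces bytes in transit,
# and decompressing on arrival restores it exactly. zlib is a stand-in;
# a production deployment would use an inline data reduction layer rather
# than application-level calls.
import zlib

payload = b'{"sensor": 17, "reading": 20.5}\n' * 10_000  # redundant JSON lines
compressed = zlib.compress(payload, level=6)

print(f"in transit: {len(compressed)} bytes instead of {len(payload)} bytes "
      f"({len(payload) / len(compressed):.0f}:1)")
assert zlib.decompress(compressed) == payload             # lossless round trip
```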

For the entire article, please click on the link below:

Read more


Permabit exhibits VDO for Data Centers at Red Hat Summit 2017

| tmcnet.com

Permabit Technology Corporation will show its Permabit Virtual Data Optimizer (VDO) with patented deduplication, HIOPS Compression®, and thin provisioning at Red Hat Summit 2017, held May 2-4 at the Boston Convention and Exhibition Center. Permabit will exhibit in the Storage Partner Showcase (Booth #206).

VDO is the only production-ready, modular data reduction solution for the Linux block storage stack. It is fully supported by Permabit on Red Hat Enterprise Linux and is compatible with Red Hat Ceph Storage and Red Hat Gluster Storage. As a ready-to-run kernel module for Linux, VDO works transparently with Linux block devices and file systems across all types of storage, as well as a broad range of open-source and commercial software solutions. This unique block-level approach enables Permabit customers to leverage existing file systems, volume management, and data protection, as well as deliver 4 KB inline, high-performance data reduction to Linux storage environments both on-premises and in the cloud.

VDO is used by the world’s largest financial and communications companies and large government agencies. VDO data reduction delivers dramatically better data density for public/private/hybrid cloud storage, reducing storage capacity consumption and service demands while delivering IT infrastructure operating costs that are up to six times lower.

Red Hat Summit is the premier open-source technology event to showcase the latest and greatest in cloud computing, platform, virtualization, middleware, storage, and systems management technologies. Thousands of attendees gain insight into topics such as big data, mobile, storage, Internet of Things, security, and DevOps in training sessions, technology sessions, hands-on labs, presentations, and professional networking opportunities.

For additional information on Red Hat Summit visit https://www.redhat.com/en/summit/2017

Read more

Permabit VDO on a Linux Laptop – Great Performance and 5:1 Space Savings

| By: (56)

I get asked about VDO performance all the time and I’ve written several posts about big systems where we’ve seen spectacular performance numbers, including 8 GB/s throughput and 650,000 mixed random 4 KB IOPS. But what about performance on smaller systems for developers? How about a laptop? A couple of weeks ago I installed VDO version 6 on my Lenovo X230 laptop running Red Hat Enterprise Linux 7.3 and here’s a…

Read more


Is the storage array on the endangered species list?

| By: (61)

A high-stakes game is playing out today as Amazon, Google, and Microsoft compete for leadership in cloud services markets that are projected to total in the hundreds of billions of dollars by 2020. In the last quarter alone, they spent a combined $9B to build out data centers to support the exploding cloud market (WSJ, 4/7/17). There is little question whether this triumvirate will be successful in their cloud efforts…

Read more


Embrace Efficiency

| BrightTALK

Organizations struggling to keep up with demand for private cloud storage are suddenly confronting the need for major data center expansion, which can cost up to $3,000 per square foot.

Combining Permabit VDO with open source private cloud storage maximizes data center density, reducing the need for expansion while also lowering the operational costs of power and cooling. Permabit Labs testing of VDO with unstructured data repositories on Red Hat Storage saw a data reduction rate of 2:1, and testing in Virtual Disk Image environments saw VDO compression and deduplication deliver data reduction rates of up to 6:1 with Red Hat Storage.

To view the video, click the link below:

Read more