Federal Agencies Optimize Data Centers by Focusing on Storage and Data Reduction
In data centers, like any piece of real estate, every square foot matters.
“Any way we can consolidate, save space and save electricity, it’s a plus,” says the State Department’s Mark Benjapathmongkol, a division chief of the agency’s Enterprise Server Operation Centers.
In searching out those advantages, the State Department has begun investing in solid-state drives (SSDs), which provide improved performance while occupying substantially less space in data centers.
In one case, IT leaders replaced a disk storage system with SSDs and gained almost three racks’ worth of space, Benjapathmongkol says. Because SSDs are smaller and denser than hard disk drives (HDDs), IT staff don’t need to deploy extra hardware to meet speed requirements, resulting in substantial space and energy savings.
Options for Simplifying Storage Management
Agencies can choose from multiple technology options to manage their storage more effectively and efficiently, says Greg Schulz, founder of independent analyst firm Server StorageIO. These options include SSDs and cloud storage; storage features such as deduplication and compression, which eliminate redundancies and store data in less space; and thin provisioning, which makes better use of available capacity, Schulz says.
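To make the thin-provisioning idea concrete, here is a minimal, hypothetical sketch (the `ThinVolume` class and the 4KB block size are illustrative assumptions, not any vendor’s product): a thin volume advertises a large logical size to the host but allocates physical blocks only when data is actually written.

```python
# Hedged sketch of thin provisioning (illustrative only): physical capacity
# is consumed lazily, on first write, rather than reserved up front.

BLOCK_SIZE = 4096  # bytes per block; a common filesystem block size (assumption)

class ThinVolume:
    def __init__(self, logical_size):
        self.logical_size = logical_size  # the size promised to the host
        self.blocks = {}                  # physical blocks, allocated on first write

    def write(self, offset, data):
        """Allocate backing blocks only for the byte range actually written."""
        first = offset // BLOCK_SIZE
        last = (offset + len(data) + BLOCK_SIZE - 1) // BLOCK_SIZE  # ceiling
        for i in range(first, last):
            self.blocks.setdefault(i, bytearray(BLOCK_SIZE))
        # (copying data into the blocks is omitted for brevity)

    def physical_usage(self):
        return len(self.blocks) * BLOCK_SIZE

# A 1 GiB volume that has only seen 8 KiB of writes consumes just two blocks.
vol = ThinVolume(logical_size=1 << 30)
vol.write(0, b"x" * 8192)
print(vol.logical_size, vol.physical_usage())  # 1073741824 8192
```

The design choice mirrored here is the one Schulz describes: capacity is a promise until written, so many volumes can share a smaller physical pool.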
Consider the Defense Information Systems Agency. During the past year, the combat support agency has modernized its storage environment by investing in SSDs. Across DISA’s nine data centers, about 80 percent of information is stored on SSD arrays and 20 percent is running on HDDs, says Ryan Ashley, DISA’s chief of storage.
SSDs have allowed the agency to replace every four 42U racks with a single 42U rack, resulting in 75 percent savings in floor space as well as reduced power and cooling costs, he says.
Deduplication Creates Efficiencies
Beyond saving space, SSDs are faster than HDDs and bring additional storage efficiencies, including new management software that automates tasks such as provisioning storage when new servers and applications are installed, Ashley says.

The management software also allows DISA to centrally manage storage across every data center. In the past, the agency used four to eight instances of management software in individual data centers.
“It streamlines and simplifies management,” Ashley says. Automatic provisioning reduces human error and ensures the agency follows best practices, while central management eliminates the need for the storage team to switch from tool to tool, he says.
DISA also has deployed deduplication techniques to eliminate storing redundant copies of data. IT leaders recently upgraded the agency’s backup technology from a tape system to a disk-based virtual tape library. This type of approach can accelerate backup and recovery and reduce the amount of hardware needed for storage.
It also can lead to significant savings because DISA keeps backups for several weeks, meaning it often holds multiple copies of the same data. Thanks to deduplication, the agency can store more than 140 petabytes of backup data on just 14PB of hardware, a 10-to-1 reduction.
“It was a huge amount of floor space that we opened up by removing thousands of tapes,” says Jonathan Kuharske, DISA’s deputy of computing ecosystem.
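DISA’s 140PB-on-14PB figure works out to roughly a 10-to-1 deduplication ratio. As a hedged illustration of how deduplication achieves that (the `DedupStore` class is a toy assumption, not DISA’s actual backup system), a content-addressed store keeps each unique chunk once, keyed by its hash, and lets repeated backups merely reference it:

```python
# Hedged sketch of content-addressed deduplication (illustrative only):
# identical chunks are stored once, keyed by their SHA-256 hash, so
# repeated backups of unchanged data consume almost no extra capacity.
import hashlib

class DedupStore:
    def __init__(self):
        self.chunks = {}   # hash -> chunk bytes; each unique chunk stored once
        self.logical = 0   # bytes that backup jobs believe they wrote

    def backup(self, data, chunk_size=4096):
        """Split data into chunks, store only unseen content, return references."""
        refs = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            key = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(key, chunk)   # skip chunks we already hold
            refs.append(key)
        self.logical += len(data)
        return refs   # enough to reconstruct this backup later

    def physical(self):
        return sum(len(c) for c in self.chunks.values())

# Ten weekly backups of the same ~1MB dataset: ten times the logical data,
# but only one physical copy, in the spirit of DISA's roughly 10-to-1 ratio.
store = DedupStore()
data = b"".join(i.to_bytes(4, "big") * 1024 for i in range(245))  # 245 distinct 4KB chunks
for _ in range(10):
    store.backup(data)
print(store.logical // store.physical())  # 10
```

Real backup appliances add compression, variable-size chunking and garbage collection on top of this basic idea, but the logical-versus-physical accounting is the same.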
Such transitions are becoming commonplace. Improving storage management is a pillar of the government’s effort to optimize data centers. To meet requirements from the Federal Information Technology Acquisition Reform Act (FITARA), the Data Center Optimization Initiative requires agencies to transition to cost-effective infrastructure.

While agencies are following different paths, the result is nearly identical: simpler and more efficient storage management, consolidation, increased reliability, improved service and cost savings. The U.S. Agency for International Development, for example, has committed to cloud storage.

“Our customers have different needs. The cloud allows us to focus on categorizing our data based on those needs like fast response times, reliability, availability and security,” says Lon Gowen, USAID’s chief strategist and special advisor to the CIO. “We find the service providers that meet those category requirements, and then we let the service providers focus on the details of the technology.”

Categorize Data to Go Cloud First

To comply with the government’s “Cloud First” edict, USAID began migrating to cloud services, including infrastructure and software services, about seven years ago.

Previously, USAID managed its own data centers and tiered its storage. But the agency moved its data to cloud storage three years ago, Gowen says, allowing USAID to provide reliable, cost-effective IT services to its 12,000 employees across the world. The agency, which declined to offer specific return on investment data, currently uses a dozen cloud providers.

“We carefully categorize our data and find service providers that can meet those categories,” says Gowen, noting categories include availability and security. “They just take care of things at an affordable cost.”

For its public-facing websites, the agency uses a cloud provider that has a content distribution network and can scale to handle sudden spikes in traffic.

In late 2013, a typhoon lashed the Philippines, killing at least 10,000 people. In the days following the disaster, President Obama announced that USAID had sent supplies including food and emergency shelter. Because the president mentioned USAID, about 40 million people visited the agency’s website. If USAID had hosted its own site, it would have crashed. But the cloud service provider handled the traffic, Gowen says.

“Our service provider can scale instantaneously to 40 million users, and when visitors drop off, we scale back,” he says. “It’s all handled.”