
SANblox: Unchain Your Storage From the Vendor Roadmap


If you’ve been following Permabit for long, you’ll notice that SANblox is a completely new way of making our Albireo data efficiency technologies available in the market.  We’ve traditionally delivered our deduplication and compression as libraries and storage modules that are integrated by our partners – some of the largest storage vendors in the world – to deliver high-performance optimization in their existing and future platforms.  Permabit is still…


SANblox Brings New Life to Old SANs


Over the last week I’ve been thrilled to see the enthusiastic response to the launch of our SANblox data efficiency appliance, the first complete solution for deduplication and compression in enterprise SAN.  SANblox represents the culmination of more than a decade of Permabit research into deduplication, and brings the power of our integrated Albireo technologies into a ready-to-run appliance that can be deployed in a few hours. We’ve written quite…


Getting to Primary Storage – Performance: The Key to Data Efficiency


This is the fourth in a four-part series on Performance: The Key to Data Efficiency.

Inline Efficiency

Inline deduplication and compression – eliminating duplicates as they are written, rather than with a separate process that examines data hours (or days) later – is an absolute requirement for performance in the primary storage market, just as we’ve previously seen in the backup market.  By operating in an inline manner, efficiency…
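The inline write path described above can be sketched in a few lines of Python. This is a minimal toy illustration of the general technique, not Permabit’s Albireo implementation; the 4 KB block size and SHA-256 fingerprint are assumptions chosen for the example:

```python
import hashlib

class InlineDedupStore:
    """Toy inline deduplication: each 4 KB block is fingerprinted as it
    is written, so duplicates never reach the backing store at all."""
    BLOCK_SIZE = 4096

    def __init__(self):
        self.blocks = {}   # fingerprint -> block data, stored exactly once
        self.volume = []   # logical volume: ordered list of fingerprints

    def write(self, data: bytes) -> None:
        for i in range(0, len(data), self.BLOCK_SIZE):
            block = data[i:i + self.BLOCK_SIZE]
            fp = hashlib.sha256(block).hexdigest()
            # Inline: the duplicate check happens on the write path,
            # not in a later pass over data already on disk.
            if fp not in self.blocks:
                self.blocks[fp] = block
            self.volume.append(fp)

    def physical_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())

store = InlineDedupStore()
store.write(b"A" * 4096 * 3 + b"B" * 4096)  # three identical blocks, one unique
print(store.physical_bytes())               # 8192: only two unique blocks stored
```

Note that the duplicate is discarded before any I/O is issued for it, which is why inline operation avoids the read-back and rewrite traffic of post-process schemes.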


Hits and Misses; What We Learned – Performance: The Key to Data Efficiency


This is the third in a four-part series on Performance: The Key to Data Efficiency.

Hits and Misses (and Mostly Misses)

If we take a look at primary storage systems shipping with some form of data efficiency today, we see that the offerings are largely lackluster.  The reason offerings with efficiency features haven’t taken the market by storm is that they deliver the same thing as less successful…


Evolution and Lessons Learned – Performance: The Key to Data Efficiency


This is the second in a four-part series on Performance: The Key to Data Efficiency.

The Evolution of Data Efficiency

In less than ten years, data deduplication and compression shifted billions of dollars of customer investment from tape-based backup solutions to purpose-built disk-based backup appliances. The simple but incomplete reason for this is that these technologies made disk cheaper to use for backup.  While this particular aspect enabled the…


What are Data Efficiency Technologies? – Performance: The Key to Data Efficiency


This is the first in a four-part series on Performance: The Key to Data Efficiency.

Data efficiency – the combination of technologies including data deduplication, compression, zero elimination and thin provisioning – transformed the backup storage appliance market in well under a decade.  Why has it taken so long for the same changes to occur in the primary storage appliance market?  The answer can be found by looking back…


How HIOPS Works: Compression Full Speed Ahead


We’re very proud this week to introduce HIOPS compression for Albireo VDO, a high-performance optimization technology that represents the culmination of more than a year of work by a large part of our development team.  After delivering the highest performance deduplication in the market, we set out to do the same for compression so that we can save our customers money across an even broader variety of use cases.  It…


Preventing Performance Bottlenecks with Inline Deduplication


Implementing high performance in enterprise storage is a constant battle to find and eliminate the next system bottleneck.  Normally the bottleneck alternates between the limits of the underlying media and the computational overhead of metadata management, but choosing the wrong approach to deduplication can introduce a third performance challenge, one that can be impossible to overcome.  Storage that implements a multi-pass approach to data optimization, such as staged or post-process deduplication, becomes inherently…


CATEGORIES: Jered Floyd, CTO

The Performance Challenges of Data Optimization – Data Deduplication


Data deduplication is like the big brother of lossless compression; it also depends on identifying redundant stretches of data, but does so across a much larger pool. While compression identifies duplicates of a few bytes within a file, deduplication locates much larger duplicate chunks, perhaps 4 KB or more, across the entire pool of storage, thus working across files, or even file systems and LUNs. This provides the…
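That pool-wide scope is what drives deduplication’s savings, and it is easy to see with a toy sketch. The function below uses fixed-size 4 KB chunks and a SHA-256 fingerprint set purely for illustration; `dedup_savings` and the sample file names are hypothetical, not part of any real product:

```python
import hashlib

def dedup_savings(files: dict, chunk_size: int = 4096) -> float:
    """Fraction of space saved by fixed-size deduplication across a
    whole pool of files (toy model, not a real dedup engine)."""
    seen = set()
    logical = physical = 0
    for data in files.values():
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            logical += len(chunk)
            fp = hashlib.sha256(chunk).digest()
            if fp not in seen:       # first sighting anywhere in the pool
                seen.add(fp)
                physical += len(chunk)
    return 1 - physical / logical

pool = {
    "vm1.img": b"OS" * 2048 * 4,  # 16 KB: four identical 4 KB chunks
    "vm2.img": b"OS" * 2048 * 2,  # 8 KB sharing those same chunks
}
print(f"{dedup_savings(pool):.0%}")  # 83%: one unique chunk backs six
```

Because the fingerprint set spans every file, the second "virtual machine image" contributes no new physical data at all, which a per-file compressor could never achieve.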


CATEGORIES: Jered Floyd, CTO

The Performance Challenges of Data Optimization – Data Compression


Unlike thin provisioning, compression allows for space savings on storage actually in use for data, and different variants can operate at both block and file storage levels.  Compression technologies fall broadly into two categories: lossless and lossy.  Lossless compression works by identifying redundancies within a short stretch of data, such as a file or block, and eliminating those duplicate parts.  As the name implies, all of the original data is…
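The lossless round trip is easy to demonstrate with Python’s standard zlib module (a generic DEFLATE implementation, used here purely for illustration and unrelated to any particular storage product):

```python
import zlib

# Highly redundant input: short repeated phrases, the kind of
# intra-block redundancy lossless compression exploits.
original = b"the quick brown fox jumps over the lazy dog " * 100

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(compressed) < len(original))  # True: redundancy shrinks the payload
assert restored == original             # lossless: every original byte returns
```

A lossy codec (for audio, images, or video) would trade away that final assertion for a smaller payload, which is why only lossless techniques are acceptable for general-purpose block and file storage.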


CATEGORIES: Jered Floyd, CTO