Comprehensive Capacity Optimization - Deduplication 2.0

Technology is great, isn't it? When someone thinks they have a new idea built on the same old technology foundation, they call it "X 2.0". I have been watching the banter between analysts and vendors (specifically NTAP’s Dr. Dedupe and Permabit’s CEO Tom Cook) on the topic of Deduplication 2.0, and it is my belief that the proverbial boat is being missed (since we are using water analogies). I have been watching these guys hash it… Read more »

The Side Effects of Backup on Server Virtualization

Server virtualization has changed the IT landscape dramatically. It has become a magic potion curing a number of ills in the physical server world, such as low individual CPU utilization and excess use of space, power, and cooling in the data center. However, like all potions that cure what ails you, this one can have side effects. You need to be careful what the Witch Doctor orders.

When I speak with customers who have aggressively… Read more »

A Data Protection Reference Architecture - The Final Chapter

The Architecture

This ‘architecture’ diagram, as you can see, is not a typical architecture diagram, but hopefully it can be used to align your business and its objectives with the technologies that are available and can best be applied to solve your issues, helping to balance cost, complexity, and compliance.

This diagram can also be used to do a couple of other things. It can help you begin to classify your data and align your… Read more »

A Data Protection Reference Architecture - Part 3

The 'Fat Middle'

In the 'fat middle' of the triangle, as I stated last week, there are a number of ways to protect information. I have chosen to break the middle into two categories. The reality is that this is meant to be a tool for helping you lay out a strategy, so your boxes could be based on capacity and could end up in different areas of the triangle depending upon your… Read more »

A Data Protection Reference Architecture – Part 2

Archive

The most fundamental part of developing a good data protection architecture starts at the base of the triangle with Archive. Archive is often an overlooked component of data protection - it’s not just for regulated businesses anymore. Archive essentially gives users 100% data deduplication efficiency. What I mean by this is that you have the ability to remove ‘stale’ data (and by 'stale' I don't mean unimportant data, I just mean data that is… Read more »