Here is a shocker for you: backup IS a science. Good backup administrators and architects are worth their weight in gold. CIOs just wish backup would go away. It costs money, it isn't strategic, it chews up manpower, and while it is running (successfully or not) no one really pays attention to it. But when it fails, or more likely when you need to restore data and can't, someone can lose their job. So backup is VERY important, it is a science, and architecting a backup environment correctly takes time, skill, money, and someone who knows what they are doing.

Good backup administrators architect for recovery, not for backup. Prove it, you say? Okay, question: why do backup administrators do full backups of Exchange every night? Answer: because it is far easier and much faster to perform a one-step full recovery of Exchange than it is to lay down the weekly full and apply the incrementals. Since mail is considered a critical application in the enterprise these days, and downtime for it is painful, good backup administrators architect for the least amount of downtime for the application. The same applies to databases. Ninety-five percent of all databases are snapped for quick recovery, and I would also bet that a full backup is performed on them (or on the snap) every evening.
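To put rough numbers on that trade-off, here is a back-of-the-envelope sketch comparing the two restore paths. The restore-time figures are hypothetical placeholders, not measurements from any particular product.

```python
# Hypothetical comparison of restore effort for the two strategies described
# above. All figures are placeholder assumptions, not benchmarks.

FULL_RESTORE_HOURS = 4.0          # assumed time to lay down one full backup
INCREMENTAL_RESTORE_HOURS = 0.75  # assumed time to apply one incremental

def restore_time_nightly_full() -> float:
    """Nightly fulls: recovery is a single-step restore of last night's full."""
    return FULL_RESTORE_HOURS

def restore_time_weekly_full(days_since_full: int) -> float:
    """Weekly full + daily incrementals: restore the full, then replay every
    incremental taken since that full."""
    return FULL_RESTORE_HOURS + days_since_full * INCREMENTAL_RESTORE_HOURS

if __name__ == "__main__":
    for day in range(7):
        print(f"failure {day} day(s) after the weekly full: "
              f"nightly-full RTO ~{restore_time_nightly_full():.2f} h, "
              f"weekly-full+incrementals RTO ~{restore_time_weekly_full(day):.2f} h")
```

The further you are from the last weekly full, the wider the gap gets, which is exactly why recovery-driven shops pay for the nightly full.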

Recovery is a primary driver of any good backup architecture, but lately I have been hearing a great deal of talk about 'backup consolidation'. The reality is that there is no one-size-fits-all when it comes to backup software or hardware. Consolidating backup software may make your environment easier to manage, but does it provide the tools and technology you need to meet the data protection objectives of your environment? Consolidating backup targets (tape or disk) may yield fewer devices to manage, but what happens to your overall backup and recovery performance when you do? While new technologies may help fine-tune the science side of backup, they still need an artist's touch.

An area where consolidation comes up quite frequently in the backup arena is the new data deduplication solutions. While these technologies add tremendous value, they are no excuse to forget good backup architecture practices. For example, if deduplication is the removal of duplicate data, how much duplicate data really exists between your production databases and your file systems? Mixing the storage repository for file-system and database data just doesn't buy you much in the deduplicated back end, so why mix them? It would make sense, however, to have a device or appliance for each database (or set of databases) that shares common data, and another for file systems that share common data. Doing so would yield better backup and recovery performance and would probably mirror the same set of rules you would have used in your 'old' backup environment. (Notice, I said 'rules', not devices or technologies.) As long as the cost of running multiple devices (including management costs) isn't exponentially higher, recovery can be much easier and faster.
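To make the point concrete, here is a toy sketch that chunks and hashes two synthetic, dissimilar data sets, one standing in for database backups and one for file-system backups, and compares how many unique chunks separate pools versus one combined pool would have to store. The data, chunk size, and duplication rates are made-up assumptions for illustration only.

```python
# Toy illustration of why mixing dissimilar data types in one dedup pool buys
# little: the two corpora share almost no chunks, so the combined pool is
# roughly the sum of the separate pools. Purely synthetic data.
import hashlib
import os
import random

CHUNK = 8 * 1024  # fixed 8 KiB chunks for simplicity

def chunk_hashes(blob: bytes):
    for i in range(0, len(blob), CHUNK):
        yield hashlib.sha256(blob[i:i + CHUNK]).hexdigest()

def make_corpus(seed: int, pages: int, dup_rate: float) -> bytes:
    """Build a blob with internal duplication (roughly dup_rate of pages repeat)."""
    rng = random.Random(seed)
    unique = [os.urandom(CHUNK) for _ in range(max(1, int(pages * (1 - dup_rate))))]
    return b"".join(rng.choice(unique) for _ in range(pages))

db_data = make_corpus(seed=1, pages=400, dup_rate=0.30)   # stand-in for database backups
fs_data = make_corpus(seed=2, pages=400, dup_rate=0.60)   # stand-in for file-system backups

db_unique = set(chunk_hashes(db_data))
fs_unique = set(chunk_hashes(fs_data))
combined_unique = db_unique | fs_unique

print(f"db pool stores {len(db_unique)} unique chunks")
print(f"fs pool stores {len(fs_unique)} unique chunks")
print(f"combined pool stores {len(combined_unique)} unique chunks "
      f"(cross-corpus savings: {len(db_unique) + len(fs_unique) - len(combined_unique)} chunks)")
```

Because the two data sets have essentially no chunks in common, the cross-corpus savings come out to about zero, which is the whole argument for keeping like data with like.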

Another interesting side note: since most IT shops do FULL backups of their databases every night for the sake of faster recovery, why wouldn't you want a dedicated backup storage device that delivers a 'full' backup every night while only needing to move the changed data? That is the very nature of the Avamar technology, and it is what this 'next generation' backup technology is designed to accomplish, versus what traditional backup technologies try to do with cumbersome processes of full and incremental backups. Why not, for example, set up a dedicated Avamar Data Store for database backups with the proper number of nodes for performance, and leave it at that?
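For readers who want to see the general idea in miniature, here is a minimal sketch of "a full every night that only moves changed data": each run records a complete chunk manifest (a logical full), but only chunks the store has never seen are transferred. This illustrates source-side deduplication in general; it is not Avamar's actual protocol or data structures.

```python
# Minimal sketch of "logical full, physical incremental" backups: every run
# stores a complete manifest, but only previously unseen chunks move.
import hashlib
import os

CHUNK = 4 * 1024

class DedupStore:
    def __init__(self):
        self.chunks = {}      # hash -> chunk bytes already on the appliance
        self.manifests = []   # one complete chunk-hash list per nightly "full"

    def backup(self, data: bytes) -> int:
        """Record a logical full backup; return bytes actually transferred."""
        manifest, sent = [], 0
        for i in range(0, len(data), CHUNK):
            piece = data[i:i + CHUNK]
            digest = hashlib.sha256(piece).hexdigest()
            if digest not in self.chunks:   # only new chunks cross the wire
                self.chunks[digest] = piece
                sent += len(piece)
            manifest.append(digest)
        self.manifests.append(manifest)
        return sent

    def restore(self, night: int) -> bytes:
        """One-step restore: reassemble any night's full from its manifest."""
        return b"".join(self.chunks[h] for h in self.manifests[night])

store = DedupStore()
monday = os.urandom(CHUNK * 100)                         # 100 distinct chunks
tuesday = monday[:CHUNK * 90] + os.urandom(CHUNK * 10)   # last 10 chunks change

print("Monday sent: ", store.backup(monday), "bytes")    # whole data set moves once
print("Tuesday sent:", store.backup(tuesday), "bytes")   # only the 10 changed chunks move
assert store.restore(1) == tuesday                       # yet restore is still a one-step full
```

Every night looks like a full at restore time, while the nightly transfer stays proportional to the change rate.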

Best Practices / Professional Services Have the Last Word

Instead of making a bunch of statements that certain technologies 'can't' solve a problem, why wouldn't the naysayers take a page out of a professional services handbook? If the solution is architected properly, can be delivered at the right cost, and meets your business objectives, then there is no reason not to make a technology work to its maximum potential and solve difficult problems. That is the real science.

Ten years ago, backup administrators would say, "Okay, if you can't get the backup/restore performance you need for that data set, then we will add another media server, get some more licenses, and back up that data separately, so that when you need to perform a restore you have a dedicated media server for faster recovery." Should this be any different today?

Backup is about recovery, and more importantly recovery performance (RTO), but it is also about architecture, and a good backup architecture will put you on the Road to Recovery.

Tags:

Avamar, Backup, Data Deduplication, Recovery