With all the buzz around data protection and the cloud, something seems to have been lost in space. We talk about instant, incremental and online backups, as well as data deduplication, but the real focus needs to be recovery. Right?
All of the fancy bells and whistles of today’s products mean nothing if you cannot get the data back. On top of that, the data must come back usable. You may back up 5 terabytes in 30 seconds, which is impressive, but if you cannot recover that data and use it, the point is moot. Why? Because the data does you no good if it cannot be used after a failure.
That leaves us back at the beginning. While backup is important, recovery is key. But (there is always a but, right?) the kicker is that data has to be recovered not only from backups taken last night or a week ago, but also from backups taken last year or 15 years ago.
And that is the rub. How do you make sure your data is recoverable? Do you test it often? Do you test every backup with a mock restore? Do you monitor every bit, block and byte that is protected? Doubtful. (If you do, you have way too much time on your hands.) That raises the question: “How do I make sure I can recover my data?”
There are three main ways to do this. The first is simple but time consuming: spot-check your data every week or so to make sure it can be recovered. For example, restore some of your production servers into a test/development environment, where the data can then be exercised to confirm it is viable. This eats up the clock and takes extra man-hours, so very few take the time. Disaster recovery testing is another example of this approach.
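A spot check like the one described above can be scripted. The sketch below is a minimal, hypothetical stand-in (not any vendor’s tooling): it restores a random sample of files from a tar-format backup archive into a scratch directory and confirms the restored data is actually readable, which is the cheap version of a mock restore into a test environment. The function name and sampling approach are illustrative assumptions.

```python
import random
import tarfile
import tempfile
from pathlib import Path


def spot_check_restore(archive_path: str, sample_size: int = 5) -> bool:
    """Restore a random sample of files from a backup archive into a
    scratch directory and verify they can be read back intact.

    This is a hypothetical sketch of a weekly spot check, not a full
    disaster-recovery test: it proves the media is restorable, not that
    an application can run on the restored data.
    """
    with tempfile.TemporaryDirectory() as scratch, tarfile.open(archive_path) as tar:
        members = [m for m in tar.getmembers() if m.isfile()]
        sample = random.sample(members, min(sample_size, len(members)))
        # Restore only the sampled files into the scratch directory.
        tar.extractall(scratch, members=sample)
        for m in sample:
            restored = Path(scratch) / m.name
            # A restored file must exist, match its recorded size,
            # and be readable end to end.
            if not restored.exists() or restored.stat().st_size != m.size:
                return False
            restored.read_bytes()
    return True
```

In practice the "verify" step would be application-aware (mount the database, run a sanity query), but even this shallow check catches media that silently rotted since the backup was taken.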
The second is to audit your backups. The best data protection applications let you store multiple copies of the data on multiple types of media, which can be audited to make sure the data is still viable. This is crucial for long-term data storage: once data has been written and kept for more than a year, there is a risk that the original source of the data no longer exists. That is where an audit comes in handy. It lets you log into your data protection application and scan the backup media to verify that all of the checksums and metadata match the original backup/archive. If they do not, the second copy of the data is used to rebuild that backup media.
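The audit-and-rebuild loop described above can be sketched in a few lines. This is a simplified illustration, not any particular product’s implementation: it assumes a JSON manifest of checksums recorded at backup time (a made-up format), compares each file on the primary media against it, and repairs mismatches from the second copy only after that copy itself verifies.

```python
import hashlib
import json
import shutil
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute a file's SHA-256 checksum, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def audit_backup(media_dir: Path, manifest_path: Path,
                 second_copy_dir: Path) -> list[str]:
    """Audit backup media against checksums recorded at backup time.

    Hypothetical sketch: the manifest is assumed to be a JSON object
    mapping filename -> SHA-256 hex digest. Files that are missing or
    fail verification are rebuilt from the second copy, and only if the
    second copy itself matches the recorded checksum.
    """
    manifest = json.loads(manifest_path.read_text())
    repaired = []
    for name, expected in manifest.items():
        target = media_dir / name
        if target.exists() and sha256_of(target) == expected:
            continue  # primary copy passes the audit
        source = second_copy_dir / name
        if source.exists() and sha256_of(source) == expected:
            shutil.copy2(source, target)  # rebuild from the good copy
            repaired.append(name)
        else:
            # Both copies failed: this backup is unrecoverable as-is.
            raise RuntimeError(f"both copies of {name} failed the audit")
    return repaired
```

Real products audit at the block or object level and track far richer metadata, but the principle is the same: verify against what was recorded at backup time, and repair from redundancy before the last good copy is lost.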
The third is simple, but you never see it. As a matter of fact, a company would have to believe in its product so much that it is willing to place the entire business on the line for it. What am I talking about? A vendor that guarantees it can recover your data. Those rarely exist in the marketplace today, so this simple idea is quite complex. A company that does this would have to have a product so robust and well built that recovery of data is fast, simple and guaranteed. What company in its right mind would do this?
STORServer is now happy to provide that peace of mind. The company is the only vendor in the market that sells a product with a data recovery guarantee.
Can your vendor guarantee your recovery?