Cloud snapshots are permanent


Compared with local snapshots, which are used for temporary purposes like facilitating the backup process, cloud snapshots maintain a permanent copy of a data set; in that respect, they are the backup. A new generation of storage services is being delivered through appliances that maintain a local cache of a full data set stored in the cloud.
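To make the cache-plus-cloud idea concrete, here is a minimal sketch of how such an appliance might serve reads: hot objects come from a small local cache and everything else is fetched from the full copy in the cloud. The class and object names are hypothetical and only illustrate the pattern, not any particular vendor's gateway.

```python
# Minimal sketch (hypothetical) of the cache-plus-cloud pattern: the appliance
# serves hot data from a small local cache and falls back to the cloud copy,
# which holds the full data set, on a miss.

from collections import OrderedDict

class CachingGateway:
    def __init__(self, cloud_store, cache_capacity=1000):
        self.cloud = cloud_store                 # dict-like: holds every object
        self.cache = OrderedDict()               # small, LRU-evicted local copy
        self.capacity = cache_capacity

    def read(self, object_id):
        if object_id in self.cache:              # cache hit: served locally
            self.cache.move_to_end(object_id)
            return self.cache[object_id]
        data = self.cloud[object_id]             # cache miss: fetch from the cloud
        self.cache[object_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)       # evict the least recently used item
        return data


# Usage: the full data set lives in the cloud; the gateway caches what's hot.
gateway = CachingGateway({"file-1": b"hello", "file-2": b"world"}, cache_capacity=1)
gateway.read("file-1")   # fetched from the cloud, now cached
gateway.read("file-1")   # served from the local cache
```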


Snapshots are responsible for moving data efficiently between a customer's data center and the cloud. Each snapshot represents an incremental change that may be needed to recreate some part of the original data set. In these change-based architectures, the past is used to recreate the present. Older data provides the foundation for current data, and, much like removing a block from the bottom of a building, deleting an older snapshot without doing the appropriate bookkeeping can compromise the 'structural integrity' of the entire data set.
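The dependency between older and newer snapshots is easier to see in code. The sketch below is a hypothetical illustration, not any vendor's implementation: each snapshot stores only the blocks that changed since its parent, reads walk back through older snapshots, and safe deletion first folds an old snapshot's blocks into its child before splicing it out of the chain.

```python
# Minimal sketch of an incremental snapshot chain: each snapshot stores only
# the blocks that changed since its parent, so restoring "the present" depends
# on older snapshots further back in the chain.

class Snapshot:
    def __init__(self, snap_id, parent=None, changed_blocks=None):
        self.snap_id = snap_id
        self.parent = parent                      # older snapshot this one builds on
        self.blocks = dict(changed_blocks or {})  # block_number -> data

    def read_block(self, block_number):
        """Walk back through older snapshots until the block is found."""
        snap = self
        while snap is not None:
            if block_number in snap.blocks:
                return snap.blocks[block_number]
            snap = snap.parent
        raise KeyError(f"block {block_number} not found in chain")


def delete_with_bookkeeping(snapshot, child):
    """Safely remove an older snapshot by folding its blocks into its child.

    Blocks the child already overwrote are discarded; blocks the child still
    relies on are copied forward so later restores keep working.
    """
    for block_number, data in snapshot.blocks.items():
        child.blocks.setdefault(block_number, data)
    child.parent = snapshot.parent   # splice the deleted snapshot out of the chain


# Usage: a base snapshot, then an incremental one that changed only block 1.
base = Snapshot("snap-0", changed_blocks={0: b"A", 1: b"B"})
incr = Snapshot("snap-1", parent=base, changed_blocks={1: b"B2"})

assert incr.read_block(0) == b"A"    # served from the older snapshot
delete_with_bookkeeping(base, incr)  # naive deletion of 'base' would lose block 0
assert incr.read_block(0) == b"A"    # still restorable after safe deletion
```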



Compliance and data management


Deletion is sometimes mandated by regulation and at other times driven by internal company policies aimed at limiting legal liability. Data retention policies are a requirement in more and more organizations. Legal guidelines for keeping certain data sets exist in all companies, not just those in the financial and healthcare sectors that traditionally come to mind. For example, every company that issues paychecks holds personal data on its employees and must comply with regulations governing how that data is kept and deleted. Control over the data lifecycle is a requirement for these reasons, but it is also a prudent management practice. For many companies this means setting a retention policy for data when it's created and deleting it when that time expires. Having data 'live on' in an old cloud snapshot breaks regulatory compliance rules and is counter to sound corporate data management.



Cloud is not easily deleted


Unlike traditional snapshots, snapshots sent to the cloud are encrypted on-premises and cannot be modified. Once data reaches the cloud, it is replicated to multiple servers and geographic locations to ensure disaster protection. The good news is that these snapshots are not in any way volatile and the service provider guarantees their permanent, safe storage. The bad news is that sometimes deleting those snapshots is exactly what's needed.


Deleting data in the cloud is only possible when the data has been encrypted before it is sent to the cloud. It then becomes a matter of bookkeeping to eliminate the parts of snapshots that are no longer needed or wanted for restoring data.
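That bookkeeping amounts to working out which stored objects are still referenced by a retained snapshot and which can be expired. The sketch below is an assumed mark-and-sweep pass over hypothetical snapshot metadata, not a description of any specific product's logic.

```python
# Minimal sketch of the bookkeeping needed to expire snapshot data: mark every
# object still referenced by a retained snapshot, then sweep away the rest.

def find_expirable_objects(retained_snapshots, all_objects):
    """Return cloud object IDs that no retained snapshot references.

    retained_snapshots: iterable of dicts like {"id": ..., "objects": {...}}
    all_objects: set of every object ID currently stored in the cloud
    """
    still_referenced = set()
    for snap in retained_snapshots:
        still_referenced |= set(snap["objects"])
    return all_objects - still_referenced


# Usage: two retained snapshots share object "o1"; "o3" belonged only to an
# expired snapshot, so it is the only thing safe to remove.
retained = [
    {"id": "snap-9", "objects": {"o1", "o2"}},
    {"id": "snap-10", "objects": {"o1", "o4"}},
]
print(find_expirable_objects(retained, {"o1", "o2", "o3", "o4"}))  # {'o3'}
```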



Control cloud costs


Minimizing storage expenses and the cost of managing storage are key drivers for cloud adoption in many companies, and the ability to outsource the storage infrastructure, along with the cloud's 'pay as you use it' model, is very appealing. But "forever" is a long time, and storage costs can mount, even with efficient storage systems like snapshot-based cloud gateways that deduplicate and compress data. Since data is essentially never deleted from the cloud, this unchecked growth of snapshots can undermine a primary benefit of the technology. Like a recurring expense, even a small increase in consumed capacity is magnified because, as data grows, it is paid for again every month.
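A back-of-the-envelope calculation shows how the recurring charge compounds. The numbers below (500 GB of new snapshot data per month at $0.02 per GB-month) are purely illustrative, but they show how a 'keep forever' policy grows the bill compared with a retention window.

```python
# Illustrative sketch (hypothetical numbers) of how never-deleted snapshots
# compound a monthly storage bill: capacity added in month 1 is paid for again
# in every month that follows.

def cumulative_cost(months, monthly_growth_gb, price_per_gb_month, retained_months=None):
    """Total spend when each month adds capacity that is billed every month thereafter.

    retained_months=None models 'keep forever'; a number models a retention window.
    """
    total = 0.0
    for month in range(1, months + 1):
        live_months = month if retained_months is None else min(month, retained_months)
        capacity_gb = monthly_growth_gb * live_months
        total += capacity_gb * price_per_gb_month
    return total


# Usage: 500 GB of new snapshot data per month at $0.02/GB-month over 3 years.
# In this example, 'keep forever' costs roughly 1.8x a 12-month retention window.
print(round(cumulative_cost(36, 500, 0.02)))                      # keep forever
print(round(cumulative_cost(36, 500, 0.02, retained_months=12)))  # 12-month retention
```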



Permanent deletion from the cloud


People like to know they can get rid of something, even if there aren't any formal regulations or company policies involved. The idea that even their 'scratch space' will be saved may be a little disconcerting for some. In response, companies like Nasuni, which provides storage services and makes the Nasuni Filer storage controller, have added the ability for users to set retention policies on snapshots. In this way, companies can cycle through expired data and have it permanently removed from the cloud, providing an assurance of destruction for compliance and a way to keep rising cloud capacity costs in check.


This is typically accomplished by having the controller delete snapshots older than a specified time frame or delete the oldest snapshots after a specified number have been saved. For example, credit card information is commonly kept for seven years. This data could be stored in a volume with a seven-year retention property. Anything put into that volume would begin aging when first saved, then be destroyed as it expires.
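The sketch below illustrates both policies (age-based expiry and keep-the-newest-N) with hypothetical function and field names; it is not the Nasuni Filer's actual interface.

```python
# Minimal sketch of the two retention policies described above: age-based
# expiry and keep-only-the-newest-N snapshots.

from datetime import datetime, timedelta

def snapshots_to_delete(snapshots, max_age=None, max_count=None, now=None):
    """Return snapshot IDs that fall outside the retention policy.

    snapshots: list of (snapshot_id, created_at datetime), in any order
    max_age:   timedelta, e.g. timedelta(days=7 * 365) for roughly seven years
    max_count: keep only this many of the newest snapshots
    """
    now = now or datetime.utcnow()
    ordered = sorted(snapshots, key=lambda s: s[1], reverse=True)  # newest first
    expired = set()
    for index, (snap_id, created_at) in enumerate(ordered):
        too_old = max_age is not None and now - created_at > max_age
        too_many = max_count is not None and index >= max_count
        if too_old or too_many:
            expired.add(snap_id)
    return expired


# Usage: a seven-year volume with one snapshot past its retention date.
snaps = [
    ("snap-2017", datetime(2017, 1, 1)),
    ("snap-2024", datetime(2024, 1, 1)),
]
print(snapshots_to_delete(snaps, max_age=timedelta(days=7 * 365),
                          now=datetime(2025, 6, 1)))  # {'snap-2017'}
```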


To many users, cloud storage initially presented a security concern. In response, cloud providers and device manufacturers have provided encryption for data before it leaves the user's data center. Now, ironically, the process of deleting data like old snapshots can raise the same concern users had about saving it: they don't know for sure that their data is really gone when the cloud provider shows it's no longer available. In this way, compliance has added another cause for concern, as some regulations specify data destruction and the deletion of cloud-based snapshots can't be guaranteed.


Products like the Nasuni Data Protection service address this concern by leveraging on-board data encryption. Each change in the file system's contents is captured at the sub-file level and sent to the cloud in an encrypted snapshot. By maintaining the encryption keys at the edge, on customers' premises, Nasuni can guarantee that data has been destroyed by deleting the keys for that specific snapshot. At that point it's immaterial whether the cloud has physically deleted the data; it's unreadable to anyone, including the user.
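This technique is often called crypto-shredding, and a minimal sketch of the idea follows. The key store and method names are hypothetical and do not reflect Nasuni's actual key-management code; the sketch only shows why destroying a per-snapshot key held at the edge renders the cloud copy unreadable, wherever it has been replicated. It assumes the 'cryptography' Python package is installed.

```python
# Minimal sketch of crypto-shredding: each snapshot is encrypted with its own
# key held on-premises, so deleting that key makes the cloud copy permanently
# unreadable.  Requires: pip install cryptography

from cryptography.fernet import Fernet

class EdgeKeyStore:
    def __init__(self):
        self._keys = {}   # snapshot_id -> encryption key (never leaves premises)

    def encrypt_snapshot(self, snapshot_id, plaintext_bytes):
        key = Fernet.generate_key()
        self._keys[snapshot_id] = key
        return Fernet(key).encrypt(plaintext_bytes)   # only ciphertext goes to the cloud

    def decrypt_snapshot(self, snapshot_id, ciphertext):
        return Fernet(self._keys[snapshot_id]).decrypt(ciphertext)

    def shred(self, snapshot_id):
        """'Delete' the snapshot by destroying its key at the edge."""
        del self._keys[snapshot_id]


# Usage: after shredding, even the original owner can no longer read the data.
store = EdgeKeyStore()
blob = store.encrypt_snapshot("snap-42", b"expired payroll records")
store.shred("snap-42")
try:
    store.decrypt_snapshot("snap-42", blob)
except KeyError:
    print("key destroyed: cloud copy is unreadable")
```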


The value of the cloud in providing scalable, flexible, affordable storage is clear. With technologies like snapshots, cloud storage has addressed most of the concerns raised by the market, like bandwidth and security. But companies with compliance and data governance requirements are now asking a new question about data destruction and the need to delete cloud snapshots. In response, these new storage services are providing snapshot retention control, enabling users to be sure that "delete" means delete.

Nasuni is a client of Storage Switzerland

Eric Slack, Senior Analyst