Cold Storage Can Cut Cloud Costs

Welcome to our first post in a short series about cold storage, a cloud storage strategy that can reduce costs for data that is infrequently accessed yet still highly valuable to the growth of the business. Cutting cloud costs is essential for companies, with Gartner recently predicting that associated cloud storage costs will make up the bulk of IT spend by 2016. Not all data is the same, so the processes for storing, accessing, and analyzing it shouldn't be either. We'll walk through four requirements for cold storage and how best to manage this process with cloud service providers.

-IT Peer Network Administrator

As enterprises and cloud service providers (CSPs) continue to experience dramatic data growth, private and public clouds that use a single high-performance storage tier for most data will see rapidly increasing storage costs. Much of the growing volume of information is older, "colder" data that is infrequently accessed. There's considerable potential to reduce cloud infrastructure costs by moving this data to a lower-cost storage tier specifically designed for infrequently accessed cold data.
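
To make that potential concrete, here is a back-of-the-envelope estimate in Python. Every number in it (the per-GB rates, the data volume, the cold fraction) is a hypothetical placeholder for illustration, not actual CSP pricing:

```python
# Hypothetical monthly storage rates in USD per GB; real CSP pricing varies widely.
HOT_RATE = 0.025    # high-performance tier
COLD_RATE = 0.004   # cold storage tier

total_gb = 500_000       # 500 TB under management (illustrative)
cold_fraction = 0.70     # share of data that is rarely accessed (illustrative)

hot_only_cost = total_gb * HOT_RATE
tiered_cost = (total_gb * (1 - cold_fraction) * HOT_RATE
               + total_gb * cold_fraction * COLD_RATE)

print(f"Single tier: ${hot_only_cost:,.0f}/month")
print(f"Tiered:      ${tiered_cost:,.0f}/month")
print(f"Savings:     ${hot_only_cost - tiered_cost:,.0f}/month")
```

With these illustrative numbers, moving roughly 70 percent of the data to a tier about one-sixth the price cuts the monthly storage bill by more than half.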

Although cold data is infrequently accessed, it is still incredibly valuable. Businesses are increasingly investing in “big data” analytics to identify customer and operational trends. Cold storage must therefore provide the performance and capabilities required to enable analysis. Even if businesses are not analyzing historical data today, they may well need to do so in the future to remain competitive.

Some CSPs have introduced cold storage services that offer lower data storage costs and correspondingly lower performance levels. In other cases, cloud services are implementing cold storage behind the scenes; for example, they may automatically move older and less frequently accessed data to a lower-performance tier.
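
Behind-the-scenes tiering of this kind usually boils down to an age-based rule. The sketch below is a minimal illustration in Python, assuming a made-up 90-day threshold; real lifecycle engines also weigh object size, access patterns, and the customer's SLA:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy: objects untouched for 90 days move to the cold tier.
COLD_AFTER = timedelta(days=90)

def select_tier(last_accessed: datetime) -> str:
    """Pick a storage tier for an object based on when it was last accessed."""
    age = datetime.now(timezone.utc) - last_accessed
    return "cold" if age > COLD_AFTER else "hot"

# Example: data last read six months ago lands in the cold tier.
print(select_tier(datetime.now(timezone.utc) - timedelta(days=180)))  # -> cold
```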

Four interrelated requirements are relevant to most cold storage usage models. It's increasingly common to quantify and explicitly specify one or more of them in a cold storage service-level agreement (SLA) between the customer and the CSP. The four requirements, with a sketch after the list of how they might be checked in practice, are:

  • Expected storage life. Cold storage is designed for persistent rather than transient data. This is reflected in the cold storage SLA: the data is considered important enough to retain, and therefore requires long-term storage.
  • Access frequency. As data ages, it tends to be less frequently accessed and therefore becomes more suited to cold storage. This may be explicitly specified in the SLA: data is moved to cold storage based on the date and time it was last accessed.
  • Access speed. Cold storage explicitly assumes that lower performance is acceptable for older data. The SLA defines which data is needed immediately and which can wait.
  • Cost. The benefit of cold storage is the reduced cost of storing older and less frequently accessed data. For some usage models, cost overrides all other considerations: meeting the SLA requires using the lowest-cost infrastructure, and everything else is secondary.
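
These four requirements lend themselves to a simple, mechanical eligibility check. The following Python sketch is purely illustrative: the field names and structure are invented for this post, not drawn from any particular CSP's SLA:

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class ColdStorageSLA:
    min_retention: timedelta        # expected storage life
    min_idle_time: timedelta        # access frequency: how stale data must be
    max_retrieval_time: timedelta   # access speed the customer will tolerate
    max_cost_per_gb: float          # cost ceiling, USD per GB-month

@dataclass
class DataSet:
    retention_period: timedelta     # how long the data must be kept
    time_since_access: timedelta    # elapsed since the last read
    retrieval_tolerance: timedelta  # how long consumers can wait for it
    tier_cost_per_gb: float         # monthly cost of the candidate cold tier

def qualifies_for_cold_storage(data: DataSet, sla: ColdStorageSLA) -> bool:
    """Check a data set against all four cold storage SLA requirements."""
    return (data.retention_period >= sla.min_retention              # storage life
            and data.time_since_access >= sla.min_idle_time         # access frequency
            and data.retrieval_tolerance >= sla.max_retrieval_time  # access speed
            and data.tier_cost_per_gb <= sla.max_cost_per_gb)       # cost
```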

If the data in question meets the four requirements outlined above, the next step is to determine which cold storage model is most appropriate. Look for our next post on the four cold storage models (Backup, Disaster Recovery, Archiving & Social Media) to learn how each model has its own set of challenges.

For a longer read about cold storage and its different requirements and models, check out Cold Storage in the Cloud: Trends, Challenges and Solutions. And to see how the Intel Atom Processor can optimize such storage, look over this brief.

Have you adopted a cold storage strategy for your company? If so, what challenges and benefits have you found? Leave a comment below or join the social discussion on Twitter by using the hashtags #ITCenter and #cloudstorage.