51 percent of workloads are now in the cloud: time to break through that ceiling?
At this point, we’re somewhat beyond discussions of the importance of cloud. It has been around for some time, just about every person and company uses it in some form and, as the kicker, 2014 saw companies place more computing workloads in the cloud (51 percent), through either public cloud or co-location, than they processed in-house.
In just a few years we’ve moved from every server sitting in the same building as those accessing it, to a choice between private and public cloud, and now to the IT model du jour: hybrid cloud. Hybrid is fast becoming the model of choice, fusing the security of an organisation’s private data centre with the flexibility of the public cloud. In today’s fast-paced IT world, however, as soon as one approach becomes mainstream the natural reaction is to ask, ‘what’s next?’ A plausible next step in this evolution is the end of the permanent, owned data centre, and even of long-term co-location, in favour of an infrastructure built entirely on the public cloud and SaaS applications. The question is: will businesses really go this far in their march into the cloud? And do we want them to?
Public cloud is, of course, nothing new to the enterprise, and it’s not unheard of for a small business or start-up to operate solely from the public cloud and SaaS services. There are, however, few if any examples of large corporates eschewing their own private data centres and co-location arrangements for this pure public cloud approach.
For such an approach to become plausible in large organisations, CIOs need to be confident about putting even their most sensitive data into public clouds. This entails a series of mentality shifts that are already taking place among SMBs: the cloud-based Office 365, for instance, is Microsoft’s fastest-selling product ever. For large organisations, however, this is far from a trivial change, and most CIOs are far from ready for it.
The Data Argument
Data protectionism is a case in point. Data has long been a closely guarded resource for financial services and legal organisations, both for competitive advantage and because of legal requirements designed to protect clients’ information. Thanks to the arrival of big data analysis, we can now add marketers, retailers and even sports brands to that list, as all have found unique advantages in mining insights from huge amounts of data.
This is both an opportunity and a problem. More data means more accurate and actionable insights, but that data needs storing and processing, and consequently an ever-growing amount of server power and storage space. Today’s answer is the hybrid cloud: keep sensitive data primarily in a private or co-located data centre, and use the public cloud for overspill processing or object storage when requirements exceed the organisation’s existing capacity.
In a world where the amount of data created and recorded each day grows exponentially, the hybrid model will come under pressure. Even organisations that keep only their most sensitive and mission-critical data in private data centres, whilst moving everything else to the cloud, will quickly see that data inflate. They will be forced to buy ever more servers and space to house their critical data, at ever-growing cost and without the flexibility of the public cloud.
In this light, a pure public cloud infrastructure starts to look attractive: an infrastructure that can be switched on instantly and expanded as needed, at low cost. The idea of placing their most sensitive data in a public cloud, beyond their own direct control and security, will nonetheless remain unpalatable to the majority of CIOs. That is understandable when you consider research such as that released last year stating that only one in 100 cloud providers meets the EU data protection requirements currently being examined in Brussels.
So, increasing dependence on the public cloud becomes a tug of war between a CIO’s growing data burden and their tolerance for the perceived security risks of the cloud.
The process that may well tip the balance in this tug of war is cloud’s very own version of exposure therapy. CIOs are storing and processing more and more non-critical data in the public cloud and, across their organisations, business units are independently buying SaaS applications, giving them a taste of the ease of the cloud (from an end-user point of view, at least). As this exposure grows, public cloud and SaaS applications will increasingly prove their reliability and security, earning their place as invaluable tools in a business unit’s armoury. The result is a virtuous circle: greater trust means more data placed in the public cloud, which in turn creates greater trust. Coupled with the ever-falling cost of public cloud, surely the perceived risks will eventually fall far enough for its advantages to outweigh its disadvantages, even for the most sensitive of data?
Should It Be Done?
This all depends on a big ‘if’: trust in the public cloud and SaaS applications will only grow if public cloud providers remain unhacked and SaaS data unleaked. That is a big ask in a world of weekly data breaches, but security is relative, and private data centre breaches are rapidly becoming more common, or at least better publicised, than those in the public cloud. Sony Pictures’ problems arose from a malevolent force within its own network, not from data held in the public cloud. It will take many more incidents like these to convince CIOs that entrusting their data security to a cloud provider is the more sensible option. Those incidents seem likely to come, however, and in the meantime, barring a major outage or a truly headline-making attack, growing exposure is steadily building confidence in the public cloud.
At the same time, public cloud providers need to work actively to build that confidence, not just wait passively for the scales to tip. Selecting a cloud service is a business decision, and any CIO will apply the same diligence they would to any other supplier choice. Providers that fail to meet the latest regulations, that aren’t visibly planning for the future, or that fail to address data privacy concerns and legislation will damage confidence in the public cloud and actively hold it back, particularly within large enterprises. Those providers that do build their way to becoming trusted partners will, however, flourish, compounding the ever-growing positive effects of public cloud exposure.
As that happens, the prospect of a pure public cloud enterprise becomes more realistic. Every CIO and organisation is different, with a different tolerance for risk; the virtuous circle of cloud will tip organisations towards pure cloud approaches at different times, and every cloud hack or outage will set the model back to a different degree in each organisation. It is clear, however, that, whether desirable right now or not, pure public cloud is rapidly approaching reality for some larger enterprises.