Cloud Computing — New or Recycled Idea?

If you follow the IT industry, you can’t escape the “cloud”. Whether in online articles, industry seminars, or blogs, the hype over cloud computing is everywhere. And don’t expect it to die down in 2009.

Yet amidst all the hype – there are still a lot of questions and confusion about the “cloud”. At Intel – we get asked a lot about cloud computing, and one of the top questions is: “Is cloud computing really new?”

The answer is not as clear-cut as it may seem.

First – what is “cloud computing” anyway? There are many industry definitions, some quite useful and others less so. Some pundits want to label everything the cloud, while others have such intricate and nuanced definitions that very little could be considered cloud computing.

Intel has its own view of the cloud – centered, not surprisingly, on the architecture providing the cloud’s processing, storage, and networking. This “cloud architecture” is characterized by services and data residing in shared, dynamically scalable resource pools. Since so much of the cloud’s capabilities – and its operational success – depends on that architecture, it makes sense to begin the definition there.

A cloud architecture can be used in essentially two different ways. A “cloud service” is a commercial offering that delivers applications (e.g., Salesforce CRM) or virtual infrastructure for a fee (e.g., Amazon’s EC2). The second usage model is an “enterprise private cloud” – a cloud architecture for internal use behind the corporate firewall, designed to deliver “IT as a service”.

Cloud computing – both internal and external – offers the potential for highly flexible computing and storage resources, provisioned on demand, at a theoretically lower cost than buying, provisioning, and maintaining equivalent fixed capacity.
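
To make that cost argument concrete, here is a minimal back-of-the-envelope sketch. The hourly rate, per-server cost, and utilization figures are purely illustrative assumptions, not quoted prices; the point is simply that paying for average usage can undercut owning enough fixed capacity to cover the peak.

```python
# Back-of-the-envelope comparison of on-demand vs. fixed capacity.
# All figures below are illustrative assumptions, not real prices.

HOURS_PER_MONTH = 730

def on_demand_cost(avg_servers_in_use, hourly_rate):
    """Pay only for the server-hours actually consumed."""
    return avg_servers_in_use * HOURS_PER_MONTH * hourly_rate

def fixed_cost(peak_servers, monthly_cost_per_server):
    """Own enough capacity to cover peak demand, whether it is used or not."""
    return peak_servers * monthly_cost_per_server

# A workload that peaks at 20 servers but averages only 5 in use.
print("On-demand: $%.2f" % on_demand_cost(avg_servers_in_use=5, hourly_rate=0.10))
print("Fixed:     $%.2f" % fixed_cost(peak_servers=20, monthly_cost_per_server=120))
```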

So now that we’re grounded in our terminology… we return to the question of whether the cloud is new or just a repackaging of concepts from an earlier era of computing.

It turns out that it’s both: cloud architectures do represent something new – but they build on so many critical foundations of technology and service models that you can’t argue the cloud is an earth-shattering revolution. It’s an exciting new, but evolutionary, shift in information technology.

The rich heritage of cloud computing starts with centralized, shared resource pooling – a concept that dates back to mainframes and the beginning of modern computing.  A key benefit of the mainframe is that significant processing power becomes available to many users of less powerful client systems. In some ways, datacenters in the cloud could offer similar benefits, by providing computing or applications on demand to many thousands of devices.  The difference is that today’s connected cloud clients are more likely to be versatile, powerful devices based on platforms such as Intel’s Centrino, which give users a choice: run software from the cloud when it makes sense, but have the horsepower to run a range of applications (such as video or games) that might not perform well when delivered by the “mainframe in the cloud”.

Another contributing technology for the cloud is virtualization. The ability to abstract hardware and run applications in virtual machines isn’t particularly new – but abstracting entire sets of servers, hard drives, routers, and switches into shared pools is a relatively recent, emerging concept. And the vision of cloud computing takes this abstraction a few steps further – adding concepts of autonomic, policy-driven resource provisioning and dynamic scalability of applications. A cloud need not leverage a traditional hypervisor / virtual machine architecture to create its abstracted resource pool; a cloud environment may also be deployed with technologies such as Hadoop – enabling applications to run across thousands of compute nodes. (Side note: if you’re interested in open source cloud environments, you might check out the OpenCirrus project, formed by a collaboration between Intel, HP, and Yahoo.)
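
To give a feel for the Hadoop programming style mentioned above, here is a minimal word-count sketch written in the Hadoop Streaming manner, where the framework distributes the map phase across the cluster’s compute nodes, sorts the intermediate output by key, and then runs the reduce phase. The script name and the map/reduce command-line convention are illustrative assumptions; the Streaming mechanism itself (plain text over stdin/stdout) is simply how Hadoop runs non-Java mappers and reducers.

```python
#!/usr/bin/env python
# Minimal word-count sketch in the Hadoop Streaming style: the framework
# pipes input splits through the mapper on many nodes, sorts by key, and
# pipes the grouped output through the reducer.
import sys

def mapper():
    # Emit "word<TAB>1" for every word in the input split.
    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word, 1))

def reducer():
    # Input arrives sorted by key, so the counts for a word are contiguous.
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print("%s\t%d" % (current, count))
            current, count = word, 0
        count += int(value)
    if current is not None:
        print("%s\t%d" % (current, count))

if __name__ == "__main__":
    # Run as "wordcount.py map" for the map phase, without arguments for reduce.
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

In a real job, the cluster would invoke this same script once as the mapper over each input split and again as the reducer over the sorted intermediate data.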

The key point here is that just because something is an abstracted, shared resource doesn’t mean it’s necessarily a cloud. Otherwise a single server, running VMware and a handful of IT applications, might be considered a cloud. What makes the difference? It’s primarily the ability to dynamically and automatically provision resources based on real-time demand.
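
As a sketch of what “dynamically and automatically provision resources based on real-time demand” might look like, here is a minimal, policy-driven control loop. The ResourcePool class, its methods, and the thresholds are hypothetical stand-ins for illustration only; real cloud platforms expose their own monitoring and provisioning interfaces.

```python
import time

class ResourcePool:
    """Hypothetical stand-in for a cloud provisioning API (illustrative only)."""
    def __init__(self, nodes=2):
        self.nodes = nodes

    def add_node(self):
        self.nodes += 1                      # e.g. boot another VM from the shared pool

    def remove_node(self):
        self.nodes = max(1, self.nodes - 1)  # release capacity back to the pool

def autoscale(pool, measure_load, target_per_node=100.0, interval_s=60, cycles=10):
    """Policy-driven loop: keep the load per node near a target by resizing the pool."""
    for _ in range(cycles):
        load_per_node = measure_load() / pool.nodes
        if load_per_node > 1.2 * target_per_node:
            pool.add_node()                  # demand is high: provision more capacity
        elif load_per_node < 0.5 * target_per_node:
            pool.remove_node()               # demand is low: give capacity back
        print("load/node=%.0f -> pool size %d" % (load_per_node, pool.nodes))
        time.sleep(interval_s)

# Drive the loop with a canned demand curve instead of live telemetry.
demand = iter([150, 400, 900, 900, 300, 80, 60])
autoscale(ResourcePool(), measure_load=lambda: next(demand), interval_s=0, cycles=7)
```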

What about grid computing? Indeed – if you squint – a grid environment looks a lot like what we’ve defined as a cloud. It’s not worth getting into a religious argument over grid versus cloud – that’s already been done elsewhere in the blogosphere. Grids enable distributed computing across large numbers of systems, so the line between what constitutes a grid and a cloud is blurry. In general, cloud architectures tend to offer a higher degree of multi-tenancy, usage-based billing, and support for a greater variety of application models.

Finally – one of the key foundations of cloud computing isn’t really a technology at all, but rather the “on demand” service model. During the dot-com boom, the “application service provider” sprang up as a novel way to host and deliver applications – and ASPs are the direct forefathers of today’s Software as a Service (SaaS) offerings. One of the ways “on demand” continues to evolve is in the granularity of the service and its related pricing. You can now buy virtual machines – essentially fractions of servers – by the hour. As metering, provisioning, and billing capabilities continue to get smarter, we’ll be able to access cloud computing in even smaller bites… buying only precisely what we need at any given moment.
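
As an illustration of why billing granularity matters, the sketch below compares the same bursty workload charged at hourly versus per-minute granularity. The $0.10-per-hour rate and the workload shape are illustrative assumptions, not real prices.

```python
import math

RATE_PER_HOUR = 0.10   # illustrative price for one small virtual machine hour

def cost_hourly_metering(burst_minutes, bursts):
    # Hourly granularity: each burst rounds up to a full billable hour.
    return bursts * math.ceil(burst_minutes / 60.0) * RATE_PER_HOUR

def cost_per_minute_metering(burst_minutes, bursts):
    # Per-minute granularity: pay only for the minutes actually consumed.
    return bursts * (burst_minutes / 60.0) * RATE_PER_HOUR

# A job that runs for 10 minutes, 12 times a day, over a 30-day month.
print("Hourly metering:     $%.2f" % cost_hourly_metering(10, 12 * 30))      # $36.00
print("Per-minute metering: $%.2f" % cost_per_minute_metering(10, 12 * 30))  # $6.00
```

The finer the metering, the closer the bill tracks what was actually consumed – which is exactly the direction the “on demand” model keeps moving.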

So to wrap up – the cloud truly is a new way of delivering business and IT services via the Internet, offering the ability to scale dynamically across shared resources in new and easier ways. At the same time – cloud computing builds on many well-known foundations of modern information technology, only a few of which were mentioned here. Perhaps the most interesting part of the cloud’s evolution is how early we are in its development.