Cloud 2012: More Data. More Efficiency. More Devices.

Please note: This blog post originally appeared as a sponsored post in the Cloud area on

I often get asked for my thoughts on cloud computing and other data center trends. While I'll stop short of calling anything a prediction, I can tell you what is top of mind for me and many of my colleagues this year.

Unrelenting data growth will continue.

There's no stopping data growth. IDC predicts that by 2015, the amount of information managed by enterprise data centers will grow by a factor of 50, and the number of files the data center will have to deal with will grow by a factor of 75. Mobile data traffic alone will increase 26 times between 2010 and 2015, reaching 6.3 exabytes per month by 2015, when nearly 70 percent of Internet users will use more than five network-connected devices.

As enterprises face an avalanche of data triggered by social media, application growth, and a proliferation of mobile devices, they need cost-effective ways to turn bits and bytes into meaningful information. Moreover, with 15 billion connected devices by 2015, the amount of data for manufacturing, retail, supply chain, smart grid, and many other applications will require new approaches to both batch and real-time analytics. Driven by this need, many organizations are developing distributed analytics platforms based on frameworks such as Hadoop.

An open-source framework for the distributed processing of large datasets across server clusters, Hadoop enables fast performance for complex analytics through massively parallel processing. It also allows database capacity and performance to be scaled incrementally through the addition of more server and storage nodes. This approach is not without challenges, however: the usability and scalability of distributed analytics frameworks still inhibit broad adoption.
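To make that model concrete, here is the canonical word-count job written against Hadoop's MapReduce API; this is a generic illustration, not code from Intel or from the original post. Mappers process input splits in parallel across the cluster, and reducers aggregate the partial results.

```java
// Minimal word-count sketch using Hadoop's MapReduce API.
// Class and job names here are illustrative.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Each mapper processes one split of the input in parallel,
  // emitting a (word, 1) pair for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Each reducer receives all counts for a given word and sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local aggregation before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Because each mapper runs on the node that holds its slice of the data, adding server and storage nodes grows capacity and throughput together, which is exactly the incremental scaling described above.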

Best practices will drive efficiency gains.

Cloud computing is one of the keys to dealing with massive amounts of data in a cost-effective manner while creating a more agile IT infrastructure.

That's the case at Intel IT, where our enterprise private cloud is up and running and has realized $9 million in net savings to date. More than 50 percent of our servers are now virtualized. We've reduced provisioning time from 90 days to 3 hours, and we see the day coming when provisioning will take place in minutes.

Efficiency isn't important only in the software and compute layers; it's also a focus for best practices at the infrastructure and facility levels. One such best practice is high ambient temperature (HTA) data center operation. HTA raises the operating temperature within a data center to reduce the operational and capital costs of cooling, freeing the energy saved to power servers.

It's not as simple as turning off the air-conditioning, however. The system design, rack and facility controls, and even technology component choices are all critical, which is part of the reason we've developed a blueprint of best practices that we share openly.

Client-aware computing will become essential.

In response to the proliferation and wide array of devices, client-aware computing will be a key focus in cloud data centers. In a client-aware environment, cloud-based applications both recognize and take advantage of the capabilities of the client device.

Rather than providing services that are dumbed down to a lowest common denominator (the capabilities of the most basic client devices), the cloud service adapts to deliver optimal service based on the device at hand, making full use of the capabilities of both the client and the server. Understanding the compute, graphics, battery life, security, and other attributes of the device can greatly improve the user experience while efficiently using data center and network bandwidth.
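As a rough sketch of what that adaptation can look like (the class, field, and tier names below are invented for illustration, not an actual Intel or industry API), a cloud service might read the capabilities a device reports and select the richest response tier the client can handle:

```java
// Hypothetical sketch of client-aware service selection: the server
// inspects capabilities reported by the device and picks the richest
// response it can support, rather than a lowest-common-denominator one.
// All names here are invented for illustration.
public class ClientAwareDispatch {

  // Capabilities a client might report, e.g. in a request header.
  static final class DeviceProfile {
    final int cpuCores;        // compute capability
    final boolean hasGpu;      // hardware-accelerated graphics
    final boolean onBattery;   // battery-powered device
    final int bandwidthKbps;   // measured network bandwidth

    DeviceProfile(int cpuCores, boolean hasGpu, boolean onBattery, int bandwidthKbps) {
      this.cpuCores = cpuCores;
      this.hasGpu = hasGpu;
      this.onBattery = onBattery;
      this.bandwidthKbps = bandwidthKbps;
    }
  }

  enum ServiceTier { BASIC, STANDARD, RICH }

  // Pick the response variant based on what the client can actually do,
  // offloading work to capable clients and conserving battery and bandwidth
  // on constrained ones.
  static ServiceTier selectTier(DeviceProfile d) {
    if (d.hasGpu && d.cpuCores >= 4 && !d.onBattery && d.bandwidthKbps >= 5000) {
      return ServiceTier.RICH;      // client renders locally; server streams raw data
    }
    if (d.bandwidthKbps >= 1000) {
      return ServiceTier.STANDARD;  // server pre-renders some content
    }
    return ServiceTier.BASIC;       // server does all the work; minimal payload
  }

  public static void main(String[] args) {
    DeviceProfile ultrabook = new DeviceProfile(4, true, false, 20000);
    DeviceProfile phone = new DeviceProfile(2, true, true, 800);
    System.out.println("Ultrabook gets: " + selectTier(ultrabook)); // RICH
    System.out.println("Phone gets: " + selectTier(phone));         // BASIC
  }
}
```

The point of the pattern is the decision itself: capable clients take on rendering work and receive richer payloads, while constrained devices get a lighter response that conserves their battery and the data center's bandwidth.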

Technology refresh will reinvigorate data centers.

Organizations will refresh data center technology to pack more computing power into each square foot, drive down power and cooling costs, and increase the security of data and applications.

With those goals in mind, I'm excited by the technology we're delivering in our new Intel® Xeon® processor E5 platforms. We're introducing new technology for the performance and scale of big data, power management for data center efficiency, and Intel® Trusted Execution Technology (TXT) to address some of the security requirements of cloud data centers.

I believe 2012 is going to be a year of tremendous growth and innovation enabled by cloud computing. At Intel, we are thrilled to be a part of it.

Follow @IntelITS for more news.

About Jason Waxman

Jason is corporate vice president of the Data Center Group and general manager of the Data Center Solutions Group at Intel Corporation. He manages Intel’s business, products and technologies for cloud service providers, a rapidly growing data center business segment. Waxman oversees the company’s technology development for cloud computing, including silicon components, optimized system design, data center management, security and facility optimization.