So much data, so much of it big, and so many, many things.
IDC forecasts the Big Data market will increase to approximately $32 billion this year from just over $3 billion five years ago. Meanwhile, IDC predicts that the Internet of Things (IoT), which is fueling the explosion of data, will become a $1.7 trillion market by 2020, up from about $656 billion in 2014.
It’s been well documented how Big Data is transforming the way that companies operate and make business decisions in financial services, health care, retail, manufacturing, and a host of other industry sectors. IoT is a source of Big Data, and Big Data is where IoT information goes, lives, and is measured. The rapidly growing network of connected objects that collect and exchange data using embedded sensors now includes home thermostats and security systems, refrigerators and appliances, automobiles and traffic lights, and even vineyards and cows.
That’s right, your favorite Sonoma Cabernet and even the rib-eye you’ll barbecue this weekend may well have been enhanced by IoT and Big Data. And then there’s the smart city, which, if you don’t already live in one, is coming soon to a neighborhood near you.
But that’s the raison d’être of the IoT: billions of devices and objects, animate and inanimate, connected to the internet to provide instantaneous feedback and intelligence both for end users and for the companies collecting Big Data for internal and external purposes.
MOUNTAINS OF DATA
Against this backdrop, a paradigm shift is occurring within the data center. On-premises IT is on the decline and colocation facilities are becoming increasingly dominant within the enterprise. In fact, demand for data center colocation services is beginning to surpass supply and is projected to reach more than $43 billion in revenue by next year, up from $25 billion in 2013, according to a report by MarketsandMarkets.
Given the mountains of unstructured information being collected by the enterprise, including both text and multimedia content, arguably one of the most lucrative use cases for colocation is managing, storing, and organizing Big Data more effectively.
Considering the implications of Big Data for power, cooling, and the network in the data center, compounded by the rise of IoT both inside and outside the facility, the associated cost and burden of management are significant. Rather than reacting when additional resources are needed, or incurring significant capital costs, enterprises can turn to colocation for a flexible, advanced infrastructure in a leased environment. In turn, data center managers aren’t limited by the physical space of the company’s on-premises facility.
WHY NOT THE CLOUD?
As the Internet of Things becomes more prevalent, we can expect an exponential increase in the amount of data companies process. And as the volume of Big Data expands, the limitations of public cloud platforms will become even more evident. The cloud is not particularly well suited to the sort of applications on which Big Data depends: Big Data relies on the ability to move massive amounts of data very quickly, while public cloud platforms are multi-tenant environments built on network-attached storage. That is an inherently poor design if the goal is reliably available compute resources and ultra-fast data transport.
For an enterprise that depends on Big Data for its revenue, the best option is to move servers, applications, and data into a top-tier colocation facility rather than relying on cloud platforms that offer limited insight into, or control over, the underlying infrastructure. Lifting enormous amounts of data into the cloud is neither technologically nor economically viable; Etsy, for example, relies in large part on colocation to manage and analyze the enormous amounts of data produced by its eCommerce platform.
USING POWER WISELY
Rapid Big Data growth and the proliferation of IoT will accelerate the need not only for larger compute and storage capacity and high-performance infrastructure, but also for greater power density. High power density colocation facilities require fewer cabinets than their lower-density counterparts, saving space while accommodating more data. Given that Forsythe Technology forecasts that in three years the electricity needs of U.S. data centers alone will be six times those of New York City, it’s no surprise that power consumption, sustainability, and mitigating carbon footprint are top concerns for IT managers and colocation facility administrators.
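The space savings from higher power density are simple arithmetic. A minimal sketch, using hypothetical load and density figures (not drawn from the article), shows how the cabinet count falls as per-cabinet density rises:

```python
import math

def cabinets_needed(total_load_kw: float, density_kw_per_cabinet: float) -> int:
    """Number of cabinets required to host a given total IT load."""
    return math.ceil(total_load_kw / density_kw_per_cabinet)

# Hypothetical example: a 600 kW IT load at two different densities.
total_kw = 600.0
print(cabinets_needed(total_kw, 5.0))   # lower-density facility: 120 cabinets
print(cabinets_needed(total_kw, 15.0))  # high-density facility: 40 cabinets
```

The same load fits in a third of the cabinets, which is where the floor-space savings come from.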
Reinforced by the promise of rapid ROI, the world’s largest data centers and colocation facilities that have adopted data center infrastructure management (DCIM) solutions represent the vanguard of energy management best practices, embracing not only the business case, but also environmental responsibility.
DCIM tools are software and technology products that converge IT and building facilities functions to provide a holistic view of a data center’s performance to ensure that energy, equipment, and floor space are used as efficiently as possible. In large colos, where electricity billing comprises a large portion of the cost of operation, the insights these software platforms provide into power and thermal management can have an immediate and positive impact on an organization’s bottom line while supporting sustainability initiatives.
By offering increased levels of automated control, DCIM tools give data center managers timely information for managing capacity planning, allocations, and cooling efficiency. By providing detailed information about server power characteristics, for example, DCIM helps IT managers set fixed rack power envelopes that let them safely increase server count per rack, thereby improving utilization.
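A rough sketch of that power-envelope reasoning, with hypothetical wattage figures: sizing racks by measured peak draw (the kind of figure a DCIM tool reports) rather than worst-case nameplate ratings allows more servers per rack within the same envelope.

```python
def servers_per_rack(envelope_w: float, per_server_w: float) -> int:
    """How many servers fit within a fixed rack power envelope."""
    return int(envelope_w // per_server_w)

# Hypothetical figures for illustration only.
envelope = 8000.0          # fixed rack power envelope, watts
nameplate_w = 750.0        # worst-case nameplate rating per server
measured_peak_w = 500.0    # actual measured peak draw per server

print(servers_per_rack(envelope, nameplate_w))     # 10 servers per rack
print(servers_per_rack(envelope, measured_peak_w)) # 16 servers per rack
```

Under these assumed numbers, measurement-based sizing raises the rack from 10 to 16 servers without exceeding the envelope.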
Another issue that DCIM platforms address is the burden of “zombie servers.” Counter-intuitively, an idle server can still draw roughly 50% of the power it consumes under load; machines left powered on while doing no useful work are the so-called zombie servers. By one estimate, 10 million zombie servers worldwide use power roughly equivalent to the output of eight large power plants. DCIM tools enable data center managers to slay the zombie server problem by readily identifying where equipment can be consolidated, reducing energy consumption by 10% to 40%.
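The core of that identification step can be sketched simply. This is a hypothetical illustration of the kind of check a DCIM platform might run (the thresholds, server names, and telemetry shape are all assumptions, not any vendor's API): flag servers whose average CPU utilization stays near zero while they continue to draw meaningful power.

```python
from dataclasses import dataclass

@dataclass
class ServerSample:
    name: str
    avg_cpu_util: float   # average CPU utilization over the window, 0.0-1.0
    avg_power_w: float    # average power draw over the same window, watts

def find_zombies(samples, util_threshold=0.02, power_threshold_w=100.0):
    """Return servers that look idle yet still draw significant power."""
    return [s.name for s in samples
            if s.avg_cpu_util < util_threshold
            and s.avg_power_w > power_threshold_w]

# Hypothetical fleet telemetry.
fleet = [
    ServerSample("web-01", 0.45, 310.0),
    ServerSample("old-batch-07", 0.01, 180.0),  # idle but powered on
    ServerSample("db-02", 0.30, 420.0),
]
print(find_zombies(fleet))  # ['old-batch-07']
```

Servers flagged this way become candidates for consolidation or decommissioning, which is where the 10% to 40% energy reduction comes from.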
Additionally, thermal-management DCIM middleware offers simulations that integrate real-time monitoring data to continuously validate cooling strategy and air-handling choices. With these insights, colocation facilities can improve airflow management and reduce energy consumption by as much as 40%.
Big Data, and the countless things that will inform it, will increasingly allow organizations to process information in ways that were once unimaginable. Colocation facilities, aided by DCIM software, will help ensure that the unimaginable is functional, sustainable, and responsible.
This article was originally published in missioncriticalmagazine.com and has been republished with permission of the author.